
Blog archive for August 2010

Web site updates

August 3, 2010

Over the last week or so I have been doing substantial work on this web site, most of it relating to documentation of past artistic projects.

With a few exceptions the projects I have been involved with are now thoroughly documented through video, images, audio and/or text, depending on the project and the documentation I have of it. Videos are hosted at Vimeo and embedded in the various pages.

I have also improved the Rails application used for the documentation, adding categories to the projects and improving the front page.

In addition I have added a sidebar to the blog section for viewing the blog archive by month. Together with the general search functionality that I added earlier this year, I hope this helps make content on the site more accessible. At least it makes it easier for me to find things whenever I need to look something up.
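The archive sidebar is essentially just the posts grouped by month. This is not the actual Rails code, but the idea is easy to sketch in Python; the post titles and dates below are made-up examples:

# Sketch of the archive-by-month grouping (not the actual Rails code).
from datetime import date
from itertools import groupby

posts = [  # hypothetical (date, title) pairs
    (date(2010, 8, 3), "Web site updates"),
    (date(2010, 8, 7), "AUVI objects for Max 5"),
    (date(2010, 7, 12), "Some July post"),
]

# Sort newest first; groupby then yields one group per month.
posts.sort(key=lambda p: p[0], reverse=True)
for month, items in groupby(posts, key=lambda p: p[0].strftime("%B %Y")):
    print(month)
    for _, title in items:
        print("  -", title)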

A few tasks remain on the todo list, most notably coming up with a non-Flash solution for viewing embedded video on iPhone and iPad. I have found possible solutions online, but haven't yet had time to try them out.

David Rokeby, Very Nervous System (1983-)

August 7, 2010

[Image: Very Nervous System]

Fondation Daniel Langlois has been assembling extensive documentation of Very Nervous System by David Rokeby:

It is a particularly interesting case study for two reasons. Firstly, it offers an unmatched demonstration of the importance of experience in media art. Very Nervous System is essentially an empty room until someone walks in and activates it. It is a work that is brought into being very literally through experience.

Secondly, it is a seminal work in the history of media art, with a lifespan of more than 28 years. Its celebrity and longevity pose some particularly interesting questions about documentation and contextualisation of media artworks over time and through change.

This documentary collection was compiled by Caitlin Jones and Lizzie Muller while the piece was being shown at Ars Electronica 2009 in Linz (Austria):

http://www.fondation-langlois.org/html/e/page.php?NumPage=2186

AUVI objects for Max 5

August 7, 2010

[Screenshot: AUVI objects in Max 5]

AUVI, a set of video processing externals for Max, was first developed by Kurt Ralske back in the Nato.0+55 heyday. An update for Max 5 was recently released, and the objects are now available for free for non-commercial use.

This is indeed a welcome return!

Custom Mac application using Quartz Composer

August 8, 2010

[Screenshot: standalone application rendering a Quartz composition]

On my way back from Guernsey earlier this summer I picked up an iPad at Gatwick; the iPad is still not available in Norway. It was mostly out of curiosity about the device, and to see how the difference in size alters the affordances compared to the smaller iPhone. I am constantly reading and looking up PDF documents, mainly manuals and technical literature, and I expected the iPad to be preferable for this compared to my laptop (stuck at the desk) or the iPhone (too small). In addition, the work carried out within the French Virage research project has sparked my curiosity about multi-touch interfaces as control devices for live performance.

I have used it extensively over the summer, mostly for reading. I have also found it useful for reading newspapers online, and tend to read more international papers than Norwegian ones these days. In addition I find lots of useful learning material as video podcasts and at iTunes U.

Over the summer I have started looking into development for the Mac platform(s). A long-term goal is to be able to explore the iPhone as a device for portable and geo-located sound art works.

In June I used Quartz Composer to develop a playback solution for the video screening program of Contemporary Artists from South Africa at Galleri 3.14. The screening program mixed 4:3 and 16:9, PAL and NTSC, SD and HD videos, to be played back as a synchronized dual video projection. Having abandoned the Dave Jones unit owned by KHIB for playback, as all videos would have had to be reduced to SD resolution, I ended up developing a solution using Quartz Composer for playback from two Mac minis, using OSC to communicate sync information.
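The sync mechanism is simple in principle: one machine periodically broadcasts its playback clock over OSC, and the other compares it against its own clock and resyncs if the drift grows too large. Here is a minimal sketch of that idea in Python using the python-osc package; this is not the actual Quartz Composer setup, and the port, addresses and OSC namespace are all made up:

# sync_sketch.py -- minimal sketch of OSC-based playback sync between
# two machines. Run as "python sync_sketch.py master <slave-ip>" on one
# machine and "python sync_sketch.py slave" on the other.
import sys
import time
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

PORT = 9000  # hypothetical port
start = time.time()

def run_master(slave_ip):
    # Broadcast elapsed playback time ten times a second.
    client = SimpleUDPClient(slave_ip, PORT)
    while True:
        client.send_message("/sync/position", time.time() - start)
        time.sleep(0.1)

def run_slave():
    def on_sync(address, master_position):
        drift = (time.time() - start) - master_position
        if abs(drift) > 0.04:  # more than roughly one PAL frame adrift
            print(f"resync needed, drift = {drift:.3f} s")

    dispatcher = Dispatcher()
    dispatcher.map("/sync/position", on_sync)
    BlockingOSCUDPServer(("0.0.0.0", PORT), dispatcher).serve_forever()

if __name__ == "__main__":
    run_master(sys.argv[2]) if sys.argv[1] == "master" else run_slave()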

The application worked for us. It is not quite ready for prime time as a public release, but it definitely sparked my interest in learning more about Quartz Composer and how it might be integrated into custom applications.

The screenshot above is a simple proof of concept of a standalone application rendering a simple Quartz composition. Pardon the pathetic visual result; that was not the main focus this time.

T-shirt

August 10, 2010

I suggested this slogan in a discussion about new T-shirts on the Microsound mailing list some years ago. Not everyone understood the humor/irony, but apparently someone actually went ahead and made it.

Huh?

August 13, 2010

[Image: Huh]

Modality workshop – Invitation

August 17, 2010

Tuesday 28 September – Friday 1 October, BEK invites musicians, programmers and composers to a four-day workshop focusing on the software MODALITY. It is free to attend, but you have to cover your own travel and accommodation.

Modality is a tool for building electro-instruments in SuperCollider, currently under development by Jeff Carey and Bjørnar Habbestad. The workshop will elucidate this development through presentations, discussions and open code sessions.

Invited participants are Jeff Carey, Alberto de Campo, Wouter Snoei, Marije Baalman, Trond Lossius and Bjørnar Habbestad. Themes to be covered include modal control strategies, sensor input, DBAP spatialisation, ProxySpace, Quarks, mapping strategies, etc.
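For those unfamiliar with it, DBAP (distance-based amplitude panning) gives each speaker a gain that falls off with the distance from the virtual source to that speaker, normalised so that the set of gains preserves power. Here is a minimal Python sketch following the published DBAP formulation; the rolloff, blur and speaker layout are example values only, not workshop material:

import math

def dbap_gains(source, speakers, rolloff_db=6.0, blur=0.1):
    # Rolloff in dB per doubling of distance determines the exponent a.
    a = rolloff_db / (20.0 * math.log10(2.0))
    sx, sy = source
    # Effective distances, with a spatial blur to avoid division by zero.
    dists = [math.sqrt((sx - x) ** 2 + (sy - y) ** 2 + blur ** 2)
             for (x, y) in speakers]
    # Normalisation constant k makes the gains power-preserving.
    k = 1.0 / math.sqrt(sum(1.0 / d ** a for d in dists))
    return [k / d ** (a / 2.0) for d in dists]

# Example: six speakers in a unit-radius ring, source slightly off-centre.
ring = [(math.cos(i * math.pi / 3.0), math.sin(i * math.pi / 3.0))
        for i in range(6)]
print(dbap_gains((0.2, 0.1), ring))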

The workshop is open to participants skilled in working with SuperCollider.

Contact Lars Ove Toft at BEK for registration: lars.ove.toftATbek.no

Combining spatialisation and panoramic video

August 20, 2010

Getting back into daily work routines after the summer vacation also means getting back to the long bus ride home at the end of the working day.

Where the laptop was too large and the iPhone too small to be used on the bus, the iPad really shines for reading. The bus ride home yesterday was spent reading two chapters from a draft version of the PhD thesis of fellow Jamoma developer and spatialisation brother in arms Nils Peters. One of them discusses a range of artistic and research projects using ViMiC.

One of the productions discussed, THERE IS STILL TIME..BROTHER by The Wooster Group, uses a 360-degree projection screen. The audience is surrounded by the film's bewildering narrative space, where the action can only be seen and heard clearly through a virtual peephole that scans the circle, controlled by a member of the audience.

The combination of spatialisation and 360° panoramic video is interesting to me, as I am currently working on a similar project together with Karen Kipphoff for the upcoming ICCI 360 Festival in Plymouth.

Vimeo Universal Player

August 20, 2010

As one of three new features added in the last few days, Vimeo now has a new embed solution that makes embedded videos play on non-Flash devices as well (iPhone, iPad, …). The ability to have videos viewable on iDevices is a Plus-only feature, meaning that only videos uploaded by Plus members are supported.

Now on to modifying the Rails code for this site so that all videos that are part of the documentation of past works show up on iPad/iPhone. Shouldn't take more than a few minutes.

Update: DONE. All video documentation of past works is now viewable on mobile devices as well. Thanks, Vimeo!
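For reference, the universal player is simply an iframe pointing at player.vimeo.com. The site's helper is written in Ruby, but the idea can be sketched in Python like this; the dimensions and video id are arbitrary examples:

# Sketch of a helper building the universal (iframe) embed markup.
def vimeo_embed(video_id, width=640, height=360):
    return (
        f'<iframe src="http://player.vimeo.com/video/{video_id}" '
        f'width="{width}" height="{height}" frameborder="0"></iframe>'
    )

print(vimeo_embed(12345678))  # hypothetical video id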

Embedding Quartz Compositions in web pages

August 22, 2010

This will only work in Safari on Mac OS X, but it would be possible to create fallback methods for other browsers by rendering the composition to a movie and setting up a combination of JavaScript and HTML to load one or the other depending on platform and browser:


Screensaver by Aaron Welton

<embed type="application/x-quartzcomposer"
       src="my_composition.qtz"
       id="myComposition"
       width="300px"
       height="150px"
       opaque="false">
</embed>
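I have not built that fallback yet, but the decision logic is straightforward. Here it is sketched server-side in Python instead of in browser JavaScript; the file names and the User-Agent test are assumptions:

# Serve the live composition to Safari on Mac OS X, a rendered movie
# to everyone else. Hypothetical sketch, not production code.
QTZ_EMBED = ('<embed type="application/x-quartzcomposer" '
             'src="my_composition.qtz" width="300px" height="150px">'
             '</embed>')
MOVIE_EMBED = ('<video src="my_composition.mp4" width="300" height="150" '
               'controls></video>')

def embed_markup(user_agent):
    # Chrome also reports "Safari" in its User-Agent, so exclude it.
    is_mac_safari = ("Safari" in user_agent
                     and "Chrome" not in user_agent
                     and "Macintosh" in user_agent)
    return QTZ_EMBED if is_mac_safari else MOVIE_EMBED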

Note to self

August 22, 2010

Screen and speaker setup for ICCI 360:

[Diagram: screen and speaker setup for ICCI 360]

Jamoma controlling Web Service

August 22, 2010

[Screenshot: jcom.webservice in Max]

Today saw new functionality added to Jamoma. The jcom.webservice component enables controlling and monitoring Web Sharing (the service that lets the computer function as a web server) from within Max and Jamoma.

It does so by internally calling an AppleScript that, behind the scenes, opens the Sharing preference pane, gets the setting for Web Sharing, and changes it if required.

One potential use is with the Fantastick app for iPhone and iPad. The app enables images to be downloaded from the web and used to build multitouch user interfaces, communicating with Max over the network.

An obvious challenge when depending on web downloads for a performance interface is the possibility of web access being unavailable in a performance situation. If the images are instead hosted from the computer running Max, they will remain available, provided that a local wifi network can be set up and Web Sharing is enabled on the Max computer.
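The Jamoma component does this through AppleScript as described above. As an illustration of the same idea outside Max: Web Sharing on Mac OS X of this era is simply Apache, so a script can toggle it with apachectl. This is a hypothetical sketch, not the jcom.webservice implementation:

# Toggle Web Sharing (Apache) from Python; requires admin rights.
import subprocess

def set_web_sharing(enabled):
    action = "start" if enabled else "stop"
    subprocess.run(["sudo", "apachectl", action], check=True)

set_web_sharing(True)  # host the interface images locally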

iPad as auditory scene control interface

August 23, 2010

Over the last few days I have been developing solutions for using the iPad as a multitouch device controlling auditory scenes. This will have a very practical and immediate application: At my studio at the art academy I have currently rigged 6 speakers according to the setup required for ICCI 360.

However, the desk and laptop are positioned outside the ring of speakers, and with the rather challenging acoustics of the studio it is sometimes difficult to sit at the desk and properly hear the effect of the spatial positioning of the sources that I am performing. With the iPad, I will be free to move away from the desk and step inside the circle.

Late last week I got a solution working based on Fantastick. This is a straightforward and well-documented approach. I found the Core 2D drawing functions to be slow, but OpenGL ES rendering is snappy. Starting off from the help file and adding some FTM and MnM magic, I now have a working and responsive solution for a multitouch cousin of the ICST ambimonitor object, as illustrated here. I am currently busy wrapping the patch into a Jamoma module:

[Screenshot: Fantastick-based multitouch interface]
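The patch itself relies on FTM and MnM inside Max, but the receiving logic is easy to sketch outside it. Here is how the incoming multitouch messages might be mapped to normalised source positions in Python, using the python-osc package; the OSC address pattern and argument layout are assumptions, and Fantastick's actual protocol may differ:

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# finger id -> (x, y) position of the corresponding sound source
sources = {}

def on_touch(address, finger_id, x, y):
    # Normalise pixel coordinates, assuming a 1024 x 768 iPad screen.
    sources[finger_id] = (x / 1023.0, y / 767.0)
    print(f"source {finger_id} moved to {sources[finger_id]}")

dispatcher = Dispatcher()
dispatcher.map("/touch", on_touch)  # hypothetical address pattern
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()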

Here is a short demo video of it in action while I'm busy spatialising all 11 voices of the famous 4'33" piece:

The only potential downside of this solution is that it depends on image files being downloaded to the app from the internet. If the internet is unavailable in a performance situation, it's a bummer. That is why I worked yesterday on a solution for controlling Web Sharing from Max, so that the image files can be hosted locally.

By accident I discovered another possible strategy today. The Avatron Air Display app turns the iPad into a second screen. As seen below, this makes it possible to use ambimonitor itself on the iPad. Screen updating on the iPad is somewhat slow, and it is not multitouch, but the reporting of updated positions back to the main Max application seems fast and reliable. With the added possibility of monitoring the actual spatialised sound, this would also work quite well as a rapid prototyping solution.

[Screenshot: ambimonitor on the iPad via Air Display]

A long-term goal is to be able to create this kind of app myself, custom-tailored to my particular needs.

Could TouchOSC or OSCemote be used?

August 24, 2010

[Screenshot: TouchOSC]

After yesterday's post I have received some questions and suggestions about looking into TouchOSC. Thanks for the feedback! Back when the blog was hosted using Zope, I got bogged down with spam comments and pingbacks to the degree that I simply disabled the whole lot. When revamping the site as a Rails app I didn't even bother to develop a solution for comments, so it has become pretty much a monologue…

Anyway, TouchOSC is one of several iPhone/iPad applications available that can communicate with Max using OpenSoundControl. It is a nice application, and could e.g. be used to mimic the Monome. Part of what makes TouchOSC attractive is the fact that it offers possibilities for customizing the user interface.

Another interesting app offering the same, using standard Max objects to build the interface, is c74.

But when it comes to the kind of interface I am currently working on, they both fall short. As the screenshot illustrates, TouchOSC has a GUI widget for controlling two-dimensional positions. But the widget only caters for one point, so one would need to place several of them next to each other. That limits spatial resolution, and more importantly it greatly reduces the intuitive visual feedback on how the different sources relate to each other.

[Screenshot: OSCemote]

The OSCemote app gets closer, with genuine support for multiple points within the same interface. However, it is not able to keep displaying the positions on screen once the fingers are released.

If the features for 2D-positional data of the two applications could be bridged, that would make for a nice addition to both of them.

And while I'm at it with feature requests, I am also wondering what Stantum (previously JazzMutant) will be up to next. IMHO the software developed for the Lemur would make for an awesome iPad app. Just take a look at the work Mathieu Chamagne was doing several years ago:

PS: The fancy framing of the screenshots is courtesy of the handy iPhone Screentaker application.

Shake the Tree

August 24, 2010

SHAKE THE TREE by Maria Udd and Stine Karlsen was commissioned as a public art work for Nordahl Grieg Videregående Skole, which opened in 2010.

Architecture by Link Signatur AS. Ingrid Berven and Trond Lossius served as artistic consultants for the public art commission, which had a total budget of NOK 2,000,000.

Norwegian Music Technology Days 2010

August 30, 2010

September 16 – 17, Norwegian Academy of Music

The program for the Norwegian Music Technology Days 2010 is now available. Registration details can be found in the same place.

SoundEffects – An Interdisciplinary Journal of Sound and Sound Experience

August 31, 2010

SoundEffects is a new international peer-reviewed journal on sound and sound experience, bringing together a plurality of theories, methodologies, and historical approaches applicable to sound as both mediated and unmediated experience. The journal primarily addresses disciplines within media and communication studies, aesthetics, musicology, comparative literature, cultural studies, and sociology for a humanities-based interdisciplinary approach to sound.

The call for abstracts for the first issue is currently out, focusing on the question of the epistemological potential of sound. How does sound provoke and influence the way in which we experience the world? How can we talk about the phenomenological, bodily experience that sound produces?

More than ever, everyday life is mediated by a multitude of digitized and mechanically reproduced sounds. In Michael Bull’s words: “Waking, walking, driving, working and even falling asleep are all done to music or some other acoustic accompaniment” (Bull, 2003). Never has urban life been noisier. Yet within philosophical, sociological, and aesthetic frameworks, the world is still mostly conceived of as a visual reality. Sound produces meaning in numerous ways, both through the phenomenological, bodily experience of sound and through emerging and changing discourses of cultural practice. Responding to this condition, contemporary sound research must reflect upon the various ways in which the world is experienced through sound.