
Archive for June 2009

GIT and SVN (part II)


Cool! I have git and svn working together.

I’ll have to fill in details on how I set it up later, but here are links to web pages where I found the necessary info to set it up:
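Until I write up the details, here is a rough sketch of a typical git-svn round trip (the repository URL is a placeholder and the exact commands depend on your setup):

```shell
# Clone an existing SVN repository into a local git repository
# (URL is a placeholder)
git svn clone https://svn.example.org/project/trunk project
cd project

# Work locally with ordinary git commits
git add .
git commit -m "Local work"

# Fetch new SVN revisions, rebasing local commits on top of them
git svn rebase

# Push local commits back to the SVN repository as SVN revisions
git svn dcommit
```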

Michel Chion: Guide des Objets Sonores (1983). English translation.



This seminal text by Michel Chion has been translated into English by John Dack (Senior Research Fellow, Lansdown Centre for Electronic Art, Middlesex University) and Christine North, and is available for download from The ElectroAcoustic Resource Site (EARS).

And here is an interview with Michel Chion from 2008.

On artists that write code and artists that do not


by Pall Thayer, July 2008.

Regarding “code” as a medium (once again). I think we have to look a ways back in time to get at the real issues here. We could almost say that differing opinions regarding artists’ use of code are Marshall McLuhan’s fault, but I think it’s actually the way his work has been taught in new media art programs that is at fault. McLuhan talks about media as the end mediator: that which delivers the “message” to the consumers. Because of the way this has been taught, that has become the “de facto” definition of a medium, even within the arts. However, that’s the exact opposite of older definitions of media within an art context. An artist’s medium used to be the material that he/she manipulated to deliver a message. Both are equally correct but occur at opposite ends of the scale. A medium is simply something that occurs in between, and it can occur at any point between the artist and the viewer. Netart, and the way various artists have approached it, has made this whole “system” a bit more complex.

Amongst the median elements of a work of netart we have things like: code, concept, network, computer, screen, etc. It’s important to determine what the “medium” is because, as McLuhan tells us, that’s the “message”. One thing that really complicates this is that, as an artist who writes code, I don’t think that “my medium” is the same as the “viewer’s medium”. My medium is the code. That’s what I shape and manipulate to convey my “message”. The viewer’s medium can be something else. It could be the Internet or the computer or the screen, depending on how they regard the work. It could even be the code, as long as I reveal it. But I’m not really in a position to dictate to the viewer what they may or may not refer to as “the medium”. That depends on their own experience. Regardless, whatever I consider “my medium” has a big impact on the nature of the work itself. In many ways it defines and guides the creative process.

Artists who produce netart but rely on collaborators to write code for them will naturally produce different types of work. The code is not their medium and therefore doesn’t “define and guide the creative process”. Something else does. This does not produce a qualitative difference, just a difference.

This is why it “matters” whether an artist writes code or not.


This work is licensed under the Creative Commons Attribution 3.0 Unported License

Paper accepted for SMC 2009



The paper A stratified approach for sound spatialization by Peters et al. will be presented as a poster at the 6th Sound and Music Computing Conference in Porto, July 2009.

The paper, written by Nils Peters (CIRMMT), Trond Lossius (BEK), Jan Schacher (ICST), Pascal Baltazar (GMEA), Charles Bascou (GMEM) and Timothy Place (Cycling ’74), proposes a multi-layer approach to mediating across the essential components involved in sound spatialization. This approach will facilitate artistic work with spatialization systems, a process which currently lacks structure, flexibility, and interoperability.

I will be attending the conference, sponsored in part by the COST IC0601 Action on Sonic Interaction Design.

New Max goodies




MusicSpace brings the power of constraint propagation algorithms to the real-time control of musical parameters. This new version substantially improves on the original one (no longer distributed) by being totally open (any Max parameter can be controlled) and by bringing an improved constraint propagation algorithm. MusicSpace can be used to control any spatialisation parameter (see the video showing MusicSpace controlling Ircam’s Spat), but also music mixing (see the Beatles remix), as well as effects control or even real-time control of synthesis parameters. MusicSpace is available for free as a Max Java (mxj) object, along with several example patches and demonstration videos, from:



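The general idea of constraint propagation over parameters can be illustrated with a tiny Python sketch. This is illustrative only: the names Param, constrain and set are made up for this example, and this one-way propagation scheme is far simpler than MusicSpace’s actual algorithm.

```python
# Minimal one-way constraint propagation sketch (illustrative only; not
# MusicSpace's actual algorithm): changing a source parameter pushes
# updates through all parameters constrained to it.

class Param:
    def __init__(self, value=0.0):
        self.value = value
        self._constraints = []  # list of (target, function) pairs

    def constrain(self, target, fn):
        """When this parameter changes, set target to fn(new_value)."""
        self._constraints.append((target, fn))

    def set(self, value):
        self.value = value
        for target, fn in self._constraints:
            target.set(fn(value))

# Example: keep a reverb send 0.5 below the source gain, clipped at 0.
gain, reverb_send = Param(), Param()
gain.constrain(reverb_send, lambda g: max(0.0, g - 0.5))
gain.set(0.8)
print(reverb_send.value)  # approximately 0.3
```

Because set recurses through targets, chains of constrained parameters propagate automatically; a real system would also need cycle detection and bidirectional constraints.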
Mikhail Malt and Emmanuel Jourdan have announced a public beta of Zsa.Descriptors, a library of MSP externals for real-time sound descriptor analysis. Grab it here

Zsa.Descriptors is currently Mac OS X only, although several third-party developers have offered assistance towards compiling it for Windows as well.

I have not yet had the time to look into it and see how its list of analysers/descriptors matches the analysis done as part of CataRT. Still, the fact that CataRT is based on FTM and Gabor while Zsa.Descriptors is implemented as standard MSP externals is more than enough to make both valuable additions to the toolbox.
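A “sound descriptor” here means a scalar feature computed per analysis frame. One of the most common, the spectral centroid (the amplitude-weighted mean frequency of the spectrum), can be sketched offline in plain Python. This illustrates the general idea only and is not Zsa.Descriptors code:

```python
import cmath
import math

def spectral_centroid(frame, sr):
    """Spectral centroid of one frame: amplitude-weighted mean frequency in Hz."""
    n = len(frame)
    # Naive DFT over the first half of the spectrum (real-valued input).
    mags = []
    for k in range(n // 2):
        x = sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        mags.append(abs(x))
    freqs = [k * sr / n for k in range(n // 2)]
    total = sum(mags)
    return sum(f * m for f, m in zip(freqs, mags)) / total if total else 0.0

# A 1 kHz sine sampled at 8 kHz lands exactly on a DFT bin for n = 64,
# so its centroid comes out at 1000 Hz.
sr, n = 8000, 64
frame = [math.sin(2 * math.pi * 1000 * t / sr) for t in range(n)]
print(round(spectral_centroid(frame, sr)))  # → 1000
```

A real-time implementation like Zsa.Descriptors would use an FFT and windowing per signal vector rather than this naive DFT, but the descriptor itself is the same weighted mean.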


Licensed under a Creative Commons Attribution 3.0 Norway License. Web site hosted by BEK.