Creating new performance software for digital artists


For the show at The Emporium I created a very simple circuit using Pure Data to perform the live mix (edit) of the film. I originally intended to expand on this circuit to make a more general tool for live digital media improvisation and for the generation of future installation work. Unfortunately, Pure Data is not the most stable platform to use in a live environment, and it carries a lot of vagaries from its audio-processing roots that compromise (or confuse) its possibilities as a visual performance tool. That said, it is a lot closer to my own ideas of how digital art tools should behave, in that all data is, for the most part, treated transparently, be it audio, video, still images, text, or manipulations and effects. It is node-based software, which allows for arbitrary experimentation with the flow of data. It takes what I personally feel is the correct direction for digital media tools, which is to model the process and not the data. Modelling data, forcing it into a structure, limits the manipulation, interpretation and general use of that data.

A structure does, however, need to be applied to data in order to leverage it at all (for example, encapsulating it in a container such as an AVI or MOV file in order to play it as a time-based media sequence), but these kinds of abstractions should only be applied to fulfil a specific process, and should not be applied prior to the actual creation of the process (the artistic idea). Enforcing such structures not only limits the possible explorations of the media (data) but also sets within the artist a series of constraints that manifest themselves as a presentiment of what the data represents. These constraints lead directly to the creation of work-of-a-type, which is philosophically in conflict with the flexibility that the digital medium should be capable of.

Some of these constraints have become the accepted norms for digital media (standards, recreations of analog approaches, etc.) and have led the design of the software we use and the interfaces it provides.

The Software Specification

  • General Requirements
  1. Firstly, for myself, I need a piece of software geared towards digital film improvisation rather than VJing, though it must be capable of providing VJ tools too. My own requirements here are quite simple: I need something akin to ‘tape decks’, in that I can assemble sequences of shots prior to a performance. VJ software on the whole is ‘shot-centric’, requiring the performer to choose shots as they perform, which has the result of enforcing loops (unless the artist wishes to be extremely busy) and leaves the artist less able to react to any concept of creating a narrative. What I need is continuously playing sequences of shots that I can skip forward and backward through (when I’m improvising music I’m reacting to a stream, an evolving composition, and not a singular sound), and that are therefore more capable of feeling out a narrative.
  2. It needs to be able to unpack source material structured in any format into raw data, which can then be used either to alter other data or to merge into alternative abstractions and take new forms. Malleable representation.
  3. It needs to have a flexible interface capable of sending data to, and receiving data from, external devices and other software (OSC, MIDI, etc.).
  4. It needs to have a simple API allowing additional functions to be added, as well as supporting current visual APIs such as OpenGL and Frei0r. Some of the functions required will be mathematical, along with other arbitrary coded data manipulations.
  5. It needs to have an interface that is flexible and intuitive enough that it encourages artists to experiment with the digital medium in order to create new forms of work.
  • Software workflow

Each stage of the workflow should produce modular components that the user can reuse to create individual performances.

The first phase of the workflow is to gather together the initial source material into collections and/or linear sequences (video, audio, stills, etc).  These collections should be selectable as sources either prior to a performance or during a performance. There should be no difference in behaviour to the user between built sequences of shots or collections of individual shots.
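The phase-one behaviour (and the ‘tape deck’ requirement above) can be sketched in a few lines of Python. The `Shot` and `Source` names here are hypothetical placeholders of mine, not part of any committed design; the point is that a built sequence and a loose collection present the same skip-forward/skip-backward behaviour to the performer:

```python
from dataclasses import dataclass

@dataclass
class Shot:
    """A single piece of source material (video, audio, still, ...)."""
    name: str
    duration: float  # seconds

class Source:
    """A selectable source: either a pre-built sequence or a loose
    collection of shots -- the deck treats both identically."""
    def __init__(self, shots):
        self.shots = list(shots)
        self.position = 0

    def current(self):
        return self.shots[self.position]

    def skip(self, offset):
        # Skip forward (+) or backward (-) through the stream,
        # clamping at the ends rather than forcing a loop.
        self.position = max(0, min(len(self.shots) - 1,
                                   self.position + offset))
        return self.current()

deck = Source([Shot("wide", 4.0), Shot("close", 2.5), Shot("detail", 3.0)])
deck.skip(+1)   # jump ahead to the next shot mid-performance
deck.skip(-1)   # step back to feel out the narrative again
```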

The second phase of the workflow is to build the circuit for the performance. This is where the user decides how the data is interpreted and/or manipulated, and also sets any outputs or inputs for the performance. This will be a node-based interface mostly following the approach of the compositing software Nuke, allowing the user to connect manipulations/sources together on a stream of data (source, video/audio/still, etc.), as well as make connections between the variables and outputs inherent within each node and the variables and inputs of other nodes (such as linking a movement sensor to a video mixer’s crossfader). Following the Nuke approach (and Pure Data’s too), any circuit built should be collapsible into a single node (Nuke groups, Pure Data sub-patches, abstractions) that can be used transparently in other circuits.
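A minimal sketch of how such node-to-node parameter links might behave, using the movement-sensor/crossfader example; every class, method and attribute name here is an illustrative assumption, not a settled API:

```python
class Node:
    """A processing node with named parameters and named outputs."""
    def __init__(self, name, **params):
        self.name = name
        self.params = dict(params)
        self.links = {}      # param name -> (source node, output name)
        self.outputs = {}    # output name -> current value

    def link(self, param, source, output):
        # Drive one of this node's parameters from another node's output,
        # e.g. a movement sensor driving a video mixer's crossfader.
        self.links[param] = (source, output)

    def update(self):
        # Pull linked values each time the circuit is evaluated.
        for param, (source, output) in self.links.items():
            self.params[param] = source.outputs[output]

sensor = Node("movement_sensor")
mixer = Node("video_mixer", crossfade=0.5)
mixer.link("crossfade", sensor, "position")

sensor.outputs["position"] = 0.8   # a new sensor reading arrives
mixer.update()                     # the crossfader now follows the sensor
```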

The third phase of the workflow is to build the interface the artist will work with (this stage would be skipped if the artist is creating a stand-alone installation work). This will follow the method used by Native Instruments’ Reaktor software, which automatically generates an interface based on the circuit that has been built. The generated interface should then be modifiable. As I’m taking a modular approach, it should be possible to have more than one interface available for a single circuit, and to allow the user to switch between interfaces while performing.

To better explain how the software will function, I’ve sketched an example of how I see it being used. The example is a basic mixer, built graphically by connecting simple nodes together like this.

Each node, like all nodes within the circuit-builder interface, will have an associated parameter window where its behaviour can be set, and where any parameters that control its behaviour can be linked to parameters of other nodes within the circuit, or be set by expressions. Once a circuit is built, the software will then build an interface for the user based on the circuit’s structure (à la Reaktor) and on which parameters the user has asked to be published on the interface. There is no functional difference internally between a preview screen and the output screen; the only differences are where the user asks for it to be displayed (within the interface or on a separate window/desktop) and its size. All screen nodes created in a circuit can be rescaled within the interface.
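The Reaktor-style auto-build step could be sketched like this, with nodes reduced to plain dictionaries and a widget type guessed from each published parameter’s value. All names and the widget-picking rule are placeholders of mine, not a committed design:

```python
def build_interface(circuit, published):
    """circuit: list of {"name": ..., "params": {...}} node descriptions.
    published: set of (node name, param name) pairs the user asked to
    expose on the generated interface."""
    controls = []
    for node in circuit:
        for param, value in node["params"].items():
            if (node["name"], param) in published:
                # Crude widget choice: continuous values get a slider,
                # everything else a text field.
                widget = "slider" if isinstance(value, float) else "field"
                controls.append({"node": node["name"], "param": param,
                                 "widget": widget})
    return controls

circuit = [
    {"name": "mixer", "params": {"crossfade": 0.5, "transition": "fade"}},
    {"name": "screen", "params": {"width": 640.0}},
]
ui = build_interface(circuit, {("mixer", "crossfade"), ("mixer", "transition")})
# -> a slider for the crossfader and a field for the transition type
```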

The interface for this basic mixer circuit will (roughly) take the form below. The real interface will have a lot more controls, particularly on the mixer, to allow the crossfader to be more assignable and to control the transition type (wipe/cut/fade, etc.).

This is a really simple example; in the next few days I will publish more accurate mockups and more complex circuit examples featuring OSC nodes, expression-based nodes and circuits that contain live camera inputs, as well as sound and visual effects.
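As a taste of what an expression-based node might do, here is one possible sketch: a parameter driven by evaluating a small expression in a restricted namespace. The function name and the set of allowed symbols are assumptions of mine, not a settled design:

```python
import math

def eval_expression(expr, env):
    """Evaluate a parameter expression in a restricted namespace --
    a sketch of how an expression node might drive a parameter.
    env maps variable names (e.g. other node outputs) to values."""
    allowed = {"sin": math.sin, "cos": math.cos, "abs": abs,
               "min": min, "max": max, "pi": math.pi}
    # Empty __builtins__ keeps the expression limited to the names above.
    return eval(expr, {"__builtins__": {}}, {**allowed, **env})

# Oscillate a crossfader from a performance clock, mapped into [0, 1]:
crossfade = eval_expression("(sin(t * pi) + 1) / 2", {"t": 0.5})
# crossfade == 1.0 at t == 0.5
```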

  • Development Platform

After researching various approaches, I’ve decided to build the software using open-source libraries and APIs. The language I’ll use will be Python, which has a structure and methodology that fits perfectly with the kind of application I’m trying to build. For processing the video and audio I will be using the media library GStreamer, which again has design principles that exactly fit my requirements.

For creating the interface I will use Glade for most of the windows and controls, and GooCanvas to create the node-based interface of the circuit builder. I will give more details on these development tools and libraries, and my reasons for choosing them, in later posts.

