An Update on Tim Thompson’s Space Palette Kinect-Based Instrument Including Video from STEIM

image

I first met Tim Thompson last spring when I performed in Sunnyvale. He invited me over for an “Exclusive First Look at Tim Thompson’s Kinect-Based Instrument: MultiMultiTouchTouch”. Since then we’ve kept in touch, and he’s made incredible progress on this three-dimensional instrument, so I thought I’d do an update post.

image

Now called Space Palette, it offers more windows of control and expressiveness, and it also has interactive visuals that react to gestures. Space Palette’s design makes it a fantastic casual instrument, allowing for walk-up players. Tim has been spreading electronic musical joy by bringing it to public events.

I got to play an updated version of Space Palette when Tim and I were in Atlanta in February. It’s expressive enough to scale beyond casual play, and Tim has become quite a virtuoso with it, as you can see in his latest performance at STEIM in Amsterdam below.

Watch embedded video.

I encourage you to visit http://spacepalette.com and http://timthompson.com/ to follow Tim’s progress.

Mark Mosher
Electronic Music Artist, Boulder, CO
Official Web Site: www.MarkMosherMusic.com
Listen/Download Albums: www.MarkMosherMusic.com/music.html
www.ModulateThis.com

An Exclusive First Look at Tim Thompson’s Kinect-Based Instrument: MultiMultiTouchTouch

modulatethis_tim_thompson_multimultitouchtouch_banner

Tim Thompson is a software engineer, musician, and installation artist. He was recently mentioned in Roger Linn’s post “Research Project: LinnStrument — A New Musical Instrument Concept”, where Roger credits Tim with writing a program that “translates the TouchCo's proprietary USB messages into TUIO messages sent over OSC.”

I met Tim at my recent concert at the Art Institute of California/Sunnyvale and he was kind enough to invite me over to see his latest development project, the MultiMultiTouchTouch. This custom solution offers players any number of arbitrarily-shaped multitouch areas with three-dimensional spatial control. Interaction with this space allows users to control and play virtual synthesizers using nothing but a Microsoft Kinect as the controller.

Ironically, the concept shown in Moog Music’s April Fools video “Introducing the Moog Polyphonic Theremin” is not only a reality, but Tim has one-upped the idea by providing polyphonic spatial control in multiple “frames”, AND more granular control than a Theremin thanks to finger blob detection. In short, MultiMultiTouchTouch is like having a polyphonic, multitimbral Theremin that can detect not only hand movements but finger movements as well – from multiple players!
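To give you a feel for what’s going on under the hood, here is a toy sketch of the general idea – my own illustration, not Tim’s actual code. The basic trick is to carve an interaction window out of the Kinect’s depth image and treat any blobs inside it as touch points; the depth window and blob-size threshold below are made-up values.

```python
from scipy import ndimage

def detect_blobs(depth_frame, near_mm=600, far_mm=900, min_pixels=20):
    """Toy finger-blob detector for one Kinect depth frame.

    depth_frame: 2D numpy array of depth values in millimeters.
    Pixels inside the [near_mm, far_mm] window count as touches;
    connected regions bigger than min_pixels become cursors.
    """
    # Keep only pixels inside the interaction window in front of the sensor.
    mask = (depth_frame > near_mm) & (depth_frame < far_mm)

    # Label connected regions; each region is a candidate finger/hand blob.
    labels, count = ndimage.label(mask)

    cursors = []
    for blob_id in range(1, count + 1):
        blob = labels == blob_id
        if blob.sum() < min_pixels:
            continue  # ignore noise specks
        cy, cx = ndimage.center_of_mass(blob)
        # Mean depth inside the blob gives a "push" (z) dimension.
        cursors.append((cx, cy, depth_frame[blob].mean()))
    return cursors
```

Each cursor then carries an x/y position plus a depth value, which is what makes the “multitouch in mid-air” behavior possible.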

Luckily I brought my video camera along and recorded Tim describing and demoing the technology. I also give the MultiMultiTouchTouch a try at the end of the video. So, without further ado, I present the video “An Exclusive First Look at Tim Thompson's Kinect-Based Instrument: MultiMultiTouchTouch”.

Watch embedded video in HD

An Exclusive First Look at Tim Thompson’s Kinect-Based Instrument: MultiMultiTouchTouch

Components
In summary, Tim built the instrument with the following components:

The raw output of this controller is OSC messages in the standard TUIO multitouch format. Parameters of the software can be controlled with JSON-formatted messages.
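For the curious, here is a minimal sketch of what listening for those messages could look like in Python with the python-osc library. The address and argument layout follow the published TUIO 1.1 cursor profile; the port and profile choice are assumptions on my part, not details of Tim’s setup.

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_tuio(address, *args):
    # TUIO bundles carry "alive", "set", and "fseq" messages; a "set"
    # message describes one cursor: session id, normalized position, etc.
    if args and args[0] == "set":
        session_id, x, y = args[1], args[2], args[3]
        print(f"cursor {session_id}: x={x:.3f} y={y:.3f}")

dispatcher = Dispatcher()
# 2D cursor profile; a depth-aware sender might use /tuio/25Dcur instead.
dispatcher.map("/tuio/2Dcur", on_tuio)

# 3333 is the conventional TUIO port; adjust to match the sender.
BlockingOSCUDPServer(("127.0.0.1", 3333), dispatcher).serve_forever()
```

Because TUIO rides on plain OSC, any TUIO-aware client can consume the controller’s output without knowing anything about the Kinect.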

Events
If you're near Silicon Valley, you can play with this controller on April 10 at the Stanford DIY Musical Instrument Tailgate Party, or at the Kinect Hackathon at Hacker Dojo. Tim will also be using it in installations at Lightning in a Bottle and Burning Man this year.

Pass It On
I want to reiterate: this is real and NOT a late April Fool’s joke. Incredible work, Tim! Congrats. I can’t wait to see where Tim takes this, and I look forward to the possibility of doing some MultiMultiTouchTouch compositions and performances myself. To help Tim promote his work, share this video.

Links


Mark Mosher
Electronic Music Artist & Synthesist, Boulder, CO
www.markmoshermusic.com
www.modulatethis.com

Percussa AudioCube Production and Performance Notes for “I Hear Your Signals”

markmosher_rehearsing_01

For my original music album “I Hear Your Signals” (download the album free) I use Percussa AudioCubes as performance controllers. In this post I’ll give you all the geeky details about how the controllers were applied in the project.

AUDIOCUBES AND MIDIBRIDGE
I used 4 AudioCubes plus Percussa’s free MIDIBridge app on Windows to configure and route AudioCube signals to Ableton Live. I use the same MIDIBridge patch for every song, which allows for consistent and predictable data mapping from the cubes to Ableton Live.

In general, I play a lot of the notes you hear on the album live, via keyboards, Theremin, and Tenori-On. I tend to use the cubes as controllers – for scene launching and for real-time modulation of effects and synth parameters – and only use them for triggering notes from time to time.

CUBE CONFIGURATION
The AudioCubes are configured in the following modes:

  • Cube 1 – Sensor (the red cube at 9:00 in the picture above): This cube sends MIDI CC information back to Live. I configure each side of the cube to give me visual feedback, with each cube face set to a different color. The closer my finger or hand is to the sensor, the brighter the light. Currently, Sensor cubes need to be wired via USB.
  • Cubes 2 & 3 – Receivers (white cubes in the above picture): These send MIDI notes back to Live when a signal is received from Cube 4. I also send them RGB light sequences via MIDI clips in Ableton. The cubes then become light show elements and also offer visual feedback. These cubes are also plugged in via USB so they can receive high-speed transmissions via MIDI clips.
  • Cube 4 – Transmitter (green in the picture above): This cube is wireless. Aligning the faces of this cube with the faces on Cubes 2 & 3 triggers MIDI notes back to Ableton Live. (See the sketch just after this list for the data flow.)
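To make that data flow concrete, here is a rough Python sketch (using the mido library) of the kinds of MIDI messages the cubes produce by the time they reach Live. MIDIBridge handles the real mapping; the port name, CC numbers, and scaling below are placeholders of my own, not my actual settings.

```python
import mido

# Illustrative only: MIDIBridge does this mapping for real.
out = mido.open_output("AudioCubes Bridge")  # hypothetical virtual MIDI port

def proximity_to_cc(face, distance):
    """Sensor cube: map a face's proximity reading (0.0 = far,
    1.0 = touching) to a MIDI CC that Live can map to any parameter."""
    value = int(round(max(0.0, min(1.0, distance)) * 127))
    out.send(mido.Message("control_change", control=20 + face, value=value))

def face_aligned(note=60, velocity=100):
    """Transmitter/receiver: a face alignment between Cube 4 and
    Cube 2 or 3 arrives in Live as an ordinary note-on."""
    out.send(mido.Message("note_on", note=note, velocity=velocity))
```

The point is that by the time the signals hit Live, they are plain MIDI CCs and notes, so everything downstream is ordinary Ableton MIDI mapping.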

MIDIBRIDGE AND ABLETON CONFIGURATION
I then use Ableton Live’s MIDI Map mode to map the MIDI CC and note data coming from the cubes to various functions within Live.
For Cube 1, CCs are mapped to device parameters and macros. These, in turn, are often routed to parameters within VSTs. For example, a cube face might modulate delay time in Ableton’s native Ping Pong Delay effect, or the CC might map to a filter on a VST synth. Below is a snapshot of the MIDIBridge settings for Cube 1 (click to enlarge).

markmosher_audio_cube_config_cube1
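Numerically, what a MIDI-mapped control in Live does is essentially a linear scaling of the 7-bit CC value into the mapped parameter’s range. The delay-time range in this tiny example is made up for illustration.

```python
def cc_to_param(cc_value, lo, hi):
    """Scale a 7-bit CC value (0-127) linearly into a parameter range,
    e.g. a delay time in milliseconds."""
    return lo + (cc_value / 127.0) * (hi - lo)

# Example: a cube-face CC sweeping a delay time from 1 ms to 999 ms
# (range made up for illustration).
print(cc_to_param(64, 1.0, 999.0))  # ~504 ms at mid travel
```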

For Cubes 2 & 3, notes are triggered when a face of the transmitter Cube 4 is detected. I route the notes to MIDI tracks holding Ableton instruments, VSTs, and/or racks. In some cases I route MIDI notes through a dummy track back to SugarBytes Artillery II running in a send or on the master track for effects. Since Artillery II triggers effects via notes rather than CCs, this method lets me control effects as well as play notes with signals from Transmitter cubes, which only send MIDI note information. In other words, by combining native Ableton effects with Artillery II, I can use any cube in the network to trigger effects.

CUBE USAGE FOR EACH SONG

1) “Arrival”

In this song I’m using AudioCubes as lighting and feedback elements in the live show. They were not used in composition or performance of the music. MIDI clips in Live are used to sequence the lights.
