
Percussa AudioCube Production and Performance Notes for "I Hear Your Signals"


For my original music album "I Hear Your Signals" (download the album free), I use Percussa AudioCubes as performance controllers. In this post I’ll give you all the geeky details about how the controllers were applied in the project.

I used 4 AudioCubes plus Percussa's free MIDIbridge app on Windows to configure and route AudioCube signals to Ableton Live. I use the same MIDIbridge patch for every song, which allows for consistent and predictable data mapping from the cubes to Ableton Live.

In general, I play a lot of the notes you hear on the album live via keyboards, Theremin, and Tenori-On. I tend to use the cubes as controllers for scene launching and for real-time modulation of effects and synth parameters, and only use them for triggering notes from time to time.

The AudioCubes are configured in the following modes:

  • Cube 1 – Sensor (the red cube at 9:00 in the picture above): This cube sends MIDI CC information back to Live. Each face of the cube is set to a different color for visual feedback: the closer my finger or hand is to the sensor, the brighter the light. Currently, Sensor cubes need to be wired via USB.
  • Cubes 2 & 3 – Receivers (white cubes in the picture above): These cubes send MIDI notes back to Live when a signal is received from Cube 4. I also send them RGB light sequences via MIDI clips in Ableton, so the cubes become light show elements and offer visual feedback. These cubes are also plugged in via USB so they can receive high-speed transmissions from the MIDI clips.
  • Cube 4 – Transmitter (green in the picture above): This cube is wireless. Aligning the faces of this cube with the faces of Cubes 2 & 3 triggers MIDI notes back to Ableton Live.

I then use Ableton Live's MIDI Map mode to map the MIDI CC and note data coming from the cubes to various functions within Live.
For Cube 1, CCs are mapped to device parameters and macros. These, in turn, are often routed to parameters within VSTs. For example, a cube face might modulate delay time on Ableton's native Ping Pong Delay FX device, or a CC might map to the filter on a VST synth. Below is a snapshot of the MIDIbridge settings for Cube 1 (click to enlarge).
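Under the hood, a MIDI mapping like this boils down to a linear scale from the 7-bit CC range into the target parameter's range. Here's a minimal Python sketch of the idea; the parameter ranges are illustrative, not Live's actual internals:

```python
def cc_to_param(cc_value, lo, hi):
    """Scale a 7-bit MIDI CC value (0-127) linearly into a parameter range."""
    if not 0 <= cc_value <= 127:
        raise ValueError("MIDI CC data bytes are 7-bit (0-127)")
    return lo + (cc_value / 127.0) * (hi - lo)

# Hand far from the sensor face -> low CC -> short delay time;
# hand right at the face -> CC 127 -> the top of the range.
# (Both ranges below are made up for the example.)
delay_ms = cc_to_param(64, 1.0, 999.0)       # mid-range proximity
cutoff_hz = cc_to_param(127, 20.0, 20000.0)  # hand touching the face
```

The same scaling applies whether the CC lands on a native device parameter, a macro, or a VST control.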


For Cubes 2 & 3, notes are triggered when a face of transmitter Cube 4 is detected. I route these notes to MIDI tracks holding Ableton instruments, VSTs, and/or racks. In some cases I route MIDI notes through a dummy track to SugarBytes Artillery II running on a send or on the master track for effects. Since Artillery II triggers effects via notes rather than CCs, this method lets me control effects as well as play notes with signals from the Transmitter cube, which only sends MIDI note information. In other words, by combining native Ableton effects with Artillery II, I can use any cube in the network to trigger effects.
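Conceptually, this routing is a note dispatcher: some incoming note numbers land on instrument tracks, while others are forwarded to Artillery II as effect triggers. A rough Python sketch of that decision, with entirely hypothetical note numbers (Artillery II's real key layout depends on how effects are assigned in the plugin):

```python
# Hypothetical note map: one effect per key. The actual note numbers
# depend on how effects are laid out in the Artillery II instance.
ARTILLERY_NOTE_MAP = {
    60: "loop",    # C3 (illustrative)
    62: "filter",  # D3
    64: "crush",   # E3
}

def route_note(note, velocity):
    """Decide what an incoming Receiver-cube note should do."""
    if velocity == 0:
        return ("note-off", note)      # releasing the key ends the effect
    effect = ARTILLERY_NOTE_MAP.get(note)
    if effect is not None:
        return ("effect", effect)      # forwarded to Artillery II
    return ("note", note)              # plays an instrument track
```

In Live, the "dispatcher" is simply the track routing: the dummy track decides which notes reach the effect plugin.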


1) “Arrival”

In this song I’m using AudioCubes as lighting and feedback elements in the live show; they were not used in the composition or performance of the music. MIDI clips in Live are used to sequence the lights.

2) “First Orbit”
This song was composed specifically to be performed with AudioCubes. Below is a video of the song performed live as part of a talk I gave at Ableton Denver. There is a link to the full talk at the bottom of this post.

Watch embedded video.

The Receiver cube to my right (Cube 2) is being used to launch scenes in Live. The cube on my left (Cube 3) is being used to send MIDI notes to Artillery II, which trigger effects. It’s also used to change Live’s Arpeggiator gate rate for an Absynth 5 track in discrete intervals.


At about the 2:00 mark, I start modulating parameters with the Sensor cube (Cube 1). The closer my fingers or hands are to a cube face, the deeper the modulation. I’m changing Arpeggiator Gate and Distance, Redux, and Ping Pong Delay dry/wet simultaneously. The Ping Pong Delay is on the master track, so near the end, around 2:50, I soak the entire mix in delay (the cube turns blue) to hide the transition to a new scene. At around 3:03, I’m using visual feedback from Cubes 2 & 3 to determine where I am in the scene, then use my hand to quickly dry out the mix a half note at a time. After another scene change, the song ends when I use the Transmitter cube and Receiver Cube 3 to send a MIDI note to Artillery II telling it to apply an extreme effect.

3) “Control Zone”

<a href="http://markmosher.bandcamp.com/track/control-zone">Control Zone by Mark Mosher</a>

Play embedded audio

This song is a combination of clips I composed with virtual synths and real-time performance on the Theremin (for Theremin notes, see this post). The Theremin allows for two dimensions of spatial control: pitch and volume. I added my Sensor AudioCube to the performance by simply placing the cube on the Theremin at the center, between the antennas, to reduce interference. In this position the cube is not sensed by the pitch antenna (right) and only slightly affects the volume antenna (left).


This second shot shows me rehearsing with this setup.


This is just an awesome setup, as I now get four additional dimensions of control without touching any physical knobs or buttons. I change the Theremin routing and cube mappings on the fly using Novation Launchpad buttons to arm and select tracks. Typically, the AudioCube faces are mapped to Ping Pong Delay dry/wet, filter, resonance, and Resonator dry/wet, which I use to emulate guitar feedback.

4) “I Can See Them”

<a href="http://markmosher.bandcamp.com/track/i-can-see-them">I Can See Them by Mark Mosher</a>
Listen to embedded song 

This song is performed entirely with AudioCubes and no other controllers. The opening drones and vocal samples are all triggered by interactions between the Transmitter cube and Receiver Cube 3. I use Cube 2 for scene changes. I make heavy use of Sensor Cube 1 to modulate synths and the mix. Sensor cube faces are mapped (in some cases with multiple mappings on one face) to parameters like Beat Repeat Grid, Delay, Chance, and Pitch, and to the X/Y performance pad in Camel Audio Alchemy (check out this post for details on using AudioCubes with Alchemy).

It becomes quite improvisational and in some ways generative based on my original sequences. I use Cubes 2 & 3 as lighting elements with synchronized MIDI clips. The video below is an excerpt of me rehearsing the song.

Watch embedded video

5) “When Connected”
AudioCubes are not used in this song.

6) “Celebration and Voices”
AudioCubes are not used in this song.

7) “Dark Signals”
AudioCubes are used as lights in this song.

8) “Resolute”
AudioCubes are used as lights in this song.

As with anything complex, getting things working at this level, to the point where you can transcend the technology, takes planning, experimentation, and rehearsal. For me, that’s part of the joy of electronic music :^)

One thing I want to point out is that light sequences are a great way to get immediate visual feedback from AudioCubes. Since you have complete control of the RGB commands via MIDI clips from your DAW, you can program visual cues to help you remember where you are in a scene. This also makes things more interesting for audience members, as they can see a correlation between your movements, the visual feedback, and the sound coming from the speakers. And since the cube sequences are MIDI clips, they are locked to tempo, which is quite cool.
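Because the cues live in a MIDI clip and are expressed in beats, the wall-clock timing of every color change follows the set's tempo automatically. A small Python sketch of that relationship (the beat positions and tempos are just examples):

```python
def light_cue_times(steps_in_beats, bpm):
    """Convert color-change positions (in beats) to seconds at a given tempo.

    Because the cues are stored as beats in a MIDI clip, changing the set's
    tempo shifts every cue proportionally -- they stay locked to the music.
    """
    seconds_per_beat = 60.0 / bpm
    return [round(b * seconds_per_beat, 3) for b in steps_in_beats]

# A color change on every beat of one 4/4 bar:
print(light_cue_times([0, 1, 2, 3], 120))  # -> [0.0, 0.5, 1.0, 1.5]
print(light_cue_times([0, 1, 2, 3], 60))   # same clip, slower tempo
```

Hardware light controllers driven by absolute time wouldn't get this for free; the MIDI-clip approach inherits tempo sync from the DAW.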

A note about album recording: for most songs on the album, I composed and arranged for live performance around my controllers, like the AudioCubes. I then rehearsed the songs for months and recorded them into Ableton Live in one pass, capturing and mixing pre-recorded scenes, real-time automation, and live play. Finally, I made minor edits and mixed down.


Mark Mosher
Electronic Musician, AudioCubist, Boulder CO