
Posts from August 2012

Music Monday: An Interview with Kent Barton (aka SEVEN7HWAVE) on His New Concept Album “CYBERIA”


Denver artist Kent Barton (aka SEVEN7HWAVE) just released a new concept album called CYBERIA. I first met Kent at the Ableton Colorado User Group a few years back, when he was just starting down the path to creating this album, so I thought it would be interesting to hear about his creative process. Oh, and Kent is also a member of the new Boulder Synthesizer Meetup.

First I’ll offer some links to the album, then the interview, followed by Kent’s social links. Kent is offering this new album “name your price” over on Bandcamp, and as always I encourage a purchase to show your support.


Hong Kong: 2050 A.D. You're about to inject a dose of mind-altering nanobots. This is the soundtrack to your trip.

Concept and Production: Kent Barton
Mastering: Tarekith at Inner Portal Studio
Vocals on Brain Zaps: Brittany Patterson
Field Recordings on No Passengers, Kowloon Bay, and Brain Zaps: swuing
Field Recording on 0100000101001001: James Tobin
Muse: Brittany Patterson
Creative Inspiration: Mark Mosher, Marc Wei, Matt Stampfle, and the Denver Ableton User Group

Interview with Kent Barton

Mark: Tell us a little bit about your musical background. What instruments do you play and how did you first get interested in electronic music?

Kent: I had some formal classical training on the violin as a child. Even though I got tired of the instrument by middle school, it did a good job of wiring my brain for music. Fast-forward to the start of college, and I decided to pick up the guitar to emulate my metal heroes. That was my introduction to the world of songwriting, bands, live shows, and the search for the perfect tone.

Back around 2004, I discovered the Trance station on Shoutcast (!). A year or two later I got my first proper introduction to electronic music, clubs, and raves, with artists like Ferry Corsten, Junkie XL, and Infected Mushroom. Eventually my waning interest in playing intricate guitar riffs was replaced by a newfound lust for producing music.

Mark: What inspired you to create an album about “mind altering nanobots” in 2050 A.D.?

Kent: I’ve always been a sci-fi freak with part of my brain permanently lodged in the future. Blade Runner was an obvious inspiration here, along with the cyberpunk movement. But it’s also a commentary on where we are today, and where we could be headed. Technology is a double-edged sword; it can liberate us or imprison us. The internet connects us all, but it’s also a giant Big Brother machine. These two opposing forces will be even more important in the future, as computers get smaller, faster, and implanted into our bodies.

Creatively, I was inspired by Reboot and I Hear Your Signals (editor’s note – I did not bribe Kent to say this :^) ). The idea of a badass album telling a story has been around for a long time (Operation: Mindcrime, I’m looking in your direction), but it never dawned on me to use the same technique for electronica until hearing these two albums.
Mark: What role did Ableton Live play in your creative and production process?

Kent: Occasionally I’d go lo-fi and hammer out a melody or chord progression on my guitar. But other than that, Ableton was the centerpiece of everything, from sketching out ideas to recording to arrangement to mixing. People keep bitching about when Live 9 is coming out. I honestly don’t care; the current version is powerful enough to do everything I want to do.
Mark: What were your go-to synthesizers for this project and what is it you like about them?

Kent: My mainstays were…

Sylenth1: When I think bass, I think old-school West Coast hip-hop smooth-ass warm sub. That’s what I was aiming for, and Sylenth delivered. I also used it for the pad sound on “Vimanas,” which was my obligatory nod to Vangelis.

Peach: This freeware synth from Tweakbench had exactly the chiptune sound I wanted for this album. Pure NES awesomeness…and it sounds even better with some spatial FX slathered on.

Plogue Chipsounds: I re-sampled Chipsounds for a lot of FX, as well as the main bleep lead on 0100000101001001. It’s an 8-bit emulation powerhouse.

Mark: There is a consistent palette throughout the album which helps give listeners a sense of the “universe” the story takes place in.  Did you have a sense of the palette from the beginning, or did this evolve as the production progressed?

Kent: Early on, I stumbled across a collection of incredible field recordings someone made while traveling in Hong Kong. This inspired the setting of the album. As I was writing, these served as the “glue” between each track. I also started with a simple equation that I thought might yield awesome results: Chiptune + Strings + Guitar – fusing the organic and electronic. But as the album evolved, I found myself downplaying the guitar element and bringing in more synth.
Mark: I love how you modulated the arpeggiator speeds in “No Passengers” and also changed the glitch speeds in “Brain Zaps”. Did you record real-time automation for this or use automation envelopes?

Kent: The changing arp speed on “No Passengers” was recorded in one pass. I like to limit myself to one or two takes to capture the moment and avoid endless tweaking. “Brain Zaps” was one of those cases where I forgot to connect a controller while arranging the track. Rather than stop the workflow, I just drew in the glitches by hand.
Mark: How do you feel composing against a storyline helped you keep the project moving to completion?

Kent: Having a storyline was incredibly helpful. It created a common thread throughout the songs, and added visual elements to the creative process. Sometimes I felt like a movie director, rather than a producer. Creating an environment and living in it was also a huge help – especially when I was stuck and didn’t know where to go next.

I can’t recommend this enough. If you’re a producer looking for inspiration that can drive an entire collection of songs, try thinking of a story to tell. You don’t need a deep plot or characters. Just a simple concept is enough to fuel that creative spark.

Mark: What is your next musical project?

Kent: I’m working with an incredible animator/visual artist on a video for “No Passengers.” I’ll be releasing that shortly.

I freak out if I’m not writing, so I’m also cobbling together the building blocks for my next album. I feel like I’ve found my own sound with Cyberia. Now I’m excited to evolve it and take it in new directions.


Electronic Musician, Boulder CO

Electro-Music 2012 Festival in NY Sept. 7-9


Electro-Music 2012 Festival Is Almost Here

The fantastic lineup for Electro-Music 2012 Festival has been announced. This wonderful event is held at the Greenkill Retreat Center in Huguenot, New York so if you live nearby I recommend you check it out. If you’ve never been, read my show report from my first trip here.

If you can’t make the event, make sure you mark your calendar and tune into the live stream.

More info and tickets at

See You At the Festival

I’m excited to announce I’ll be returning to Electro-Music 2012 Festival in NY for the third year running. Here is a blurb for my events that will appear in the program.

Friday Sept. 7, 10:30 PM - Expressive Visual Controllerism Concert
In his third year at Electro-Music, Mark will continue to explore the boundaries of controllerism with Ableton Live, coaxing even more expressive real-time performances out of his various virtual synthesizers. It’s all about original sci-fi techno songs and experimental improvs filled with classic and signature synth ear candy, performed on keys, grids (Launchpad/Tenori-On), Theremin, AudioCubes, and Lemur. Visual feedback FTW!

Saturday Sept. 8, 4:00-5:00pm - Creating and Controlling Signature Sounds
with Camel Audio Alchemy
Alchemy offers a great blend of performance features combined with deep synthesis, sample manipulation, resynthesis, granular synthesis, and built-in FX. There is also a mobile app that can act as a wireless multi-touch controller for Alchemy on your computer.  The goal of this talk is to shorten your learning curve so you can start creating your own expressive signature sounds for production and performance. Topics will include: Anatomy and signal flow, programming walk-throughs, tips on incorporating field recordings into patches, overview of modulation mapping, overview of FX, and performance mapping.

A Look Back at EM 2011

Playing the song "I Can See Them" live. I'm controlling 4 dimensions at the same time. Visuals behind me are by artists Project Ruori, who are processing a live camera feed. Photo by Hong Waltzer.


The photo above is from my 2011 performance and was taken by Hong Waltzer. Check out more photos from 2011 here. Below is a live recording of last year’s concert (download mp3 here).

I hope to see you at the show.

Electronic Musician, Boulder CO

My 2012 Go To Virtual Synthesizers


I own a lot of virtual synths :^)  As part of a voluntary simplification exercise I started in January, I’ve been limiting myself to a smaller number of instruments over the last year so I could go deeper and create more expressive and unique signature sounds for compositions and live performance. The image above (click to go to the interactive map, then click branches to learn more about these synths) shows a mindmap of the synths I’ve been most drawn to over the last year. In other words, these are the instruments that consistently make it into my tracks like “And What do the Trees Hear When the Wind Blows”, “Orbiting Miranda”, and “Now is Now Remix”.

When narrowing down to this list, I worked to find a very complementary set of instruments with great workflow. The instruments range in character from pure synthesis instruments (Zebra and Predator), to sample-based instruments (Sampler, Iris), to hybrids (Alchemy, ElectraX), to virtual drum machines (µTonic). The instruments with green dots in front are ones I’ve been spending hundreds of hours with, working to create signature “patches” from scratch that I’ll use in future compositions, productions, and live performances. I should also note that I’m using many of these synths as effects processors, allowing me to capitalize on the investment I made learning the synth workflows (here is a post on this notion).


For those not familiar with some of these synths, check out some audio samples from past sound design experiments. First is a clip with Alchemy (download MP3) where I use granular synthesis to repurpose the field recording of a fluorescent light bulb.
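For readers curious what “granular synthesis” actually does: the engine slices a recording into tiny windowed grains and overlaps them back into a new texture. Here’s a minimal numpy sketch of that idea — nothing to do with Alchemy’s real engine, and the “fluorescent hum” stand-in signal is made up for illustration:

```python
import numpy as np

def granulate(source, sr=44100, grain_ms=80, n_grains=200, seed=0):
    """Naive granular resynthesis: pull short windowed grains from
    random positions in a source recording and overlap-add them.
    A crude sketch of the idea -- real engines add pitch shifting,
    grain-density control, and much more."""
    rng = np.random.default_rng(seed)
    grain_len = int(sr * grain_ms / 1000)
    window = np.hanning(grain_len)          # fade each grain in/out
    hop = grain_len // 2                    # 50% overlap between grains
    out = np.zeros(hop * n_grains + grain_len)
    for i in range(n_grains):
        start = rng.integers(0, len(source) - grain_len)
        out[i * hop:i * hop + grain_len] += source[start:start + grain_len] * window
    peak = np.abs(out).max()
    return out / peak if peak > 0 else out

# Stand-in for a field recording: a buzzy 120 Hz hum with
# harmonics and noise, vaguely like a fluorescent bulb.
sr = 44100
t = np.arange(sr * 2) / sr
hum = (np.sin(2 * np.pi * 120 * t)
       + 0.3 * np.sin(2 * np.pi * 360 * t)
       + 0.1 * np.random.default_rng(1).standard_normal(len(t)))
cloud = granulate(hum, sr=sr)               # the resynthesized "grain cloud"
```

Scrubbing the grain-read position or grain size in real time is where the sound-design fun starts.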

Here is a little behind-the-scenes video on the creation of this patch.

Here is another example where I use Alchemy (download mp3) to repurpose crowd noise from a CU basketball game and a morse code key, and add in something called fractalized waveforms.

Next is a Zebrify patch where I slowly pitch up and then process the incoming signal of a Theremin with two comb filters, with the pitch of the filters being modulated by a step LFO (download mp3).
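The comb-filter trick is easy to sketch in code: a feedback comb resonates at pitch = sample rate / delay length, so stepping the delay with a sample-and-hold LFO steps the perceived pitch. A rough Python sketch of the technique — not Zebrify’s implementation, and the delay values and rates are arbitrary:

```python
import numpy as np

def step_lfo(n_samples, sr, rate_hz, values):
    """Sample-and-hold LFO: steps through `values` at `rate_hz`."""
    step_len = int(sr / rate_hz)
    idx = (np.arange(n_samples) // step_len) % len(values)
    return np.asarray(values)[idx]

def comb_filter(x, delays, feedback=0.85):
    """Feedback comb filter with a per-sample delay length.
    The delay sets the comb's resonant pitch (pitch = sr / delay),
    so stepping the delay with an LFO steps the pitch."""
    y = np.zeros_like(x, dtype=float)
    for n in range(len(x)):
        d = int(delays[n])
        fb = y[n - d] if n >= d else 0.0
        y[n] = x[n] + feedback * fb
    return y

sr = 44100
dur = int(sr * 0.5)
noise = np.random.default_rng(0).standard_normal(dur) * 0.1  # stand-in input
# Step between delay lengths -- a little arpeggio of comb pitches
# (~441, 588, and 882 Hz at 44.1 kHz).
delays = step_lfo(dur, sr, rate_hz=8, values=[100, 75, 50])
wet = comb_filter(noise, delays)
```

Feed it noise or a Theremin-like tone and you’ll hear the resonant pitch hop between the three values eight times a second.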

Next Steps – Deeper with the Top 3

As I go into the fall I’m going to be spending a lot more time with Zebra and Alchemy. They are both extremely deep and very complementary, nicely covering the entire spectrum from pure synthesis to sample mangling. Absynth, which I bought in 2002, is the first virtual synth I ever owned, so it holds a special place in my rig. I’ll be doing some synth work with it as well but will focus heavily on using it as an effects processor.

Which Should You Pick?

If you have limited funds or time and just want to go deep with one synth, you can't go wrong picking one of the three mentioned in the previous paragraph. Again, Zebra is pure synthesis (no samples) and semi-modular. Alchemy is great at resynthesis and sample mangling, so if you are into field recordings it’s your best bet. Absynth is somewhere between the two and is a great pick if you want to work with extreme multi-segment envelopes and very interesting and unusual effects. I give them all 10/10, and the deeper you go, the more you’ll be rewarded.

If you are looking for a fantastic subtractive synth that can also be used as an effects processor, Predator is a great choice. If you want a hybrid with a subtractive workflow and visual feedback, ElectraX is a good bet.

Controllerism with the Top 4

Now that I’ve further narrowed my list, I’m working on templates for various controllers to get even more expressive results with Zebra, Alchemy, Absynth, and Predator. I’m using Alchemy Mobile to control Alchemy on my computer, and I’m working on a custom Lemur template for Zebra and Absynth. I’ll also be working on mappings for my Novation Remote SL and refining my AudioCube patches for these synths.

I’ll leave you with a video I did some time ago showing the use of one Percussa AudioCube face in sensor mode to play a note plus send MIDI CC info to control the XY of Alchemy.

Mark Mosher
Electronic Musician, Boulder CO

How to Trigger Absynth 5 FX Envelopes with MIDI Notes in Ableton Live


In a previous post I mentioned the use of stand-alone FX versions of synths like Zebrify, Absynth FX, and Predator FX as effects processors. Here is a tip for Absynth FX fans who are using Ableton Live and want to process the output of another virtual MIDI instrument and then trigger envelopes in Absynth FX.

MIDI Won’t Pass Through By Default

If you drop Absynth FX into a MIDI track that already has a synth in it – say ZebraHZ – the audio will be processed as you’d expect via the patch settings. MIDI notes, however, are not passed through to Absynth FX, which means the envelopes won’t be triggered.

Ableton Routing to the Rescue

  1. Drop Absynth FX into an “A Return” track.
  2. Route the ZebraHZ audio to the return with the “Send A” knob. I set the track to “Sends Only” so I only hear audio routed through Absynth FX.
  3. Create a MIDI track in Live and route its MIDI to the “A Absynth 5 FX” return. (Click the image above to see the screenshot full screen.)
  4. At this point you can arm both the MIDI track and the ZebraHZ track and play in real time, and notes will go to Absynth FX as well.
  5. Instead, I created a MIDI clip with a repeating note pattern that triggers the Absynth FX envelope while I play different notes in ZebraHZ.
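Conceptually, the routing above just fans one MIDI note stream out to two destinations – the synth and the FX envelope trigger. A toy Python sketch of that idea (illustrative only; these names are mine, not Live’s API):

```python
def fan_out(notes, destinations):
    """Deliver each note event to every destination handler --
    what Live's track routing does for us in the steps above."""
    for note in notes:
        for dest in destinations:
            dest(note)

# Two pretend destinations: the instrument and the FX envelope trigger.
synth_notes, envelope_triggers = [], []

# A repeating one-note pattern, like the MIDI clip in step 5:
# (beat, MIDI note number).
clip = [(beat, 60) for beat in range(4)]

fan_out(clip, [synth_notes.append, envelope_triggers.append])
# Both destinations now see the identical stream of four notes.
```

In Live the “fan-out” is the combination of the audio send (step 2) and the MIDI routing to the return (step 3).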

Ok, give it a go and expect crazy and interesting results from Absynth FX’s epic effects and monster envelopes.

Mark Mosher
Electronic Musician, Boulder CO

Working on Lemur Interface for U-HE Zebra


In recent posts I mentioned I was experimenting with the iPad with Alchemy, Slim Phatty, and iTnri. The Dark Zebra (ZebraHZ) release got me back into synthesis with Zebra in a big way. As I dug into the patches by Hans Zimmer and Howard Scarr, I discovered they not only map the XYs but also Aftertouch and Breath Controller. In considering how to map these, I bumped into an Audio Newsroom interview with Zebra creator Urs Heckmann called “Off-the-record: Urs Heckmann (u-he)”. They asked:

About Zebra, in the past I remember you told me there was a remote chance to see an hardware product out of it. Did the idea evolve somehow and are you still interested in it?
I'm all for it but I can't do it alone. I've spoken to some people but I found nothing yet appealing enough to take a risk. I'd love to provide the industrial design for a zebraesque controller keyboard with 4 joysticks though.

While I think a dedicated hardware controller would be epic, it seems unlikely any time soon – so I started working up an interface on Lemur with 4 XYs and thought I’d share the photo above.

Lemur Template Notes

I started with the “iPad – Studio Combo” template and:

  1. Scaled the keys and octave button down to make room for more objects
  2. In addition to Mod Wheel and Pitch Bend, I added sliders to the right for Aftertouch and Breath Controller
  3. Along the top you can see 4 XYs (Multiball objects). Below the first one you can see 3 horizontal sliders that I’m going to map to the physics params for the XYs: Friction, Attraction, and Speed.
  4. I added 2 ADSRs (red and green) which I map to Envelope 1 & 2 in Zebra.
  5. Top right I have two CustomButton objects. The first is mapped to a note so I can trigger a drone or arpeggio (Zebra doesn’t have a “hold” on the arp). The second is mapped to MIDI sustain and toggles sustain on and off.
  6. Just to the right of the last XY there is a set of horizontal bars. I don’t have these quite working right but they are for the “VoiceMode” which will allow me to turn the arp on and off.

MIDI Mapping Notes

To map the Lemur to Zebra I use MIDI Learn: right-click the target control in Zebra, then change the value of the corresponding object on the Lemur. This saves the mappings to the com.u-he.Zebra2.midiassign.txt file, so they work even after you close and re-open the host or add Zebra to a new set. Note that this file is shared between Zebra 2.5 and ZebraHZ, so once you map the Lemur it works for both – a real time saver. You can edit this file in a text editor (back it up before you mess with it) and delete it if you want to reset the mappings (it will be re-created after you re-launch your host and start Zebra).
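If you do hand-edit the file, a tiny script can handle the backup (and optional reset) for you. Here’s a sketch – the file’s folder varies by OS and install, so pass the full path in, and the function name is mine:

```python
import shutil
from pathlib import Path

def backup_midi_assignments(path, reset=False):
    """Copy com.u-he.Zebra2.midiassign.txt to a .bak file next to it.
    If reset is True, also delete the original so Zebra re-creates a
    fresh (empty) mapping file the next time the host launches it."""
    path = Path(path)
    if not path.exists():
        return None                      # nothing to back up
    backup = Path(str(path) + ".bak")
    shutil.copy2(path, backup)           # keep timestamps/permissions
    if reset:
        path.unlink()
    return backup

# Example (path is illustrative -- use wherever your install keeps it):
# backup_midi_assignments("com.u-he.Zebra2.midiassign.txt", reset=True)
```

Run it before any hand edit and you always have a one-file undo.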

In Use

Like my experience with Alchemy Mobile as a controller for Alchemy VST, using Lemur running wirelessly on the iPad to drive Zebra is really captivating. It’s very organic, and the ability to add physics with, say, XY ball movement takes performance to the next level.

As a result of my great experience with these controller interfaces, I find myself using the iPad more and more in the role of controller instead of iOS synth platform.

Moral of the Story

iPad interfaces to PC synths make me use the PC synths more :^)

Mark Mosher
Electronic Musician, Boulder CO