
BAN THE REWIND

My name is Stephen Schieberl. I am a Creative Technology Lead at Wieden+Kennedy in Portland, Oregon. I build everything from large format installations to experimental interactive art to database-driven web sites and more.


UPDATED 2011-12-16: Follow this link for the next generation of Radial: http://www.bantherewind.com/multi-touch-live-pa

I make electronic dance music as "Let's Go Outside". I also work at Fashionbuddha, developing everything from Flash sites to public installation pieces. In early 2010, I decided to combine my passions for music and interactivity into a live setup suited specifically to my production and performance style. While awaiting the advent of affordable, desktop-power multi-touch computers, I had to put together the hardware to go with the software.

Pictured: Let's Go Outside performing live with Radial in visible light mode at a boat party in Seattle.

SOFTWARE

At Fashionbuddha, I'd recently developed Magnetic, a powerful base application built on openFrameworks and OpenCV that does some nice computer vision sorcery. The performance and accuracy I was getting out of it were astounding, and I immediately started thinking of ways to build a loop-based music layer on top of it. Thus was born the forever-tentatively titled "Radial".

The concept is simple. There are three rings on the screen. The outer ring represents the volume range; anything outside it is silent. The middle ring is 100% volume. Moving a sound from the outer ring to the middle one is akin to sliding a volume fader from 0 to 10. Continuing from the middle ring to the center fades the volume back down to 0% and removes the sound. Dragging a sound from the outer ring to the center in a straight line would make it go from silent to full volume, back to silent, and then be removed from the stage.

Sounds are loaded with a key press -- this prevents the occasional false blob from inadvertently loading a scene. The set data is stored in a very basic XML file. Radial maintains a master clock, indicated by a "doppler"-type visual in the rings, which loops every sixteen notes (or however long the XML specifies). Each sound also has its own "doppler" to represent its progress.
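The ring-to-volume mapping can be sketched as a simple function of a sound's distance from the center. This is a hypothetical illustration in Python (Radial itself is built on openFrameworks in C++), with made-up names and radii, not the actual code:

```python
import math

def ring_volume(x, y, cx, cy, outer_r, middle_r):
    """Map a sound's position to a volume level, per the ring concept.

    Outside the outer ring the sound is silent. Moving inward from the
    outer ring to the middle ring fades volume from 0.0 up to 1.0;
    continuing from the middle ring toward the center fades back down
    to 0.0. Reaching the center removes the sound, signaled with None.
    """
    d = math.hypot(x - cx, y - cy)
    if d >= outer_r:
        return 0.0                  # outside the outer ring: silent
    if d <= 1e-6:
        return None                 # at the center: remove the sound
    if d >= middle_r:
        # between the middle and outer rings: fade 1.0 -> 0.0 outward
        return (outer_r - d) / (outer_r - middle_r)
    # between the center and middle ring: fade 0.0 -> 1.0 outward
    return d / middle_r
```

Dragging a sound in a straight line from the outer ring through the center then traces volume 0 to 1 and back to 0 before removal, matching the behavior described above.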
By getting these sounds out of the traditional "grid" layout, I can crossfade elements of two or more songs, rather than cut one off to replace another as I'd have to do with other software. If there's a sound I really like, I can leave it on stage even after I'm finished performing the song it belongs to. Collision detection between objects lets me replace one sound while introducing another, pushing the old sound out of the way with the new one.

This approach removes the constraint of channel-count limits. I can load a track with twenty channels and not worry about being unable to perform it on an eight-channel controller. I don't have to associate clips on a computer screen with faders and buttons on a separate hardware controller, or deal with the latency intrinsic to such a setup. A single class receives the touch data, renders the channel information to the screen, and plays the audio. There is no latency beyond the slight lag of the camera driver, so there is a feeling of a real connection to the audio.

Pictured: Let's Go Outside performing live at EMP on Radial

HARDWARE

Before Radial was even a thought in my mind, I'd developed a multi-touch MIDI controller -- first in Flash, and then a far better version in C#.NET/XNA. However, I lacked a practical portable device on which to use my software. I dreamed up several designs before deciding on an LCD-LLP setup. I picked up a 22" LCD monitor online and got in touch with Brendan Budge, a brilliant (post-)industrial designer and builder who made our FTIR table at Fashionbuddha.

I ripped all the plastic off the monitor, boiling it down to little more than the screen itself and about 1" - 2" of metal and guts. Brendan built a tough and beautiful cedar frame that holds the screen at an angle. He even threw in some ground FX -- a strip of LEDs that gives the floor of the device a subtle red glow. The top corners of the monitor each feature a cylinder-shaped slot to house a 780nm laser.
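As a software aside before the laser details: the "push the old sound out of the way" collision behavior described earlier can be sketched as circle overlap resolution between two equal-radius sound objects. This is a hypothetical Python illustration, not Radial's actual code:

```python
import math

def push_apart(ax, ay, bx, by, radius):
    """If the new sound (a) overlaps the old one (b), push b out of
    the way along the line between their centers -- the 'replace by
    pushing' behavior. Returns b's new position."""
    dx, dy = bx - ax, by - ay
    d = math.hypot(dx, dy)
    if d >= 2 * radius:
        return (bx, by)             # no overlap: nothing to do
    if d == 0:
        # coincident centers: pick an arbitrary push direction
        return (bx + 2 * radius, by)
    overlap = 2 * radius - d
    return (bx + dx / d * overlap, by + dy / d * overlap)
```

Running this each frame while dragging a new sound makes the old one slide away until the two circles no longer touch.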
Thumbscrews clamp the lasers in position. The lasers are fitted with line generators, which spread each beam into a plane of light across the surface. Touching the screen breaks the plane, and the reflected light is picked up by a PS3 camera modified to see only this bandwidth of light. My software isolates the reflections into "blob" shapes that are interpreted as touch data. The wires from the lasers are fed through tiny holes so the power connection happens under the hood.

The camera mount is a creative piece of hardware of Brendan's own design. A ball bearing with a threaded hole is attached to the frame, and an 8" pipe is screwed into it. At the other end of the pipe is another ball bearing with a metal plate onto which I mount the camera. The mount essentially has two joints and looks like a steampunk-style robot arm; it is secured into place with an Allen wrench. I've done one super-accurate calibration for this device, and each time I set it up, I just adjust the camera position to line the blobs up with my fingers. When traveling, I loosen the arm and lay it flush against the screen.

The laser beams reach the opposite side of the frame, producing an outline in the camera's view that breaks when my hand crosses over it, resulting in an unintended blob. I had Brendan build a metal, C-shaped "border" that I can place over the screen to hide this reflected light from the camera's view. We also theorized that the bass and dancefloor vibrations in some settings could shake the camera too much, so there's an optional, slick-looking steel-and-chain "hammock" in which the device can be suspended. It absorbs the shock well, but has been unnecessary to date. The whole device weighs about fifteen pounds assembled and has already traveled in a standard suitcase for a gig. This thing is built tough, so it travels well.

VISIBLE LIGHT MODE

After a few gigs on this thing, I came to the realization that lasers can be a real hassle.
Twice they've burned out during a set. I'd already developed a "peak tracking" mode for Magnetic, which I was using to identify limbs on people in visible light. Why not use it on my hands? I picked up a second PS3 camera and a wide-angle lens, this time with an infrared lowpass filter to capture only visible light. I added a hotkey to Radial that lets me switch between blob and peak tracking modes. In peak tracking, or "visible light", mode, Radial's background turns white, turning my hands into silhouettes. This makes it easier for Magnetic to pick out my fingertips and turn them into controls. It's a bit trickier to operate Radial like this, but far more reliable.
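One way to picture peak tracking on a silhouette: treat fingertips as the contour points that stick out furthest from the hand's centroid. This is a deliberately simplified, hypothetical Python sketch; Magnetic's actual implementation is OpenCV-based and far more robust:

```python
import math

def find_peaks(contour, min_dist=0.0):
    """Return indices of contour points that are local maxima of
    distance from the contour's centroid -- a crude stand-in for
    fingertip "peak" detection on a hand silhouette.

    contour: list of (x, y) points, ordered around the shape.
    min_dist: ignore peaks closer to the centroid than this.
    """
    n = len(contour)
    cx = sum(p[0] for p in contour) / n
    cy = sum(p[1] for p in contour) / n
    dist = [math.hypot(x - cx, y - cy) for x, y in contour]
    peaks = []
    for i in range(n):
        prev_d, next_d = dist[i - 1], dist[(i + 1) % n]
        if dist[i] > prev_d and dist[i] > next_d and dist[i] >= min_dist:
            peaks.append(i)
    return peaks
```

On a spiky, star-like contour this picks out the tips and ignores the valleys between them, which is roughly what a white background and silhouetted hands make easy for the vision layer to do.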