All posts tagged ITP

Gloves Video Controller

In June, six of us at NYU’s ITP Camp followed The Gloves Project’s patterns to build our own gloves. These are sensor-laden gloves that can be used to control software through hand gestures. Our group included musicians, a theatrical sound designer, a gamer, and visualists, each with different uses for the glove in mind.

To get an idea of how it can be used with video in a live setting, take a look at this test clip, where I use hand movement to wirelessly control video playback and effects.

Here, sensor values from the glove are sent via Bluetooth to a decoder patch written in Max, and then out as MIDI controller data to the VJ software VDMX. It works!

Gloves have been used as controllers in live performance for some time — see Laetitia Sonami’s Lady’s Glove, for example. Our particular design is based on one created for Imogen Heap to use as an Ableton Live controller, so she can get out from behind a computer or keyboard and closer to the audience. She gives a great explanation and demonstration in this WIRED talk (the musical performance starts at 13:30).

Heap and The Gloves Project team are into sharing the artistic possibilities of this device with others, as well as increasing the transparency of a musical process that can be obscured inside a computer. This is an attitude I’ve believed in since attending Maker Faire and Blip Festival in 2009, where I saw a range of homemade controllers and instruments. I was much more engaged with the artists who made the causal process visible. It doesn’t have to be all spelled out, but in certain cases it helps to see the components: the performer is making the things happen. This is obvious with a guitar player, but not so much with electronic music. Also, you get a different creative result by moving your arms than by pressing a button — a violin is different from a piano.

The Gloves Project has a residency program where they’ll loan artists a pair of gloves, plus DIY plans for an Open Source Hardware version. The six of us at ITP Camp built one right-hand glove each. We had to do a bit of deciphering to figure everything out, but we had a range of skills between us and got there in the end.

Each glove has six flex sensors in the fingers (the thumb and ring finger have one each; the index and middle fingers have two each, on the upper and lower knuckles). These are essentially variable resistors: the more they bend, the less electricity passes through, and that can be measured and turned into a number. The sensors run to a tiny programmable ArduIMU+ v3 board by DIYDrones, which runs Arduino code and includes a built-in gyroscope, accelerometer, and magnetometer (a compass), plus a port for a GPS unit for navigation. The board is mostly used in flying things like small self-guided airplanes, but it also works for motion capture. We make a wireless serial connection to the computer through a Bluetooth module.
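As a taste, reading one sensor on the Arduino side looks roughly like this. This is a minimal sketch assuming the sensor is wired as a voltage divider into analog pin A0; our actual glove code reads all six sensors and adds the IMU data:

void setup() {
  Serial.begin(57600);        // the serial port that the Bluetooth module bridges
}

void loop() {
  int flex = analogRead(A0);  // 0-1023; shifts as the sensor bends
  Serial.println(flex);
  delay(50);                  // roughly 20 readings per second
}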

Here’s a wiring guide that we drew up.

We had more trouble with the software side of things. The Gloves Project’s design is meant to communicate with their Glover software, written in C++ by Tom Mitchell. There are instructions on the website, but we couldn’t reach anyone to actually get a copy of the program. In the end, we copied the flex sensor sections of Seb Madgwick’s ArduIMU code and used them to modify the stock ArduIMU v3 code. That delivered a stream of numbers, but we still had to figure out how to turn it into something we could use.

We formatted the output sensor data like this:

Serial.println("THUMB:");
Serial.println(analogRead(A0));
Serial.println("INDEXLOW:");
Serial.println(analogRead(A1));
Serial.println("INDEXUP:");
Serial.println(analogRead(A2));

…and so on. I then programmed a patch in Max to sort it out.

Details:

When one of the sensor names comes through, Max routes it to a specific switch: the switch opens, lets the next line through (the data for that sensor), and then closes again. Data goes where we want it, and garbage is ignored.
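If you think in code rather than patch cords, here’s a rough C++ sketch of that routing logic (an illustration of the idea, not the actual Max patch):

#include <cstdlib>
#include <iostream>
#include <map>
#include <string>

int main() {
    std::map<std::string, int> latest;  // most recent value per sensor
    std::string line, pending;          // pending = label waiting for its value

    while (std::getline(std::cin, line)) {
        if (!line.empty() && line.back() == ':') {
            pending = line.substr(0, line.size() - 1);  // "THUMB:" opens the switch
        } else if (!pending.empty() && !line.empty()) {
            latest[pending] = std::atoi(line.c_str());  // next line is that sensor's data
            pending.clear();                            // close the switch
        }                                               // anything else is ignored
    }
}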

Every glove and person is slightly different, so next the glove is calibrated. Max looks for the highest and lowest number coming in, and then scales that to the range of a MIDI slider: 0 to 127. When you first start the decoder, you move your hand around as much as you can and voilĂ ! It’s set.

I made the default starting point for flex sensor data 400 rather than 0: the readings never actually drop to 0 (so a starting point of 0 would leave the bottom of the range stuck too low), while the peak always climbs above 400, so both ends of the range get updated as soon as you move. The starting point for movement data is 0. There’s also a “slide” object that smooths movement so it doesn’t jump all over the place while still being fairly responsive.
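Here’s a minimal C++ sketch of those two stages, calibration and smoothing, assuming one state struct per sensor (the patch does this with Max objects; the names below are mine):

struct Channel {
    float lo = 400, hi = 400;  // flex min and max both start at 400 (see above)
    float smoothed = 0;
};

// Learn the range as the hand moves, then scale into MIDI's 0-127.
int calibrate(Channel &c, float raw) {
    if (raw < c.lo) c.lo = raw;
    if (raw > c.hi) c.hi = raw;
    if (c.hi <= c.lo) return 0;  // no real range seen yet
    return (int)((raw - c.lo) * 127.0f / (c.hi - c.lo));
}

// Rough equivalent of Max's slide object: ease toward the input so the
// output stays smooth but responsive. A bigger factor means smoother.
float slide(Channel &c, float in, float factor) {
    c.smoothed += (in - c.smoothed) / factor;
    return c.smoothed;
}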

The number is now sent through a Max “send” object with a different name than the raw sensor data. If you’re keeping everything inside Max, you can just set up a corresponding “receive” object.

Otherwise, it gets turned into a MIDI control or note value, and sent out through a local MIDI device or over a network.
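A MIDI Control Change message is only three bytes, so the conversion amounts to something like this sketch (the channel and controller numbers are arbitrary examples):

#include <cstdint>

// Pack a Control Change message: status byte, controller number, value.
void controlChange(uint8_t channel, uint8_t controller, uint8_t value,
                   uint8_t out[3]) {
    out[0] = 0xB0 | (channel & 0x0F);  // CC status on MIDI channels 0-15
    out[1] = controller & 0x7F;        // e.g. CC 20 for the thumb sensor
    out[2] = value & 0x7F;             // the calibrated 0-127 reading
}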

Finally, I tidied everything up so it’s usable in presentation mode. Anyone can download the patch and run it in Max Runtime (free).

There are probably more efficient ways of doing this, but it’s our first pass to get things working.

To download all our code, visit https://github.com/timpear/ITP-Gloves/

Since finishing that, I’ve discovered that The Gloves Project has released a whole range of decoders / bridges in various languages. Their ArduIMU code does lots of clever deciphering on the glove end of things, and the bridges primarily output OSC instead of MIDI, which is handy. Beyond that, The Gloves Project continues to develop new versions of the gloves and is worth checking in on.

Our decoder simply translates the raw sensor data. The next step is to get it to recognize hand gestures, and trigger specific events or adjust values based on that (which is what the Glover software does). We also need to program the glove’s RGB LED and vibration motor for feedback from the computer.
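As a first stab, that recognition could be as simple as thresholding the calibrated values. A hypothetical sketch, not how Glover actually works:

// Hypothetical: called with the six calibrated 0-127 flex values.
// A closed fist = every sensor bent past a threshold; trigger only on
// the transition so the event fires once, not continuously.
bool fistDetected(const int flex[6], bool &wasFist) {
    bool isFist = true;
    for (int i = 0; i < 6; i++) {
        if (flex[i] < 100) isFist = false;  // 100 of 127 = mostly bent
    }
    bool trigger = isFist && !wasFist;      // rising edge only
    wasFist = isFist;
    return trigger;
}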

I showed this project to Karl Ward (rock star, Ghost Ghost collaborator, master’s student at ITP), and it turns out he’s currently working on an Arduino library to do a lot of this work, only more elegantly, on the controller itself. The first library is Filter, which he augmented over the summer to require another new library he wrote, called DataStream. He says: “They are both in usable, tested shape, but the API is still in flux. Right now I’m looking for folks who have Arduino code that does its own filtering, or needs filtering, so I can design the API to fit the most common cases out there.” We’re going to jam.

The glove has all sorts of possible artistic applications, but what else? When I showed it to my dad, he wondered if it could be used as a translator for sign language. Brilliant. It sounds like Microsoft is currently developing software for the Xbox One and new Kinect that will do this, although one advantage of a wearable controller in any case is the ability to get away from a computer (within wireless range). One of the people on our team is going to use it to adjust audio signals while installing sound in theaters. Easier than holding a tablet at the top of a ladder.

Another friend suggested that the glove as demonstrated here could be used for art therapy by people with limited movement. I imagine that something similar is in use out there, but the open-source aspect adds another level of customization and possibility, and again, transparency.

I’m looking to experiment with adjusting specific elements of a video clip with something more organic than a slider or knob, and also be able to interact more directly with a projection. I’ve worked with painter Charlie Kemmerer, creating hybrid painting-projections during Ghost Ghost shows. Charlie works on the canvas with a brush, but even standing beside him, I have to work on an iPad at best. Now I can point directly at the surface while selecting, adjusting, and repositioning clips. Or Charlie could wear it while painting to capture his movement, without it getting in the way of holding a brush.

Creative work reflects the nature of your instrument, so it’s exciting to expand the toolset and learn more about the medium. Video A-B fades are pretty straightforward, but the way the IMU responds isn’t nearly as predictable as a fader on a board, and I’ve gotten some unexpected results. That’s a good thing.

Even better, I can’t wait to see what other people with these gloves come up with. Tinker, modify, share.

Asteroids Controller with Raspberry Pi

The 1979 arcade version of Atari Asteroids is a beautiful thing.

A white triangle battles lethal polygons on the glowing phosphors of a vector display. The two-note pulse of the music is syncopated by the “pew pew pew” of your effort to stay alive. The controls are as minimal as the graphics, and yet the game itself is complex, never the same twice.

I am a fan.

Years ago, I wired up parts from a scrapped Asteroids machine to control a computer via USB. It’s a beast. So when I had access to the resources at NYU’s ITP Camp, I decided to learn some new tools and make a refined desktop version. This one is designed for a tiny Raspberry Pi computer running an arcade machine emulator (MAME), so all you need to provide is power in and audio/video out.

Image: the assembled Asteroids Pi

I’ve included links to design files throughout this page — be sure to open the PDFs in a program that can handle layers. Or, here’s a zipped folder with all the files in one place:

DesktopAsteroidsPlans.zip

The Design

First up, I made vector artwork of the Atari Asteroids control panel by tracing a photo in Illustrator. (I’d actually done this a few years ago for a different project).

Images: measuring the panel; the Asteroids control art

asteroids_controls.pdf (editable PDF)
asteroids_controls_cs3.ai (Illustrator file)

I decided how small the face could be and still have a comfortable layout, and how tall the controller needed to be to house all the components. From there I sketched up the basic design, and started fabrication with the frame.

The Frame

The acrylic faceplate wraps around 3/4″ plywood end caps, which are connected with two 1.5″ x .5″ plywood rails.

The end pieces have 3/8″ grooves for the acrylic and rails. The CNC machine (robot miller) carved these first, before making a deeper pass to cut the outer shape.

CNC_asteroids_sides.pdf


That done, it needed some good old cabinet art. I made a black and white vector image from the original arcade marquee, and etched that with the laser cutter in raster mode.

asteroidsmarquee_BW_vectors.svg
LASER_asteroids_sideart.pdf

Image link: marquee art
Image link: screen bezel art

Images: the marquee and the etched side art

Finally, I carved out the rails with a table-mounted router. Nothing computer-driven here, but I did make a diagram. The slots are for removable acrylic shelves.

SHOP_asteroids_rails

The Faceplate

I used 1/4″ clear acrylic for the faceplate to show off the components. The control panel file has layers for raster etching (the artwork) and vector cutting (edges and button holes) with a laser cutter. The plans are flopped (mirrored) so the art ends up etched on the inside.

There are also three shelves that slide into the grooves in the wood. The smaller ones have mounting holes for a breakout board and a terminal.

LASER_asteroids_faceplate.pdf
LASER_asteroids_shelves.pdf


Image: the laser-cut acrylic

ITP has a large strip heater for bending acrylic. I heated up the laser-cut piece along one seam, holding it just above the heater. I flipped it over every minute or so to heat it evenly and prevent warping or blistering.

Image: bending the acrylic on the strip heater

Eventually it was soft enough to flex slowly. I put the shorter edge on the table and, holding the sheet evenly, pressed it forward. Holding a piece of wood along the seam helped start the crease straight. I had to re-heat the fold throughout the process, using the strip heater on the outside and a heat gun along the inside.

I matched the folds to the curve of the plywood end pieces. There was eyeballing involved. It’s not perfect but it’s secure, shiny, and strong.

Images: the bent acrylic faceplate and the assembled frame

Assembly and Wiring

I finished the wood with Briwax and glued it all together. The faceplate’s curved shape is enough to hold it in place, but it’s still easy to remove by lifting up on the front edge.

I wired up the buttons with quick disconnects and it looked like this:

Images: the wired controls

The buttons are from an original Asteroids arcade machine, so they’re the old leaf-switch type. You can’t get much simpler: a spring-loaded plunger presses two pieces of metal together. This was standard in arcade games until the mid-1980s, when leaf switches were replaced with micro-switches. The leaf switch is easier for rapid fire and finesse. The micro-switch is more compact and goes “click.”


I scrounged my old buttons at Vintage Arcade Superstore in Glendale, California. The leaf switch buttons were meant for a thin metal plate, and just barely fit around the 1/4″ acrylic. A major supplier of new parts is Suzo-Happ.

I also got two original cone-shaped Atari start buttons. They’re supposed to blink when you insert a coin, but the old parts box I went through didn’t have any with working lights.

A micro-switch inside a hole on the front signals that a coin’s been inserted.


I wired the buttons to a small Perma-Proto Raspberry Pi Breadboard from Adafruit (details below), which was mounted to the acrylic shelf with 1/4″ standoffs. I also ran wires to a terminal on the other end, in case I ever want to attach the buttons to something other than a Raspberry Pi.


When it’s all put together, the Raspberry Pi computer sits on the large center shelf, with access for video, audio, and power cables running out the back.


Raspberry Pi

The Raspberry Pi is a tiny, inexpensive ($30 or $40, depending on the model), project-friendly Linux computer that came out in early 2012.

Adafruit has an excellent tutorial on setting up a Raspberry Pi to play classic arcade games, written by Phillip Burgess. It leads you through setting up a RasPi from scratch, installing MAME (Multiple Arcade Machine Emulator), and getting arcade buttons to work.

I modified the button wiring plan to suit the Asteroids controls. The buttons are wired to the RasPi like this:

Raspberry Pi pin, Asteroids function
GPIO 17, Hyper-Space
GPIO 10 (MOSI), Thrust
GPIO 9 (MISO), Rotate right
GPIO 23, Fire
GPIO 24, Insert coin
GPIO 25, Rotate left
GPIO 4, Two Players
GPIO 7 (CE1), One Player
GND, ground rail

And I changed the appropriate lines of the “retrogame” utility code to this:

struct {
        int pin;
        int key;
} io[] = {
//        Input    Output (from /usr/include/linux/input.h)

        {  17,      KEY_SPACE    },   // Hyper-Space
        {  10,      KEY_LEFTALT  },   // Thrust
        {   9,      KEY_RIGHT    },   // Rotate right
        {  23,      KEY_LEFTCTRL },   // Fire
        {  24,      KEY_5        },   // Insert coin
        {  25,      KEY_LEFT     },   // Rotate left
        {   4,      KEY_2        },   // Two Players
        {   7,      KEY_1        }    // One Player
};

Instructions for doing this are in the tutorial.

At some point I might add two more internal switches to trigger KEY_ESC and KEY_ENTER, to quit and shutdown (using an alias/shortcut) with just the controller.
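That would just be two more rows in the io[] table, something like this (the pin numbers here are hypothetical; any free GPIOs would do):

        {  22,      KEY_ESC      },   // hypothetical pin: quit MAME
        {  27,      KEY_ENTER    }    // hypothetical pin: confirm the shutdown alias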

I had a bit of trouble getting the MAME ROM (game file) to work. In the end, I had to unzip the Asteroids Rev. 2 folder, change the file extensions so the names matched the missing file names MAME asked for, re-compress the folder, and put that zipped file into the roms folder on the RasPi. MAME complains that two of the files are funky, but the game seems to run fine.

Finally, once everything was working, I set the RasPi to boot directly into Mame4All when it’s turned on. Instructions are here. Since everything is stored on a cheap SD card, I have one dedicated to Asteroids.

Hello World, Goodbye Asteroid

It works!

I mainly built this to learn some new tools. Maybe next I’ll learn Python and write a new game for five buttons, or maybe I’ll use it to run live visuals.

As for now, it’s time to break out a projector and throw a rooftop Asteroids party.

For more Asteroids love, check out AtariAsteroids.net. (I run the site. Like I said, I’m a fan).

Talking Opera at ITP

I’ve been getting my hands dirty at ITP Camp. NYU Tisch School of the Arts’ Interactive Telecommunications Program is a two-year grad program focused on technology in the arts, and Camp is where they let working professionals crash the party for the month of June.

There was a focus on many of the tools I used in Lotus Lives — Max, VDMX, MadMapper, After Effects, laser cutters, etc. Technical workshops are useful, but I always appreciate hearing stories of real-world application. So I gave a presentation about bringing everything together in an actual performance.

The fun part was breaking out the 1:24-scale model of the concert hall where the premiere performance was staged. I used it during development to help visualize how the projections would fill the space — back then I projected rough versions of the video, but this time I projected the final elements, including a recording of the musicians on stage.


I also covered:

– Designing a concept that would be appropriate to the story and feasible with our resources.
– Creating a playback system that could adapt to the performers in a changing, live situation.
– Designing the set for the video, and vice versa.
– Shooting the content: gathering images on location in Malaysia, designing and building shadow puppets (with lasers), and collaborating with dancers.
– Editing and compositing the content.
– Prepping the video for mapping.
– Designing the playback for six projectors, and making it as fail-safe as possible for a live performance.

It’s the first time I’ve covered the breadth of the project at once. I’ve already written up a post on the playback system here, and will cover other elements when I get a chance.