Talking Opera at ITP

I’ve been getting my hands dirty at ITP Camp. NYU Tisch School of the Arts’ Interactive Telecommunications Program is a two-year grad program focused on technology in the arts, and Camp is where they let working professionals crash the party for the month of June.

There was a focus on many of the tools I used in Lotus Lives — Max, VDMX, MadMapper, After Effects, laser cutters, etc. Technical workshops are useful, but I always appreciate hearing stories of real-world application. So I gave a presentation about bringing everything together in an actual performance.

The fun part was breaking out the 1:24 scale model of the concert hall where the premiere performance was staged. I used it during development to help visualize how the projections would fill the space — I’d projected rough versions of the video then, but this time I projected the final elements, including a recording of the musicians on stage.


I also covered:

– Designing a concept that would be appropriate to the story and feasible with our resources.
– Creating a playback system that could adapt to the performers in a changing, live situation.
– Designing the set for the video, and vice versa.
– Shooting the content: gathering images on location in Malaysia, designing and building shadow puppets (with lasers), and collaborating with dancers.
– Editing and compositing the content.
– Prepping the video for mapping.
– Designing the playback for six projectors, and making it as fail-safe as possible for a live performance.

It’s the first time I’ve covered the breadth of the project at once. I’ve already written up a post on the playback system here, and will cover other elements when I get a chance.

Ross and Cesar Get Married

For their wedding at Littlefield in Brooklyn last month, Cesar and Ross wanted a DIY affair that would involve their friends and families. We came up with the idea of making a documentary-style video snapshot of their life in New York, to be projected on walls throughout the venue that night.


We shot footage of the grooms walking their dog, hanging out at home, riding the subway and bus, getting haircuts, eating at Mission Chinese, hanging out with friends in bars, etc. They’re Instagram fiends, so I had a blast coloring the footage to give it that social media snapshot look.

At the venue, I went for the smallest footprint possible, so packing up at the end of the night would be quick. The main movie was looped on an outside wall in the venue’s entrance courtyard. There is a window facing the wall, so I mounted the projector inside pointing out. For this, I used one of my home-made mounts, plus a mafer clamp, 20″ arm, and knuckle, clamped onto the metal window frame above head level.


I taped a Roku onto the mount, and created an .m4v file with Handbrake to play back off a USB drive. The Roku plays only a limited number of file types, and that’s one of them. Its USB player also doesn’t have a loop function, so I made a version of the 40-minute movie that plays twice, followed by 15 minutes of black with “CESAR & ROSS” text at the end. I just hit “back” and “play” on the remote every hour and a half. I’m going to pick up a Raspberry Pi to do this better. [UPDATE 5/16/14: I got this running.]
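The Raspberry Pi replacement I had in mind is simple enough to sketch: relaunch a command-line player each time it exits, so the file loops unattended with no remote-control babysitting. The player name and path below are assumptions, not my actual setup.

```python
import subprocess

def loop_movie(path, player="omxplayer", times=None):
    """Play `path` on repeat by relaunching the player; `times=None` loops forever."""
    count = 0
    while times is None or count < times:
        subprocess.run([player, path])  # blocks until the movie finishes
        count += 1
    return count

# e.g. loop_movie("/media/usb/wedding.m4v")  # hypothetical path; runs until killed
```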


Inside, I had another projector running from my MacBook Pro, up in the sound booth. From there I could fill the entire side wall of the dining/dancing part of the venue. The wall is a black rubberized cork material, not too shiny and not too light-absorbent. With a bright enough projector (mine’s 3,000 lumens), projecting on a dark surface like that gives you high contrast and looks great.

I made both 1280×720 and 854×480 versions of the movies and loops, choosing between them depending on the final projection size. If a particular clip gets shrunk down on the canvas during playback, there’s no need to waste pixels with a large file. But if it’s filling the screen, I can go for quality. I also compressed everything with VidVox’s new Hap codec, which is open-source and decodes on the graphics card. Looked great, played great.
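The pick-the-file logic amounts to a one-line rule (the 854-pixel threshold is just the smaller file’s native width; the function name is mine):

```python
def pick_source(projected_width_px):
    """Choose the smallest source file that still covers the projected width.

    At or below 854 projected pixels, the 854x480 file already supplies one
    source pixel per output pixel; only a near-full-screen clip justifies
    decoding the larger 1280x720 file.
    """
    return (854, 480) if projected_width_px <= 854 else (1280, 720)

print(pick_source(400))   # small quad on the canvas -> (854, 480)
print(pick_source(1280))  # full-screen clip -> (1280, 720)
```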


In VDMX (my VJ software of choice), I made a canvas that was 1708×960, divided into four quads of layers scaled down 50% (the native size of the 854×480 media), including some that were doubled up for live blending. This went out through Syphon to MadMapper, where I mapped the quads to different parts of the wall. Quads 1 and 2 were the full 40-minute movie with staggered start times, shrunk to two 4-foot-wide rectangles on the wall. Behind them was Quad 3, a giant wall-sized projection; during dinner this was blank or showed dim, abstract patterns (subway cars passing in the tunnels, blurry street scenes at night…). Quad 4 was just for text (“CESAR & ROSS”) projected in a few places on the wall.
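The canvas math works out neatly: 1708×960 is exactly a 2×2 grid of 854×480 quads. A quick sketch of that layout (the quad numbering here is my assumption):

```python
CANVAS_W, CANVAS_H = 1708, 960
QUAD_W, QUAD_H = 854, 480  # native size of the smaller media files

def quad_rect(n):
    """(x, y, w, h) of quad n in 1..4, numbered left-to-right, top-to-bottom."""
    col, row = (n - 1) % 2, (n - 1) // 2
    return (col * QUAD_W, row * QUAD_H, QUAD_W, QUAD_H)

# the grid tiles the canvas with nothing left over
assert 2 * QUAD_W == CANVAS_W and 2 * QUAD_H == CANVAS_H
for n in range(1, 5):
    print(n, quad_rect(n))
```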

Once the dance party started, I cleared the smaller images and went full-wall with movie, loops and text, and some color mixing and effects.

I ran the whole show with TouchOSC on my iPhone (some frame-grabs are pictured here: layer-clip assignment, layer blending, and an RGB effect control). I’ve learned not to do this when working with bands: people think I’m some asshole texting during the show. But in this case, I didn’t have to leave the dance floor.

Lotus Lives Projection Documentation

The premiere performance of Lotus Lives took place in the Middlebury College Concert Hall, which is a beautiful space, but not built for rigging lights or set. It does have a curved balcony running around the entire room, and architectural beams that could support high-tension cable for hanging screens. I built a scale model and started from there.


My original plan was to have 3 to 5 projectors spread around the balcony, with one video playback computer per projector. Each computer would run a Max/Jitter patch, with video cues triggered from a networked central control computer. This would allow each system to play back a smaller video file, reducing the chances of slow playback or crashing, and would also mean shorter runs of expensive cable.

In the end, I went with the more-eggs-in-fewer-baskets approach, slicing out video from two computers to five projectors, which were almost all within arm’s reach of the control booth. I figured this would keep the different screens in perfect sync, and require fewer separate movies, making them faster to render and easier to manage.

Key to this plan was garageCUBE’s MadMapper projection mapping software. It uses the Syphon framework to let different Mac applications share the same space on the graphics card. Mapping is certainly doable within Max/Jitter, but I know that garageCUBE’s Modul8 VJ software has rock-solid under-the-hood performance, and MadMapper’s interface is friendlier than anything I could come up with in the time frame. I downloaded the beta version of MadMapper a minute after it was released, started using it with VidVox’s VDMX software for live VJ gigs, and loved the results.


My final setup, in order from user input to image output, was this:

1. AKAI APC-20 MIDI controller
2. into a Max patch on a MacBook Pro, which sent the custom MIDI data out to a network, and back to the controller for visual response (the APC-20 only officially plays with Ableton, but the buttons are controlled by a range of MIDI signals — more on that in a separate post).
3. Another Max patch received the MIDI data from the network, and was on every playback computer — in this case, just the same MacBook Pro and a Mac Pro tower, connected with a crossover cable. This patch sent the MIDI signal to VDMX.
4. VidVox’s VDMX for video playback. The programs on each computer were identical, but loaded with different video files. One controller, two (or more) computers.
4a. The media files were on external G-Raid drives. I swear by those. eSATA connection to the MBP, Firewire 800 to the tower (it was an older, borrowed machine).
4b. I used the Apple ProRes 422 (not HQ) codec for the movies. They were odd resolutions, larger than SD but smaller than 1920×1080, at 23.976 fps. I usually use Motion JPEG for VJ work to keep the processor happy, but found that ProRes 422 was something the Macs could handle, with a nice, sharp image.
4c. Several sections included audio playback as well. I went out through an M-Audio firewire box to the sound mixer’s board.
5. Out from VDMX to Syphon
6. From Syphon into MadMapper
7. From MadMapper out to a Matrox TripleHead2Go (digital version) hardware box. The computer sees it as a really wide monitor, but it splits the image out to two or three monitors/projectors.
8. TripleHead2Go to the projectors. The A-computer projectors were a trio of 4000 lumen XGA Panasonics with long lenses, and B-computer projectors were a 5500 lumen WXGA projector and 3000 lumen Optoma on stage, doing rear-projection on a set piece that looked like a dressing room mirror (with border). That was at the end of a 150′ VGA cable run, with VGA amp. Worked well.
9. There was also a 6th projector hooked up to a 3rd computer, which played a stand-alone loop at a key moment in the action. This filled the ceiling with rippling light, turning the visuals up to 11.
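The MIDI-over-network idea in steps 2–3 can be sketched in a few lines: the controller machine broadcasts raw 3-byte MIDI messages, and each playback computer forwards what it receives to its local VDMX. The UDP framing below is my stand-in for the actual Max patches, not their real protocol.

```python
import socket

def pack_midi(status, data1, data2):
    """Frame a raw 3-byte MIDI message (e.g. 0x90 = note-on, channel 1)."""
    return bytes([status, data1, data2])

def unpack_midi(msg):
    status, data1, data2 = msg
    return status, data1, data2

# Loopback demo: the "controller" machine sends a note-on, and a "playback"
# machine receives it, ready to hand off to VDMX.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))           # OS picks a free port
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(pack_midi(0x90, 60, 127), rx.getsockname())
print(unpack_midi(rx.recv(3)))      # -> (144, 60, 127)
```

In the real rig the same message would go to every playback machine at once, which is what keeps one controller driving two (or more) computers in step.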

The video was broken down into sections ranging from 30 seconds to 5 minutes long. The end of each section had a long tail of looping video, which would play until receiving the MIDI trigger, and then dissolve to the next clip. It would work like this:

Let’s say the fourth song has just begun. Clip 401 is playing in the “A” layer. The video crossfader is on A, so that’s being projected. This is happening on both computers. I press a button on the controller to load clip 402 into the B layer. It’s ready. The performers reach the cue in the music, and I press the GO button. Clip 402 starts playing, and VDMX crossfades from A to B. The crossfade speed is determined by one of the sliders on the controller, ranging from 0 to 5 seconds. Once layer A is clear, I press the eject button and the computer stops playing clip 401. Then I press the button to load clip 403 into A, and standby for the next cue.
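The A/B deck logic above can be modeled as a tiny state machine. This is a toy sketch of the workflow, not VDMX’s actual API:

```python
class CuePlayer:
    """Two-layer A/B deck: load into standby, GO crossfades, eject clears."""

    def __init__(self):
        self.layers = {"A": None, "B": None}
        self.live = "A"  # the layer the crossfader currently shows

    @property
    def standby(self):
        return "B" if self.live == "A" else "A"

    def load(self, clip):
        self.layers[self.standby] = clip      # cue up the next clip

    def go(self):
        self.live = self.standby              # crossfade to the standby layer

    def eject(self):
        self.layers[self.standby] = None      # stop the clip we faded away from

p = CuePlayer()
p.layers["A"] = "clip401"  # fourth song has begun; 401 is live
p.load("clip402")          # 402 cued in B
p.go()                     # performers hit the cue: crossfade A -> B
p.eject()                  # clear 401 from A
p.load("clip403")          # standby for the next cue
```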

In addition to this, I had a few extra layers that could be controlled manually. This way I was able to add new elements during rehearsal, and control some things by hand depending on the feel of the performance.

I found that even with VDMX set to pre-load all video into RAM, the visible layer would skip for a split second when the second movie was triggered, but just on some clips. It turned out that the first 20 or so clips to load when the program was launched would play smoothly, but later ones wouldn’t. This is less of a problem now with SSD playback drives, and maybe with a newer update of VDMX, but I got around it by putting clips with more movement at the top of the media bin.

One other hitch is that the first MacPro tower that I borrowed had two 128 MB graphics cards, but the software could only use one of them. I traded it for a 256 MB card and all was well. Again, not a concern with newer computers, but something to look into if building a system with multiple graphics cards.

All in all, everything worked out well. For future productions, I plan to finish writing my Max/Jitter patch to include playback, and make the eject/load and clip selection process more automatic and fool-proof. Single-button operation, or tied into the lighting board. The MadMapper license is limited to two machines, but like Max (Runtime), VDMX can run a project on any number of machines — the license is only required to make changes and save. All of these programs are fantastic.

Lower East Side Rig

Just wrapped up the premiere of Lotus Lives in Vermont, which had been in the works for years. Now it’s back from the opera to the Lower East Side with Ghost Ghost.

Club shows in New York require quick setup and a small footprint. Last night I put the projector and camera together on a tripod, to drop down on the floor by Charlie Kemmerer’s canvas when we were on. He paints during most shows, and we’ve been working on a live video/painting collaboration.


The photo doesn’t do it justice, but Charlie’s beast was on fire (actually a waterfall in reverse, red, plus some shimmering from a mylar balloon). I ran both projectors from an iPad with TouchOSC, through VDMX. It’s good to walk around and not spend the show leaning over a laptop. I’m working on getting rid of screens altogether.

I’ve also learned that it’s a bad idea to run OSC from my phone — more portable, but I look like some asshole who’s texting during the show.


Here are a few older pictures of my LES club projector rig in action — it’s a 3,000 lumen Optoma projector mounted on a piece of wood, with a baby wall plate, grip head, C-stand arm, and mafer clamp (plus a safety cable when hanging overhead). I’ve run everything through MadMapper since it was released in May, so I just have to point the projector in the general direction and can square the image up in software.