
Lotus Lives New Video Player

Earlier this year, I upgraded the programming and staging options for my projections in Su Lian Tan’s opera, Lotus Lives. It had been running on the live performance video software VDMX (which I love), but I wanted to create a more customized setup with an interactive cue sheet and single-button operation. I made a patch in Max, and ran it during a performance in Boston without a hitch.

[Image: the Lotus Lives stage]

The cue sheet is a table (a Max jit.cellblock object) with the following columns (a data sketch follows the list):

cue number
description
measure number (in the score)
cue notes (when to trigger the next cue)
duration
active screens
whether the media is a still image or movie
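
Outside of Max, the same rows could be modeled as plain records. Here is a minimal Python sketch; the field names and the sample values are hypothetical, not taken from the actual patch:

```python
from dataclasses import dataclass

@dataclass
class Cue:
    number: int          # cue number
    description: str     # what the clip shows
    measure: str         # measure number in the score
    notes: str           # when to trigger the next cue
    duration_sec: float  # estimated running time
    screens: list        # active screens, e.g. ["A", "B"]
    is_still: bool       # True for a still image, False for a movie

# A hypothetical row, roughly matching the table's columns:
cue_401 = Cue(401, "Song 4 opening", "m. 112", "on downbeat of m. 113",
              180.0, ["A", "B"], False)
```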

Here is my documentation for the video playback:

Overview

Lotus Lives is a chamber opera for two singers, a brass quintet, and percussion.

Video plays throughout the performance, sometimes acting as the set, and other times taking center stage.

It is designed to be flexible. A basic concert performance uses only one screen plus audio playback, while the full staging uses multiple projectors with video mapped onto 12 surfaces, and versions of intermediate complexity are possible. It should adapt to fit the performance space.

The video is broken into sections ranging from 30 seconds to 5 minutes long. The end of each section has a tail of extra video, which will play until the video operator launches the next clip. This way, the video remains in sync with the live performers, who don’t have to tailor their actions to the technology.
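
In code terms, the hold-on-the-tail behavior amounts to looping the last stretch of a clip until the operator cues on. A rough Python sketch, with a stand-in Clip class rather than the real Max patch:

```python
import time

class Clip:
    """Minimal stand-in for a movie clip; not the real player API."""
    def __init__(self, duration, tail_start):
        self.duration, self.tail_start = duration, tail_start
        self.position = 0.0

    def tick(self, dt):
        self.position += dt
        if self.position >= self.duration:   # reached the end of the tail
            self.position = self.tail_start  # loop the extra footage

def hold_until_go(clip, go_pressed, dt=1 / 30):
    """Advance the clip, looping its tail, until GO is pressed."""
    while not go_pressed():
        clip.tick(dt)
        time.sleep(dt)
```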

The playback software has two parts: the Controller and the Player.

The Controller is like a smart remote control, operated from a single computer to trigger the cues. The Player is the program that actually plays the media clips for projection.

Both can run on the same computer, or Players can run on several computers, one for each projector, all controlled from a single Controller over a network.
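
The patch itself presumably uses Max's networking objects for this; purely as an illustration of the idea, here is how a Controller might broadcast a cue message to its Players over UDP in Python. The message format, port, and addresses are all made up:

```python
import socket

PLAYERS = ["127.0.0.1", "192.168.1.21"]   # hypothetical Player addresses
PORT = 7400                                # made-up port number

def send_cue(cue_number, fade_seconds):
    """Tell every Player to crossfade to the given cue."""
    msg = f"go {cue_number} {fade_seconds}".encode()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for host in PLAYERS:
        sock.sendto(msg, (host, PORT))
    sock.close()

send_cue(401, 2.0)
```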

The software is written in a program called Max (Max/MSP/Jitter). If projecting onto multiple surfaces from one projector, additional video mapping software is needed. Technical details about the software and mapping are below.

It’s also possible to run this media on other performance playback software (Isadora, Resolume, VDMX, modul8, etc.), in which case the fade timing would need to be set according to the cue list.

The Set

[Image: stage model]

[Image: mirror set piece]

The video surfaces are:

(A) a large central screen above or behind the performers.
(B) four banners on either side of this screen (for a total of eight).
(C) a dressing room “mirror” set piece (best as rear-projection).
(D) projection across the width of the stage, onto a handheld scrim during the ballet sequence, and onto the performers as a lighting effect at other times.
(E) projection onto the walls and ceiling of the performance space, to fill the venue with rippling light during the climax of the Folktale.

The video is meant to be immersive, and the size and placement of the surfaces can be tailored to each production. The only things that need to be maintained are the aspect ratio of each surface and the relative spacing between the banners.

The aspect ratios are as follows (a worked pixel example follows the list):

(A) 1.78:1 (16:9)
(B) 1:4 for each banner, to be spaced 1/2 the banners’ width from each other, four on each side of surface A.
(C) 1.14:1, which is a 1:1 square with an additional border on the left, right, and top.
(D) 4:1. The handheld scrim should be a white or gray mesh suitable for projections, about 7′ high and the width of the stage, or at least 30′.
(E) This is an abstract rippling texture meant to fill as much of the performance space’s ceiling and walls as possible at one point during the Folktale. While the source movie is 16:9, the projected aspect ratio does not matter.
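
As a worked example of how the surface B numbers combine, here is a short Python calculation. The 1080 px banner height is a hypothetical render size, not a value from the production:

```python
BANNER_AR = 1 / 4   # width : height for each banner
GAP_RATIO = 1 / 2   # gap between banners, as a fraction of banner width

banner_h = 1080                      # hypothetical render height in px
banner_w = banner_h * BANNER_AR      # 270 px wide per banner

# Width of one group of four banners, with three gaps between them:
group_w = 4 * banner_w + 3 * (GAP_RATIO * banner_w)   # 5.5 banner widths
print(banner_w, group_w)             # 270.0, 1485.0
```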

The Media

The “media” folder contains QuickTime movies and audio for playback. These use the ProRes 422 LT codec, which has a lower data rate than the master clips (saved as ProRes 422 HQ) but maintains quality.

There is also an audio folder of .aif files, which are to be updated with recordings by the performers. See “Setting Up Audio Clips” below for details.

There are four versions of the video, which are configurations for different projection setups.

V1: This is for running projections from multiple networked computers. There is one screen per video, with the exception of surface B.

For surface B, all eight banners are composited onto this video, so it will need to be sliced up with mapping software.

V2: This is the version for one screen only. Critical elements that would be lost by eliminating surfaces B-E are included on this single, main-screen video.

V3: This has all the surfaces composited onto one large movie, to be mapped onto multiple screens from one projector, or multiple projectors from one computer.

V4: This is surfaces A and E composited into one movie, since it’s likely that a single projector can be used for both surfaces. Mapping is required.

I have prepared and included MadMapper files for V1-B, V3, and V4.

Setting Up the Computer

While Max runs on Mac or Windows, I have only tested this patch on a Mac. Additionally, the output for mapping with MadMapper uses a Mac-only framework called Syphon.

You will need to install:

– Max (version 7 or later)
Max is free when used only to run patches like the Lotus Controller and Player; a paid license is needed only for saving changes, once the free trial period ends.

– Apple ProRes codec
Probably installed on any Mac with QuickTime; also available for download from Apple.

If mapping the video output:

– Syphon for Jitter
Syphon is a Mac OS framework that allows multiple applications to share the same space in video memory, without a hit to performance. This is how the video gets to the mapping software.

To install Syphon, unzip the package, then move the Syphon folder into Users/[user]/Documents/Max 7/Packages

– Mapping software of choice
I use MadMapper.
It does require a paid license, but it’s easy to use and runs beautifully. There are other options (search for “projection mapping software”). Max can also handle mapping, although this Player isn’t set up for it.

Setting Up Audio Clips

In addition to movies 301 and 401, which have stereo audio tracks, there are four more separate audio clips that will play back in sync with the video. These are recordings of the performers, and need to be prepared for each production.

The reference folder of the Lotus hard drive contains QuickTime movies of the subtitled narration, which can be read karaoke-style for exact timing.

Once the new audio files are placed in the media/audio folder of the playback drive, with the specified file name, the Player will play them back at the correct point during the performance.

The Lotus Player

This runs the video and audio for Lotus Lives, controlled by the Lotus Lives Controller. It should be on the computer that’s hooked up to the projector.

Double-clicking Lotus Player.maxpat will launch Max, and open the Player.

[Image: the Lotus Player interface]

SETUP:

1. Select which surface video you want to run.

2. Click CHECK FILE PATHS to make sure the Player can find the media. If the media is on a drive other than “lotus,” click Set Movie / Audio File Path and find the folder with the media. (A rough sketch of this check follows these steps.)

3. If the Controller is on the same computer, leave “controller” set to “local.” If it’s on a different computer on the same network, select “network.” Be sure “network” is selected on the Controller too.

4. Set the video output:

4a. If projecting directly from the Player, move the “projector” window to the projector display. If the projector is attached when launching the Player, the “projector” window will already be on the second display.

4b. If mapping the video output with a program that uses the Syphon framework (like MadMapper), select “Syphon,” then launch the program and use that for display.

5. Test the audio, and set levels for the individual clips. From the Controller, select cue 301 or 401 for movies with audio. Press “play” below the level sliders on the Player for the additional clips.

5a. The audio clip levels will not save when the Player is closed, but you can make note of the numerical setting, and adjust it the next time you launch the Player.

5b. The beat in 601 should be played live, so by default it will not play back; it can, however, be cued for playback by selecting the toggle next to its level slider.
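
To make step 2 concrete, here is roughly what a file-path check does, sketched in Python; the folder layout follows this document, but the clip names in `required` are placeholders:

```python
from pathlib import Path

def check_file_paths(media_root, required=("101.mov", "301.mov")):
    """Verify the Player can find its media; 'required' is a stand-in
    list, since the real clip names aren't given here."""
    root = Path(media_root)
    missing = [f for f in required if not (root / f).exists()]
    if not (root / "audio").is_dir():
        missing.append("audio/")
    return missing   # an empty list means everything was found

# e.g. check_file_paths("/Volumes/lotus/media")
```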

OTHER CONTROLS:

window visible – toggles whether the “projector” window is visible. Turns off if Syphon is selected.

video render – refreshes the video screen. Video will not appear to play if this is off.

audio – turns audio playback on and off.

video fullscreen – toggles whether the “projector” window is fullscreen. Also activated by the escape key.

hide menubar & cursor on fullscreen – use this option if presenting the window on the same screen as the Player, i.e., if the projector is the only display.

Load Calibration Grid – this will load a calibration grid for the selected surface.

play, pause, restart, eject – controls playback of the video in either bank.

slider – A/B fade. Operates automatically when the GO button is triggered on the Controller. (A sketch of the fade ramp follows this list.)

“X” toggle next to audio sliders – enables or disables individual audio clips.

play for audio sliders – manual playback of audio clips, for testing purposes.
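
The automated A/B fade mentioned above is essentially a timed ramp, which Max handles with its line object. A minimal Python sketch of the same idea:

```python
def fade_steps(duration_sec, fps=30):
    """Yield crossfader positions from 0.0 (bank A) to 1.0 (bank B)
    over the given duration: a linear ramp, like Max's line object."""
    n = max(1, int(duration_sec * fps))
    for i in range(n + 1):
        yield i / n

for x in fade_steps(2.0):
    pass  # send x to the A/B crossfader once per frame
```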

The Lotus Controller

This controls the video Player(s), which can be on this computer, or networked over several different computers.

Double-clicking Lotus Controller.maxpat will launch Max, and open the Controller.

[Image: the Lotus Controller interface]

TO RUN THE SHOW:

1. Launch the Controller and the Player(s)

2. Set the settings on the Player(s)

3. START THINGS RUNNING by pressing the “Run” button

4. Go to the first cue by pressing “go to beginning of show,” or the GO button several times, until the CURRENT Cue # is “1 – BLACK”

5. Press GO or the space bar to trigger the next cue

Duration is an estimated countdown to the next cue. Actual time will vary depending on the performance, but it will let you know when to be ready.

Also keep an eye on Cue Notes, which is a description of when the next cue occurs.

OTHER CONTROLS:

Black – toggles a fast fade to / from black, and pauses the active movie.

Grid – activates a calibration grid on all Players.

CURRENT Cue # and Description – what’s playing now.

NEXT Cue # and Description – what’s cued up to play when GO is pressed. NEXT Cue # is a dropdown menu, so you can jump directly to any cue.

Fade is the duration of the crossfade from the current clip to the next. It can be adjusted manually, but is set automatically according to the cue list.

Measure – The measure of the next cue in the score.

play – plays the active movie

pause – pauses the active movie

restart – goes to the beginning of the active movie

eject – clears the active movie from the Player

previous and next – step backward and forward through the cue list to choose the next cue to be loaded (see the sketch after this list).

go to beginning of show – loads the first cue up next

open cue list – this is the cue sheet in table form, which is where all the playback data is stored. Editing this will affect the show’s playback.

Local / Network – If the Controller and the Player are on the same computer, keep the lower-right setting on local. If networking several computers, select network on the Controller and all Players. It is recommended to have a dedicated network, wired if possible.
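
Behind those controls, cue navigation reduces to managing an index into the cue list. A hedged Python sketch; the method names mirror the buttons but are otherwise hypothetical:

```python
class CueList:
    """Tracks the current and next cues, mirroring the Controller's
    GO / previous / next / go-to-beginning behavior (hypothetical API)."""
    def __init__(self, cues):
        self.cues = cues          # e.g. a list of Cue records
        self.current = None
        self.next_index = 0

    def go(self):                 # GO button / space bar
        self.current = self.cues[self.next_index]
        self.next_index = min(self.next_index + 1, len(self.cues) - 1)
        return self.current

    def previous(self):           # step the NEXT cue backward
        self.next_index = max(self.next_index - 1, 0)

    def next(self):               # step the NEXT cue forward
        self.next_index = min(self.next_index + 1, len(self.cues) - 1)

    def go_to_beginning(self):    # load the first cue up next
        self.next_index = 0
```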

Talking Opera at ITP

I’ve been getting my hands dirty at ITP Camp. NYU Tisch School of the Arts’ Interactive Telecommunications Program is a two-year grad program focused on technology in the arts, and Camp is where they let working professionals crash the party for the month of June.

There was a focus on many of the tools I used in Lotus Lives — Max, VDMX, MadMapper, After Effects, laser cutters, etc. Technical workshops are useful, but I always appreciate hearing stories of real-world application. So I gave a presentation about bringing everything together in an actual performance.

The fun part was breaking out the 1:24 scale model of the concert hall where the premiere performance was staged. I used it during development to help visualize how the projections would fill the space. I’d projected rough versions of the video back then, but this time I projected the final elements, including a recording of the musicians on stage.


I also covered:

– Designing a concept that would be appropriate to the story and feasible with our resources.
– Creating a playback system that could adapt to the performers in a changing, live situation.
– Designing the set for the video, and vice versa.
– Shooting the content: gathering images on location in Malaysia, designing and building shadow puppets (with lasers), and collaborating with dancers.
– Editing and compositing the content.
– Prepping the video for mapping.
– Designing the playback for six projectors, and making it as fail-safe as possible for a live performance.

It’s the first time I’ve covered the breadth of the project at once. I’ve already written up a post on the playback system here, and will cover other elements when I get a chance.

Lotus Lives Projection Documentation

The premiere performance of Lotus Lives took place in the Middlebury College Concert Hall, which is a beautiful space, but not built for rigging lights or set pieces. It does have a curved balcony running around the entire room, and architectural beams that could support high-tension cable for hanging screens. I built a scale model and started from there.

[Image: scale model of the concert hall]

My original plan was to have 3 to 5 projectors spread out around the balcony, with one video playback computer per projector. Each computer would run a Max/Jitter patch, with video cues triggered from a networked central control computer. This would allow each system to play back a smaller video file, reducing the chances of slow playback or crashing, and also mean shorter runs of expensive cable.

In the end, I went with the more-eggs-in-fewer-baskets approach, slicing out video from two computers to five projectors, which were almost all within arm’s reach of the control booth. I figured this would keep the different screens in perfect sync, and require fewer separate movies, making them faster to render and easier to manage.

Key to this plan was garageCUBE’s MadMapper projection mapping software. It uses the Syphon framework to allow different Mac applications to share the same space on the graphics card. Mapping is certainly doable within Max/Jitter, but I know that garageCUBE’s Modul8 VJ software has rock-solid under-the-hood performance, and MadMapper’s interface is friendlier than anything I could come up with in the time frame. I downloaded the beta version of MadMapper a minute after it was released, started using it with VidVox’s VDMX software for live VJ gigs, and loved the results.

[Image: control setup]

My final setup, in order from user input to image output, was this:

1. AKAI APC-20 MIDI controller
2. into a Max patch on a MacBook Pro, which sent the custom MIDI data out to a network, and back to the controller for visual response (the APC-20 only officially plays with Ableton, but the buttons are controlled by a range of MIDI signals; more on that in a separate post, and a rough sketch follows this list).
3. Another Max patch received the MIDI data from the network, and was on every playback computer — in this case, just the same MacBook Pro and a Mac Pro tower, connected with a crossover cable. This patch sent the MIDI signal to VDMX.
4. VidVox’s VDMX for video playback. The programs on each computer were identical, but loaded with different video files. One controller, two (or more) computers.
4a. The media files were on external G-Raid drives. I swear by those. eSATA connection to the MBP, Firewire 800 to the tower (it was an older, borrowed machine).
4b. I used the Apple ProRes 422 (not HQ) codec for the movies. They were odd resolutions, larger than SD but smaller than 1920×1080, at 23.976 fps. I usually use Motion JPEG for VJ work to keep the processor happy, but found that ProRes 422 was something the Macs could handle, with a nice, sharp image.
4c. Several sections included audio playback as well. I went out through an M-Audio firewire box to the sound mixer’s board.
5. Out from VDMX to Syphon
6. From Syphon into MadMapper
7. From MadMapper out to a Matrox TripleHead2Go (digital version) hardware box. The computer sees it as a really wide monitor, but it splits the image out to two or three monitors/projectors.
8. TripleHead2Go to the projectors. The A-computer projectors were a trio of 4000 lumen XGA Panasonics with long lenses, and B-computer projectors were a 5500 lumen WXGA projector and 3000 lumen Optoma on stage, doing rear-projection on a set piece that looked like a dressing room mirror (with border). That was at the end of a 150′ VGA cable run, with VGA amp. Worked well.
9. There was also a 6th projector hooked up to a 3rd computer, which played a stand-alone loop at a key moment in the action. This filled the ceiling with rippling light, turning the visuals up to 11.
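
As a taste of the MIDI feedback trick in step 2 (saved for that separate post), here is a rough Python sketch using the mido library. The port name, note number, and velocity are placeholders, not the APC-20's actual map:

```python
# Requires the mido library (pip install mido python-rtmidi).
import mido

# Port names vary by system; list them with mido.get_output_names().
out = mido.open_output("Akai APC20")   # hypothetical port name

def light_button(note, color_velocity):
    """Light an LED on the controller by echoing a note-on back to it.
    On the APC-20 the velocity selects the LED state; the values used
    here are placeholders, not the device's actual map."""
    out.send(mido.Message("note_on", note=note, velocity=color_velocity))

light_button(53, 1)   # e.g. switch one clip-launch LED on
```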

The video was broken down into sections ranging from 30 seconds to 5 minutes long. The end of each section had a long tail of looping video, which would play until receiving the MIDI trigger, and then dissolve to the next clip. It would work like this:

Let’s say the fourth song has just begun. Clip 401 is playing in the “A” layer. The video crossfader is on A, so that’s being projected. This is happening on both computers. I press a button on the controller to load clip 402 into the B layer. It’s ready. The performers reach the cue in the music, and I press the GO button. Clip 402 starts playing, and VDMX crossfades from A to B. The crossfade speed is determined by one of the sliders on the controller, ranging from 0 to 5 seconds. Once layer A is clear, I press the eject button and the computer stops playing clip 401. Then I press the button to load clip 403 into A, and standby for the next cue.
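
That workflow maps onto a small state machine. Here is a hedged Python sketch of the load/GO/eject cycle; the clip handling is simulated with print statements rather than real video calls:

```python
class ABPlayback:
    """Two banks with a crossfader, mimicking the VDMX workflow
    described above (all method names are hypothetical)."""
    def __init__(self):
        self.banks = {"A": None, "B": None}
        self.live = "A"                      # which bank is visible

    @property
    def standby(self):
        return "B" if self.live == "A" else "A"

    def load(self, clip):                    # load next clip into idle bank
        self.banks[self.standby] = clip

    def go(self, fade_seconds):              # GO: play + crossfade
        clip = self.banks[self.standby]
        print(f"play {clip}, fade {self.live}->{self.standby} "
              f"over {fade_seconds}s")
        self.live = self.standby

    def eject(self):                         # clear the now-idle bank
        self.banks[self.standby] = None

deck = ABPlayback()
deck.load("402"); deck.go(2.0); deck.eject(); deck.load("403")
```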

In addition to this, I had a few extra layers that could be controlled manually. This way I was able to add new elements during rehearsal, and control some things by hand depending on the feel of the performance.

I found that even with VDMX set to pre-load all video into RAM, the visible layer would skip for a split second when the second movie was triggered, but only on some clips. It turned out that the first 20 or so clips loaded when the program launched would play smoothly, but later ones wouldn’t. This is less of a problem now with SSD playback drives, and maybe with a newer update of VDMX, but I got around it by putting the clips with more movement at the top of the media bin.

One other hitch is that the first Mac Pro tower that I borrowed had two 128 MB graphics cards, but the software could only use one of them. I traded it for a 256 MB card and all was well. Again, not a concern with newer computers, but something to look into if building a system with multiple graphics cards.

All in all, everything worked out well. For future productions, I plan to finish writing my Max/Jitter patch to include playback, and make the eject/load and clip selection process more automatic and foolproof: single-button operation, or tied into the lighting board. The MadMapper license is limited to two machines, but like Max (Runtime), VDMX can run a project on any number of machines; the license is only required to make changes and save. All of these programs are fantastic.