
Lotus Lives New Video Player

Earlier this year, I upgraded the programming and staging options for my projections in Su Lian Tan’s opera, Lotus Lives. It had been running on the live performance video software VDMX (which I love), but I wanted to create a more customized setup with an interactive cue sheet and single-button operation. I made a patch in Max, and ran it during a performance in Boston without a hitch.

[Image: Lotus Lives on stage]

The cue sheet is a table (a Max jit.cellblock object) with the following columns (sketched as a data record after the list):

cue number
description
measure number (in the score)
cue notes (when to trigger the next cue)
duration
active screens
whether the media is a still image or movie
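
To give a sense of the data involved, here's a minimal sketch of one row as a record. The field names are hypothetical; the actual patch stores these columns in the jit.cellblock table, not a struct:

#include <string>
#include <vector>

// Hypothetical sketch of one cue-sheet row (the real patch keeps these
// columns in a Max jit.cellblock table).
struct Cue {
    int number;                // cue number
    std::string description;   // description
    int measure;               // measure number in the score
    std::string notes;         // when to trigger the next cue
    double durationSec;        // estimated duration
    std::string screens;       // active screens, e.g. "A B"
    bool isStill;              // still image (true) or movie (false)
};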

Here is my documentation for the video playback:

Overview

Lotus Lives is a chamber opera for two singers, a brass quintet, and percussion.

Video plays throughout the performance, sometimes acting as the set, and other times taking center stage.

It is designed to be flexible. A basic concert performance uses only one screen plus audio playback, the full staging uses multiple projectors with video mapped onto 12 surfaces, and versions of intermediate complexity are possible. It should adapt to fit the performance space.

The video is broken into sections ranging from 30 seconds to 5 minutes long. The end of each section has a tail of extra video, which will play until the video operator launches the next clip. This way, the video remains in sync with the live performers, who never have to tailor their actions to the technology.
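
In playback terms, the rule is simply that nothing auto-advances: once the playhead passes the end of the scored section, the tail footage is what's showing until the next cue arrives. A sketch of that idea, with hypothetical names:

// Hypothetical: the player never advances on its own; past the nominal end
// of a section, the extra tail footage covers the wait for the next cue.
struct Section {
    double nominalEndSec;  // where the scored section ends
    double tailEndSec;     // end of the extra tail footage
};

bool inTail(const Section& s, double playheadSec) {
    return playheadSec >= s.nominalEndSec && playheadSec < s.tailEndSec;
}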

The playback software has two parts: the Controller and the Player.

The Controller is like a smart remote control, operated from a single computer to trigger the cues. The Player is the program that actually plays the media clips for projection.

Both can run on the same computer, or Players can run on multiple computers (one for each projector), controlled from a single Controller over a network.

The software is written in Max (Max/MSP/Jitter), a visual programming environment. If projecting onto multiple surfaces from one projector, additional video-mapping software is needed. Technical details about the software and mapping are below.

It’s also possible to run this media on other performance playback software (Isadora, Resolume, VDMX, modul8, etc.), in which case the fade timing would need to be set according to the cue list.

The Set

[Image: stage model]

[Image: the mirror set piece]

The video surfaces are:

(A) a large central screen above or behind the performers.
(B) four banners on either side of this screen (for a total of eight).
(C) a dressing room “mirror” set piece (best as rear-projection).
(D) projection across the width of the stage, onto a handheld scrim during the ballet sequence, and onto the performers as a lighting effect at other times.
(E) projection onto the walls and ceiling of the performance space, to fill the venue with rippling light during the climax of the Folktale.

The video is meant to be immersive, and the size and placement of the surfaces can be tailored to each production. The only things that need to be maintained are the aspect ratio of each surface and the relative spacing between the banners.

The aspect ratios are:

(A) 1.78:1 (16:9)
(B) 1:4 for each banner, to be spaced 1/2 the banners’ width from each other, four on each side of surface A.
(C) 1.14:1, which is a 1:1 square with an additional border on the left, right, and top.
(D) 4:1. The handheld scrim should be a white or gray mesh suitable for projections, about 7′ high and the width of the stage, or at least 30′.
(E) This is an abstract rippling texture meant to fill as much of the performance space’s ceiling and walls as possible at one point during the Folktale. While the source movie is 16:9, the projected aspect ratio does not matter.

The Media

The “media” folder contains QuickTime movies and audio for playback. These use the ProRes 422-LT codec, which has a lower data rate than the master clips (saved as ProRes 422-HQ) but maintains quality.

There is also an audio folder containing .aif audio files, which are to be updated with recordings by the performers. See “Setting Up Audio Clips” below for details.

There are four versions of the video, which are configurations for different projection setups.

V1: This is for running projections from multiple networked computers. There is one screen per video, with the exception of surface B.

For surface B, all eight banners are composited onto this video, so it will need to be sliced up with mapping software (a sketch of the slicing math follows this list).

V2: This is the version for one screen only. Critical elements that would be lost by eliminating surfaces B-E are included on this single, main-screen video.

V3: This has all the surfaces composited onto one large movie, to be mapped onto multiple screens from one projector, or multiple projectors from one computer.

V4: This is surfaces A and E composited into one movie, since it’s likely that a single projector can be used for both surfaces. Mapping is required.

I have prepared and included MadMapper files for V1-B, V3, and V4.
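
As a rough illustration of the slicing math mentioned under V1, assuming the banners sit left to right in the composite with the specified half-width gaps (a hypothetical layout; the included MadMapper file defines the real one):

// Hypothetical: x-offset in pixels of banner i (0-7) within the composite
// B movie, for banners w pixels wide separated by half-width gaps.
int bannerX(int i, int w) { return i * (w + w / 2); }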

Setting Up the Computer

While Max runs on Mac or Windows, I have only tested this patch on Mac. Additionally, the output for mapping with MadMapper uses a Mac-only framework called Syphon.

You will need to install:

– Max (version 7 or later)
Max is free when used just to run files like the Lotus Controller and Player; after the free trial period, a paid license is needed only for saving changes.

– Apple ProRes codec
Probably installed on any Mac with QuickTime; also available for download from Apple.

If mapping the video output:

– Syphon for Jitter
Syphon is a Mac OS framework that allows multiple applications to share the same space in video memory, without a hit to performance. This is how the video gets to the mapping software.

To install Syphon, unzip the package, then move the Syphon folder into Users/[user]/Documents/Max 7/Packages

– Mapping software of choice
I use MadMapper.
It does require a paid license, but it’s easy to use and runs beautifully. There are other options (search for “projection mapping software”). Max can also handle mapping, although this Player isn’t set up for it.

Setting Up Audio Clips

In addition to movies 301 and 401, which have stereo audio tracks, there are four more separate audio clips that will play back in sync with the video. These are recordings of the performers, and need to be prepared for each production.

The reference folder of the Lotus hard drive contains QuickTime movies of the subtitled narration, which can be read karaoke-style for exact timing.

Once the new audio files are placed in the media/audio folder of the playback drive, with the specified file name, the Player will play them back at the correct point during the performance.

The Lotus Player

This runs the video and audio for Lotus Lives, controlled by the Lotus Lives Controller. It should be on the computer that’s hooked up to the projector.

Double-clicking Lotus Player.maxpat will launch Max, and open the Player.

[Image: the Lotus Player]

SETUP:

1. Select which surface video you want to run.

2. Click CHECK FILE PATHS to make sure the Player can find the media. If the media is on a drive other than “lotus,” click Set Movie / Audio File Path and find the folder with the media.

3. If the Controller is on the same computer, leave “controller” set to “local.” If it’s on a different computer on the same network, select “network.” Be sure “network” is selected on the Controller too.

4. Set the video output:

4a. If projecting directly from the Player, move the “projector” window to the projector display. If the projector is attached when launching the Player, the “projector” window will already be on the second display.

4b. If mapping the video output with a program that uses the Syphon framework (like MadMapper), select “Syphon,” then launch the program and use that for display.

5. Test the audio, and set levels for the individual clips. From the Controller, select cue 301 or 401 for the movies with audio. Press “play” below the level sliders on the Player for the additional clips.

5a. The audio clip levels will not save when the Player is closed, but you can make note of the numerical setting, and adjust it the next time you launch the Player.

5b. The beat in 601 should be played live, so by default it will not play; but it can be cued for playback by selecting the toggle next to its level slider.

OTHER CONTROLS:

window visible – toggles whether the “projector” window is visible. Turns off if Syphon is selected.

video render – refreshes the video screen. Video will not appear to play if this is off.

audio – turns audio playback on and off.

video fullscreen – toggles whether the “projector” window is fullscreen. Also activated by the escape key.

hide menubar & cursor on fullscreen – use this option if presenting the window on the same screen as the Player, i.e. if the projector is the only display.

Load Calibration Grid – this will load a calibration grid for the selected surface.

play, pause, restart, eject – controls playback of the video in either bank.

slider – A/B fade. Operates automatically when the GO button is triggered on the Controller.

“X” toggle next to audio sliders – enables or disables individual audio clips.

play for audio sliders – manual playback of audio clips, for testing purposes.

The Lotus Controller

This controls the video Player(s), which can be on the same computer or networked across several computers.

Double-clicking Lotus Controller.maxpat will launch Max, and open the Controller.

[Image: the Lotus Controller]

TO RUN THE SHOW:

1. Launch the Controller and the Player(s)

2. Set the settings on the Player(s)

3. START THINGS RUNNING by pressing the “Run” button

4. Go to the first cue by pressing “go to beginning of show,” or the GO button several times, until the CURRENT Cue # is “1 – BLACK”

5. Press GO or the space bar to trigger the next cue

Duration is an estimated countdown to the next cue. Actual time will vary depending on the performance, but it will let you know when to be ready.

Also keep an eye on Cue Notes, which is a description of when the next cue occurs.

OTHER CONTROLS:

Black – toggles a fast fade to / from black, and pauses the active movie.

Grid – activates a calibration grid on all Players.

CURRENT Cue # and Description – what’s playing now.

NEXT Cue # and Description – what’s cued up to play when GO is pressed. NEXT Cue # is a dropdown menu, so you can jump directly to any cue.

Fade – the duration of the crossfade from the current clip to the next. It can be adjusted manually, but is set automatically according to the cue list (a sketch of the fade ramp follows this list).

Measure – The measure of the next cue in the score.

play – plays the active movie

pause – pauses the active movie

restart – goes to the beginning of the active movie

eject – clears the active movie from the Player

previous and next – step backward and forward through which cue will be loaded next.

go to beginning of show – loads the first cue up next

open cue list – this is the cue sheet in table form, which is where all the playback data is stored. Editing this will affect the show’s playback.

Local / Network – If the Controller and the Player are on the same computer, keep the lower-right setting on local. If networking several computers, select network on the Controller and all Players. It is recommended to have a dedicated network, wired if possible.
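
About the Fade control above: under the hood, a crossfade is just a timed A/B ramp. A minimal sketch of the idea (the patch's actual fade curve may differ):

#include <algorithm>

// Hypothetical linear crossfade: t = seconds since GO, fadeSec from the cue
// list. Returns the incoming clip's gain; the outgoing clip's gain is 1 - x.
float fadePosition(float t, float fadeSec) {
    float x = std::min(t / fadeSec, 1.0f);
    return x;
}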

Standup Reminder

I rarely sit down from call time to wrap during film production. But when editing, the reverse is true, and I find that’s far worse for the body.

Here’s a reminder program I built to run in the corner of my desktop.

[Image: the Standup Reminder]

When the blue dial lights up (every 12 minutes), I drink water; the green dial (30 minutes), I stand up and stretch for a moment; yellow (55 minutes), I take a walk and look outside.

[Image: Standup Reminder settings]

Times are adjustable. If you ignore a reminder and it goes around twice, it turns red. Click on the dial to reset.
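
The logic behind each dial is tiny. A sketch with hypothetical names (the real version is a Max patch):

// Hypothetical reminder dial: a period in minutes, an elapsed counter,
// "red" after two ignored laps, and click-to-reset.
struct Reminder {
    int periodMin;          // e.g. 12 (water), 30 (stretch), 55 (walk)
    double elapsedMin = 0;  // advanced by a clock tick
    bool due() const { return elapsedMin >= periodMin; }
    bool red() const { return elapsedMin >= 2 * periodMin; }  // gone around twice
    void reset() { elapsedMin = 0; }  // clicking the dial
};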

It works!

The Standup Reminder uses a few simple UI elements in the graphical programming language Max. I was able to export it as a standalone Mac app for anyone who wants to give it a go, although it’s 73 MB for some reason (the price of using Max for something so simple). Download that here.

If you have Max, download the patch here.

Or copy and paste this into a patcher window:


———-begin_max5_patcher———-
2321.3oc2cszbiaiD9r7uBTLUtrkGs3Eekcusm2sRNmJ0TTjP1XBEoJ9vimj
J+2WP.RIJOR1fRtQAGOiDsfDI95Fcit+Z.Q+m2sJXS8yh1.zOg9UzpU+4cqV
oaZngUiudUvtrmyKyZ0erfch11rGDA2aduNwyc51+prpn9qnV4eHPDLV+Hhi
QLbz8nw2T7rHe5DKkUh759J8YSGartuqTz08s8BCjBBP+13aIKzcS8lu7onv
oKRa22J0ezfoVp52IqTWi14WVUilqrtUxXq6y5xeTV8vmaD4cl9KMNYM9dDM
MVefjNbfEtFi9sgy4ut6tgmt2RUUk3qJ39cZpx5rhMYUODbIgV+lmSviirTv
IWqfGoODg0ufdkxcd8tcB0.6KE7+8m9Dp6QYKR8+9VQAZacCZSurrPAFT198
kx7rNYcER8Akcn7rJzFApPnD.0mVtE8k91NTSeUk9DZQYHsrfjUn+a1ym0zh
eF0X3MoFwWVMRvgZSFRHec3vgvI046oAzfNT24hlKZCEb+kbdX2jvSeSaHFW
qBhROZCM0gYOMAue3+M.vSDDTv+Z3oe3WFdVTUXZ48Tu8OPzKpujCVKmScgg
ZtlXLSqtHFmszAClq1k6BRbqx0IuCcTHN+bMWzVgbSB+qXqDEwFjWFOb3PX5
MMeykBIs4g75R0TLDDV8ORvxBxPgZfeR1i0xd5sMW66nodXLTRLKJRKwdrod
XBTl5LZrYZQu0TOLErA9QY26L0ifRhC0F2pjd8VKcNEJKcJ1XnS7UCcVBTC6
ihdLAN671trphrx5pCJk4RVLTozllvmkQ6Expq3yJPnNkOm000H2z2Y3Tt5f
DtJHqrzvBrMKuSZRDbraUuYegrtse+95ltYnToo5qJJExBEcB4VoJMQc9uxc
axZzC3qqDS7LVEn3KzkWV2J5p2WJdRTNlaY6I8UtX6rd5Xy06+14ZuHqKaSV
6o3UMlHpZULUZOAsOnZ7LWhcxB44Dtp5IpfEhsBEymAZeyNu5mDMEMuTWsuQ
rcqrTTksyLNqX9fhQ+rpYQinJWo5m9nshrl7GUrr1IaaUW8gS6Tkgxnpqu8I
YqbS4ocihfl4zUJwG0ui9Mz1xuSVzMnxMKLRksL2vKNoLp1LO1XeGCgC7UHt
bnDWZJaMS8Cm6axLOBJYlsNc3GBrRb6LIdtXEB7DzpfNQC+v0oc.ffUHJy9F
JDiWdcq3bvJXmQ3C0LphXPH2We453LvJWmQro5zM.pZcJ7oiiOqccpWFDfWm
DQi4ptW8aw3vjDyukvYJGLxr7ClOe1Aams0UcayxEykyuWIoB00pfltlfyjd
s7RRn5xLQMuXVGdVU6PGdHh4+otuQdr3YV6pkDoSnmfiuPWOGv5d+8jlyqjv
+E44PfxwKgDqUELMQGFEJOuAd.nig8sLtIEJGuQw1D.I9Fmmse2liFg6yZTl
mchlOqxr6zjwNe0cu3zNguxzNWvihmN2mJD+cI4eaUFdxXw32blKuu42vAqT
X7Dt67aX3El6UBT9MihsW62vSWreyXEMgxuYxX4iieCXwaHQZGF232PnKzuA
CkeynX609Mrk62PXf52LYr.peS+l9tN0o+VEg8PwXu+x9MGbadLqpnd6Vql6
wj2+nh73gknC4uRAQ3LSc+Fd176LGkp6OdMS8.VE8iiIF6G8ZXvR.onA2VI8
iAaYZ4rYxN3kzmnrloKsn9TvxUaR5CSLylfAU7wqiFevWrJHEXU.IQWfEBEl
U1HOqTfvJy+gkz433okKpisAeitVOey3e5sY8WHyJupXuaUokLaJutGk4+dk
vbUM1jihkPTnlE4XIZzf+jmNq56LorUH1l0WdnjPiVnSEeI59om3mdUGftdq
WM6COEgB+RHTHdnQXVMBUjL3C0Yw3rYW+QHjCGbQzt+4vn3xi2w.b+ojLK6I
NGjY8Z6p2uP+bFb6DKxw0CfARQjyKqy+cQyqOT+BO84xNG3Aa+lfQHTaYwIS
cCqRfJjtgm.5QQi3U27yyk3HnVsno.55BxwCmr3uJA98jAT5Uv.ZLT.TLfXD
ipJ8CBCHBXaUUN26o.Q.aqpNZF3FNPpLfWmpdvVJE.BXoCLI+irfHPyBJbjE
T3hUALfUA9LKHRBTrflb9+aMKHJdorfBOvBJzQrfdOB2Yw37GJVPTvVvAFyK
YAQgaddCKXFwaYAQY.OX60rfnbnXAMYp6arfnfsm4lxog3YrfHwWAMHPIAMl
qC6CBGHvpSBgl36TfXv985vQDfVSMKBzhWCDf+tc3T9OFNPKUCDAqFvqo+jB
E8GBG+2e5Ojnqh9yDGHmP+ATxOSixenH+P.69fgIkHei6CArn6i24BB8VpOI
vNR626mYn38LZk6czdfh0yThLuBoGs9WivWbSARe4FZ+TcPaceS9zn3353nx
c6376scxpCSY9qG4wiNNw5ixhBQ0brWHaGLYJN70N76FLdOgCwcvI4jd5BvI
zYvY39IzahmHGhGKfCM1Yv4ER9ETOty5Y3VRxahmg8yiqvCwF8C0c3AaCdHm
Lk.n3wlgKryTOC2lOda2KGhGalM7EfFT7XyzOZP6FymPareFtad3J7XyzgC6
TIWMdYCdnNz9gaCdX9EdbXxOgLa7ubW3qPaBmF5P8iMgSicm6tUQ2w9k5Ixu
TO5n6T2fGaCVf8G33UnwcN5baRDi6NdNbaRDi6t.obqRDK1uvyvFvxQt51gG
2wywN7fcGdrIw4D2Y9XCbnQtCO1jWH2cU8gaSdgb2ErfaUYVbn8iMYZDF4W3
g6NuchkA28H3Lrb0tJXgMEEi4vf61pdbCbXVkKl6HkxrZIBbmyNKzRRodDbh
HdFdbGbrw6h4NVfLleUxPlUg1cWpOLreshN1LbQcWlOzT+R8P8rImowdl9Ix
yzObaWQGOBODhmgG2UhWpUy+3P8C0uVQGpMbKntKbJI0u72II90JbShsX6Gk
3WvwgEhhXSxytKYLB2qJaHg4YFyTOCOVUnE2w0gfscUBbDdrvY2gTurANW8l
0vrsUy1u+IQS630TijfcYew78PH9d8KkUlWp22sAMhg+zWX97g5VF9KdgrSj
202X15sOmX9tQDrqtPzT0KG26+pd9ut6+Cj8ZMvD
———–end_max5_patcher———–

Gloves Video Controller

Six of us at NYU’s ITP Camp decided to follow The Gloves Project’s patterns to build our own gloves in June. These are sensor-laden gloves that can be used to control software through hand gestures. Our group included musicians, a theatrical sound designer, a gamer, and visualists, each with different uses for the glove in mind.

To get an idea of how it can be used with video in a live setting, take a look at this test clip, where I use hand movement to wirelessly control video playback and effects.

Here, sensor values on the glove are sent via Bluetooth to a decoder patch written in Max, and then out as MIDI controller data to VDMX, VJ software. It works!

Gloves have been used as controllers in live performance for some time — see Laetitia Sonami’s Lady’s Glove, for example. Our particular design is based on one created for Imogen Heap to use as an Ableton Live controller, so she can get out from behind a computer or keyboard and closer to the audience. She gives a great explanation and demonstration in this Wired Talk (musical performance starts at 13:30).

Heap and The Gloves Project team are into sharing the artistic possibilities of this device with others, as well as increasing the transparency of a musical process that can be obscured inside a computer. This is an attitude I’ve believed in since attending MakerFaire and Blip Festival in 2009, where I saw a range of homemade controllers and instruments. I was much more engaged with the artists who made the causal process visible. It doesn’t have to be all spelled out, but in certain cases it helps to see the components: the performer is making the things happen. This is obvious with a guitar player, but not so much with electronic music. Also, you get a different creative result by moving your arms than by pressing a button — a violin is different from a piano.

The Gloves Project has a residency program where they’ll loan a pair of gloves to artists, plus DIY plans for an Open Source Hardware version. The six of us at ITP Camp built one right-hand glove each. We had to do a bit of deciphering to figure everything out, but we had a range of skills between us and got there in the end.

Each glove has six flex sensors in the fingers (the thumb and ring finger have one each; the index and middle fingers have two each, on the upper and lower knuckles), which are essentially resistors: the more they bend, the less electricity passes through. This can be measured and turned into a number. The sensors run to a tiny programmable ArduIMU+ v3 board by DIYDrones, which uses Arduino code and includes a built-in gyroscope, accelerometer, and magnetometer (a compass, if you attach a GPS unit for navigation). This board is mostly used for flying things like small self-guided airplanes, but it also works for motion capture. We make a serial connection to the computer with a wireless Bluetooth device.
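
On the board, reading one flex sensor comes down to a single analogRead call. A minimal Arduino-style sketch (the pin assignment and baud rate here are assumptions):

// Hypothetical: one flex sensor wired as a voltage divider into analog pin
// A0. Bending changes its resistance, which shifts the 10-bit reading.
const int FLEX_PIN = A0;

void setup() {
  Serial.begin(57600);  // assumed baud rate
}

void loop() {
  Serial.println(analogRead(FLEX_PIN));  // raw value, 0-1023
  delay(20);
}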

Here’s a wiring guide that we drew up.

We had more trouble with the software side of things. The Gloves Project’s design is meant to communicate with their Glover software, written in C++ by Tom Mitchell. There are instructions on the website, but we couldn’t reach anyone to actually get a copy of the program. In the end, we copied the flex sensor sections of Seb Madgwick’s ArduIMU code and used them to modify the ArduIMU v3 code. It delivered a stream of numbers, but we still had to figure out how to turn that into something we could use.

We formatted the output sensor data like this:

Serial.println("THUMB:");
Serial.println(analogRead(A0));
Serial.println("INDEXLOW:");
Serial.println(analogRead(A1));
Serial.println("INDEXUP:");
Serial.println(analogRead(A2));

…and so on. I then programmed a patch in Max to sort it out.

Details:

When one of the sensor names comes through, Max routes it to a specific switch, opens the switch, lets the next line through (the data for that sensor), and then closes the switch. Data goes where we want, and garbage is ignored.
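
In text form, that routing amounts to: remember the most recent label line, and deliver the following line to that destination. A host-side C++ sketch of the equivalent logic (the real decoder is a Max patch):

#include <iostream>
#include <map>
#include <string>

// Hypothetical stand-in for the Max routing: a label line such as "THUMB:"
// selects the destination; the next line is that sensor's value.
int main() {
    std::map<std::string, int> latest;
    std::string line, pending;
    while (std::getline(std::cin, line)) {
        if (!line.empty() && line.back() == ':') {
            pending = line.substr(0, line.size() - 1);
        } else if (!pending.empty()) {
            try { latest[pending] = std::stoi(line); } catch (...) {}  // garbage ignored
            pending.clear();
        }
    }
    return 0;
}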

Every glove and person is slightly different, so next the glove is calibrated. Max looks for the highest and lowest number coming in, and then scales that to the range of a MIDI slider: 0 to 127. When you first start the decoder, you move your hand around as much as you can and voilà! It’s set.

I made the default starting point for flex sensor data 400: the lowest readings sometimes never fell below 0, while the peak was always above 400, so seeding the range in the middle lets the incoming data pull both ends to their true values. The starting point for movement data is 0. There’s also a “slide” object that smooths the movement so it doesn’t jump all over the place, while still being fairly responsive.
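
A sketch of that auto-calibration and scaling (the 400 seed is from above; the slide smoothing is omitted):

// Hypothetical auto-calibration: widen the observed range as data arrives,
// then scale each reading into MIDI's 0-127.
struct AutoScale {
    int lo, hi;
    explicit AutoScale(int seed) : lo(seed), hi(seed) {}  // 400 for flex, 0 for motion
    int toMidi(int raw) {
        if (raw < lo) lo = raw;
        if (raw > hi) hi = raw;
        return (hi == lo) ? 0 : (raw - lo) * 127 / (hi - lo);
    }
};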

The number is now sent through a Max “send” object with a different name than the raw sensor data. If you’re keeping everything inside Max, you can just set up a corresponding “receive” object.

Otherwise, it gets turned into a MIDI control or note value, and sent out through a local MIDI device or over a network.
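
For reference, a MIDI Control Change message is just three bytes. A sketch of what gets emitted per sensor (the channel and controller numbers here are arbitrary):

#include <array>

// Hypothetical: the three bytes of a MIDI Control Change on channel 1;
// value is the calibrated 0-127 sensor reading.
std::array<unsigned char, 3> controlChange(unsigned char controller, unsigned char value) {
    return { 0xB0, controller, value };  // 0xB0 = CC status byte, channel 1
}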

Finally, I tidied everything up so it’s usable in presentation mode. Anyone can download the patch and run it in Max Runtime (free).

There are probably more efficient ways of doing this, but it’s our first pass to get things working.

To download all our code, visit https://github.com/timpear/ITP-Gloves/

Since finishing that, I discovered that The Gloves Project has released a whole range of decoders / bridges in various languages. Their ArduIMU code has lots of clever deciphering on the glove end of things, and the bridges primarily output OSC instead of MIDI, which is handy. Beyond that, The Gloves Project continues to develop new versions of the gloves, and they’re worth checking in on.

Our decoder simply translates the raw sensor data. The next step is to get it to recognize hand gestures, and trigger specific events or adjust values based on that (which is what the Glover software does). We also need to program the glove’s RGB LED and vibration motor for feedback from the computer.

I showed this project to Karl Ward (rock star, Ghost Ghost collaborator, master’s student at ITP), and it turns out that he’s currently working on an Arduino library to do a lot of this work, only more elegantly, within the controller. The first library is Filter, which he augmented over the summer to require another new library he wrote, called DataStream. He says: “They are both in usable, tested shape, but the API is still in flux. Right now I’m looking for folks who have Arduino code that does its own filtering, or needs filtering, so I can design the API to fit the most common cases out there.” We’re going to jam.

The glove has all sorts of possible artistic applications, but what else? When I showed it to my dad, he wondered if it could be used as a translator for sign language. Brilliant. It sounds like Microsoft is currently developing software for the Xbox One and new Kinect that will do this, although one advantage of a wearable controller in any case is the ability to get away from a computer (within wireless range). One of the people on our team is going to use it to adjust audio signals while installing sound in theaters. Easier than holding a tablet at the top of a ladder.

Another friend suggested that the glove as demonstrated here could be used for art therapy by people with limited movement. I imagine that something similar is in use out there, but the open-source aspect adds another level of customization and possibility, and again, transparency.

I’m looking to experiment with adjusting specific elements of a video clip with something more organic than a slider or knob, and also be able to interact more directly with a projection. I’ve worked with painter Charlie Kemmerer, creating hybrid painting-projections during Ghost Ghost shows. Charlie works on the canvas with a brush, but even standing beside him, I have to work on an iPad at best. Now I can point directly at the surface while selecting, adjusting, and repositioning clips. Or Charlie could wear it while painting to capture his movement, without it getting in the way of holding a brush.

Creative work reflects the nature of your instrument, so it’s exciting to expand the toolset and learn more about the media. Video A/B fades are pretty straightforward, but the way the IMU works isn’t nearly as predictable as a fader on a board, and I’ve gotten some unexpected results. That’s a good thing.

Even better, I can’t wait to see what other people with these gloves come up with. Tinker, modify, share.