“A lot of the show is dealing with addiction behavior, and that is expressed both in their addiction to the chat room and their addictions outside in the world,” explains Michael K. Maag, Oregon Shakespeare Festival’s lighting and video manager. “All of the people in the chat room are dysfunctional in one way or another, but they’re all trying to help each other. That’s where the connection comes from, so without revealing too much, it becomes about the human contact and our ultimate desires and needs as human beings and how we connect when we’re in dire straits.”
While the audience can see each character onstage, their onscreen avatar is a different image, such as an orangutan or a skyscape, so their virtual confidantes do not know what they look like or what their names are. “There’s a lot of play based upon that sort of Internet anonymity versus reality,” observes Maag. “They don’t necessarily have to be who they really are while they’re online, and it allows for an interesting interplay in the dialogue and the expectations of the characters as they’re going through, and then how those expectations get met, or not, when they get together.”
The characters’ complicated emotional interplay was matched by an interplay of traditional lighting, LEDs, projection and pixel mapping.
In portraying the small part of cyberspace that each character occupies, four video screens approximately 8 feet by 9 feet stand at the back of the stage, just upstage of a row of four white platforms that are approximately 11 feet by 11 feet. Each of those main platforms represents a character’s world. “When they’re in that world we light them up, and then when they go online, we project down a white square that encases their environments so it’s outlined in a white digital square,” explains Maag. “There’s only one place where there is a crossover and there are black platforms in between all of the squares.”
While the characters stay within and move around their square when they are online in the story, there is some crossover that transpires as they experience events in the real world. “When the main character is talking to his sister, he is able to move among things, but she stays in her square,” he explains. “But he’s able to cross in and out of it. There are some rules there, but I don’t know that they are as important as when people are online they stay in their square. There are dream characters like this character with PTSD that crosses over the environments and is free to travel on all the platforms.”
Downstage of the four platforms, there are two columns of adjacent platforms that run three rows deep, so ultimately all 10 platforms on the floor form a T shape. They increase in angle as they go upstage. “They’re all raked, and they get higher and steeper in rake as they go upstage,” says Maag. “The downstage right platform is flush to the stage floor, and then the top stage left corner, which is the highest, is two feet, nine inches off the deck. That’s before the ones that go vertical.” The whole layout is set at an angle 7° off of straight up and downstage, “so that messes with your sense of symmetry as well.”
Given that the audience is in three-quarter thrust around the platform staging, a lot of the lighting has to be done overhead to keep it as tight as possible to the squares while still allowing the actors some freedom of movement. “We had to do some light that spilled off of those platforms, but we did that as subtly as we could to make sure that they stayed in their light,” notes Maag. “Of course, we enhanced the fact that they were in their squares by doing a video projection that put a white line down on the floor that defined the platform that much more. Since the rest of the environment is black, the spill light sort of disappeared, which was good. Each square is outlined by a white video band about three or four inches wide. When a person goes online, there is a ‘bing’ and the video square irises out to the edges of the platform and creates that environment.”
At the top of the show, a scrim is placed in front of the avatar screens for different reasons, so the avatars are rear projected onto them using four Kodak Ektagraphic 35mm projectors and slides. When the scrim is removed, front projections are used while the avatars are still rear projected. There are 10 other projectors in the theatre, all Sanyo 5500s: four in the catwalks to do the floor projection; two FOH projectors to handle the upstage screens; two off-stage left and two off-stage right to do coves above the audience’s heads. The most important video image is of a waterfall that starts in the two center screens and spills down onto the center platforms until it forms a pool downstage left. It’s an image integral to the production, but it’s made up of more than just projection.
The Oregon Shakespeare Festival has a rep plot of about 240 lights, mainly ETC Source Fours, but also nine Vari*Lite VL5Bs, four ETC Source Fours on AutoYokes from City Theatrical, and a downlight system of 24 Wybron Nexeras. They use the Nexeras’ CMY color changing capabilities to easily change the color of the floor. Maag says that over and above the rep plot, each show’s lighting designer (in this case, Geoff Korf) gets about 100 fixtures to use as specials for the show. “[Designers] have a lot of tools to work with,” says Maag.
Beyond the lighting design, scenic designer Sibyl Wickersheimer conceived the idea of installing LEDs in the platforms and pixel mapping them as video to create the movement of water. “We ended up putting in 503 RGB pixels in those platforms,” says Maag. “They’re blue [for the water], but they’re RGB, so we can mix them to whatever color we want to. It was very interesting laying that out.” The pixel density isn’t distributed evenly across the set; the top upstage-right platform has only three pixels, while the downstage-left platform needed 327. The variation comes from the “pool” of water the waterfall creates as it rushes down the stage and collects in the final platform.
The crew spent time determining where to place the pixels, then drilling holes in the platforms. They installed the pixels, capped them with Plexiglas, then painted on top of them so that the floor looked smooth. “You don’t see that there are holes in the floor with pixels in them until we turn them on, which is really cool,” says Maag. “That worked out well. The painters did a great job of matching the floor treatment but also letting it be clear so that we could put our light through.”
The production was also their first opportunity to try the pixel mapping software built into the ETC Eos system as of version 2.0. The pixel map of the space totaled 40,392 pixels (132 pixels across by 306 pixels deep), but they only ended up using 503. “We had to create a huge pixel map and only populate very few squares in there, so it took a little bit of time to actually map those, give those addresses and then teach the Eos which pixels in that gigantic pixel map were the ones that needed to turn on,” recalls Maag. “The Eos has this interesting limit that pixel maps can only contain 16,304 pixels, so we had to do three pixel maps to cover the floor. We just stacked them vertically — upstage, mid-stage, and downstage — to lay our content on top of. That was really fascinating.”
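The bookkeeping Maag describes, one large virtual grid split into three vertically stacked maps to stay under the per-map limit, can be sketched in a few lines. The grid dimensions and the 16,304-pixel limit come from the article; the `locate` helper and the sample coordinates are purely illustrative, not Eos's actual data model:

```python
# Sketch of the bookkeeping for splitting one virtual 132 x 306 grid into
# three vertically stacked pixel maps. Coordinates and the helper itself
# are hypothetical; only the grid size and the per-map limit are from
# the production.

GRID_W, GRID_H = 132, 306      # full virtual grid
MAPS = 3                       # upstage, mid-stage, downstage
ROWS_PER_MAP = GRID_H // MAPS  # 102 rows each: 13,464 pixels, under 16,304

def locate(col, row):
    """Return (map_index, local_col, local_row) for a pixel in the full grid."""
    if not (0 <= col < GRID_W and 0 <= row < GRID_H):
        raise ValueError("pixel outside the virtual grid")
    return row // ROWS_PER_MAP, col, row % ROWS_PER_MAP

# A sparse layout: only the physical pixels get populated in the maps.
populated = [(10, 5), (66, 150), (120, 300)]  # hypothetical (col, row) spots
for col, row in populated:
    m, lc, lr = locate(col, row)
    print(f"pixel ({col},{row}) -> map {m}, local ({lc},{lr})")
```

In this scheme the console only ever stores the handful of populated cells; the rest of each 13,464-cell map stays empty, which matches Maag's description of populating "very few squares" in a gigantic map.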
Wickersheimer wanted them to match the waterfall being projected onto the floor with the pixels coming up from the floor. “We were trying to run the same content that was being sent through Dataton Watchout to the projectors down on the floor,” says Maag, so they brought that content into the Eos console and used the pixels to reproduce it from below.
The first thing they learned was that the high-definition image they were using in Watchout would not work well in a pixel map of only 132 by 306 pixels. They had to down-res it to a reasonable size to get it to work. “It ended up working out really well. As the waterfall is coming down, the pixels are lighting up and being bright at the right moment when the waterfall is there, creating that movement of the water coming down from below.”
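The scale gap is easy to picture: HD content is 1920 by 1080, while the map holds only 132 by 306 cells, so roughly 50 source pixels compete for every cell. A minimal nearest-neighbor down-res sketch, with a stand-in sampling function, illustrates the reduction (the real resize was presumably done in the content pipeline, not per-pixel like this):

```python
# Nearest-neighbor down-res sketch: sample a source image down to the
# 132 x 306 pixel-map grid. `sample(x, y)` stands in for reading a source
# pixel and is purely illustrative.
SRC_W, SRC_H = 1920, 1080   # HD content resolution
DST_W, DST_H = 132, 306     # pixel-map resolution

def downres(sample):
    """Build a DST_H x DST_W grid by nearest-neighbor sampling the source."""
    return [
        [sample(x * SRC_W // DST_W, y * SRC_H // DST_H) for x in range(DST_W)]
        for y in range(DST_H)
    ]

# Demonstration: the "image" just reports which source pixel was read.
small = downres(lambda x, y: (x, y))
print(len(small), len(small[0]))   # 306 rows of 132 samples each
```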
But the waterfall wasn’t created by pixel mapping alone. To initiate the waterfall effect, an actor opens a box of ashes and “throws them over the falls.” Inside the box are 64 LEDs that the team programmed, using an Arduino board, to flicker when the box is opened. “The box flickers, then the floor ripples, then the projection of the falls from above starts on the vertical screens and washes down the platforms, enhanced by the LEDs in the floor.”
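The article doesn't include the firmware, but the behavior it describes, 64 LEDs flickering when the box opens, is straightforward to model. This Python sketch of one plausible flicker loop is hypothetical (the real effect runs as Arduino code); each frame assigns every LED a random brightness biased toward dim, with occasional bright pops:

```python
import random

# Hypothetical sketch of the ash-box flicker: the production's effect runs
# as Arduino firmware driving 64 LEDs; this just models one flicker frame.
NUM_LEDS = 64

def flicker_frame(rng):
    """Return one frame of brightness values (0-255), one per LED."""
    frame = []
    for _ in range(NUM_LEDS):
        base = int(rng.betavariate(2, 5) * 255)   # mostly dim glow
        pop = rng.choice([0, 0, 0, 80])           # occasional bright pop
        frame.append(min(255, base + pop))
    return frame

rng = random.Random(42)   # seeded so the sketch is repeatable
frame = flicker_frame(rng)
print(len(frame), min(frame), max(frame))
```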
The pixels are used for other things beyond the waterfall, too. “We had a set of them that, when we were doing the waterfall, looked like they were randomly placed inside of that, but when we turned them on, lined them up and chased them, they were an airport runway,” says Maag. “So when one person’s flying to Japan, we could do the flying to Japan and landing. We could do the runway along this long map of pixels on the floor. There’s another scene where someone is taking a bath, and we could do a small little bit of it into the bathwater in the floor, just kind of rippling. It lent a whole other dimension of light to the space and allowed us to convey water in a very special way that I think matches up with the way that the play uses it.”
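The runway effect Maag describes is a classic chase: light a short window of pixels and step it along the line. A toy version, with pixel count, window size, and timing all invented for illustration:

```python
# Toy runway chase: a short lit "window" steps along a line of pixels,
# like airport approach lights. All the numbers here are illustrative,
# not the production's actual layout.
RUNWAY = 40   # hypothetical number of pixels in the runway line
WINDOW = 3    # pixels lit at once

def chase_step(t):
    """Brightness (0 or 255) for each runway pixel at step t."""
    head = t % RUNWAY
    return [255 if (head - i) % RUNWAY < WINDOW else 0 for i in range(RUNWAY)]

# Each step advances the lit window one pixel down the line (wrapping).
for t in range(3):
    lit = [i for i, b in enumerate(chase_step(t)) if b]
    print(t, lit)
```

Recorded as console cues, the same idea covers the bathwater ripple too: instead of a hard 0/255 window, a slowly drifting brightness over a small cluster of pixels.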
Given that Water By The Spoonful runs in rep with A Comedy of Errors, the crew faced the challenge of getting everything changed over in a two-and-a-half to three-hour span. “The crew is amazing,” declares Maag. “The rep plot changes between shows, so color, gobos and shutter cuts all get changed. We don’t refocus the lights in the rep plot, we just do shutter cuts because the sets change. It’s a long process, and we work closely with guys on the deck because we’re disconnecting 503 pixels in the floor. You’ve got signal, you’ve got power supplies, you have the run between each pixel, you’ve got a lot of wires in the floor and connectors that need to be made and not jumped upon by overeager guys that are ripping platforms out and putting new ones in.”
There was a lot of coordination required in terms of getting things in and out, and A Comedy of Errors was also electrics-heavy, with a lot of deck electrics built into the set, so both shows necessitated careful planning and changeovers.
In the end, Maag feels that the intimate show managed to use all of its gear in a way that was subtle and effective. “The great thing about the play, when you get done seeing it, is that you really have no idea that you’ve been in an environment that is so highly technological,” he remarks. “It’s right out there on the edge — we’ve got pixel mapping, RGB pixels on the floor, 10 projectors, slide projectors, moving lights everywhere and color changing lights everywhere. It’s pretty amazing, but it all comes out to make a beautiful show. The director [Shishir Kurup] really put together something special, and Sibyl Wickersheimer gave us this very challenging concept of getting these pixels on the floor. Geoff Korf worked very closely with her on the projections and lining up the pixel maps, so they ended up being really good teammates as far as color coordination and movement coordination. It was a big challenge for us and our relatively small video department, [which] is our production supervisor, and he had one assistant on the show. They did a great job putting it together. It was very fun.”