How do you take two different DMX sources and combine them into one signal? Until recently, the answer has been to use a piece of hardware that receives both DMX inputs and merges them into a single DMX signal, which is then sent to the fixture(s). The software engineers at ArKaos have simplified this somewhat, however, with Kling-Net. Kling-Net was created as an alternative communication protocol for pixel mapping video on an LED fixture. Because it is an Ethernet-based protocol, it is sent from an ArKaos media server over an Ethernet cable, either directly to a fixture or through a switch and out to several fixtures. This communication protocol is not the same as DMX or Art-Net, and it does not need to be merged back into the DMX stream before it reaches the fixture.
—From “Video Digerati” by Vickie Claiborne, PLSN, Sept. 2013
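The column above mentions hardware that merges two DMX sources into one signal. As a minimal sketch of what such a merger typically does, the snippet below applies HTP (highest-takes-precedence), the most common merge rule; the source names and levels are illustrative, not from the article.

```python
# Minimal sketch of an HTP (highest-takes-precedence) DMX merge.
# A DMX universe carries 512 channel levels, each 0-255.

def htp_merge(universe_a, universe_b):
    """Merge two 512-channel universes, keeping the higher level per channel."""
    if len(universe_a) != 512 or len(universe_b) != 512:
        raise ValueError("a DMX universe carries 512 channel levels")
    return [max(a, b) for a, b in zip(universe_a, universe_b)]

console = [0] * 512
console[0] = 255          # channel 1 at full from the console
backup = [0] * 512
backup[0] = 128           # channel 1 at 50% from a second source
backup[1] = 200           # channel 2 driven only by the second source

merged = htp_merge(console, backup)   # channel 1: 255, channel 2: 200
```

Some mergers instead use LTP (latest takes precedence) or per-channel priority; HTP is simply the default most two-input DMX merge boxes ship with.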
I have found that the biggest “gotcha” when working with a visualizer comes not from differences between the plot and reality, but from the fixtures themselves. Every visualizer has its own library of lighting fixtures, similar to the libraries that exist in lighting consoles. However, visualizer fixture libraries must contain much more data than just the DMX mapping, because they have to emulate the fixture’s actual output. In some cases, this can result in incorrect information that leads to programming errors. For instance, if the zoom operates differently in the real fixture than it does in the visualizer library, you will find that all your programming relating to zoom will be wrong. Every narrow zoom you saw in the visualizer could come up full-wide on the real fixtures.
—From “Feeding the Machines” by Brad Schiller, PLSN, Sept. 2013
I believe most people have a favorite color. I’m partial to a deep blue myself. I asked around. Guys seem to like blues and purples. Girls like reds and pinks. Some musical acts hate certain colors and have actually asked me to never light them in a certain color, while others only allow themselves to be lit in certain colors. But one thing I found is that nobody really digs yellow. I personally use it carefully at times. Like when emulating sunrays on a rear cyc, or stabbing fans of light in a rock show. If I am lighting a punk rock band, I may use it because it looks really horrible. But it seems like if you illuminate 99 percent of the objects in the world in yellow, they will look nastier. Perhaps there is a theater character other than SpongeBob or The Simpsons who would look good in this hue, but I have yet to see it.
—From “LD-at-Large” by Nook Schoenfeld, PLSN, Sept. 2013
From traditional “linear” TV to TiVo to Netflix to streaming media and Google Glass, the world of television is changing fast, with no signs of stopping. The big question is — what does the future hold for broadcasting? Where’s this all leading? Should we begin tearing down the broadcast towers and antennas? Honestly, my crystal ball doesn’t see that far into the future, but something is definitely stirring in the near term… The direction is summed up by Netflix, in a recently published white paper called Netflix Long Term View: “Over the coming decades and across the world, Internet TV will replace linear TV. Apps will replace channels, remote controls will disappear, and screens will proliferate. As Internet TV grows from millions to billions, Netflix, HBO and ESPN are leading the way.” Stay tuned and brace yourself, people — it’s changing fast.
—From “Video World” by Paul Berliner, PLSN, Aug. 2013
Is 3D a passing fad, or will it become a regular component of the live touring effects arsenal? If you ask ESPN, which launched its slate of 3D sports channels in 2010 with much ballyhoo, the most recent answer would be “fad”…but Primus might beg to differ. The trio embraced the concept for both legs of their 2012-2013 U.S. Green Naugahyde tour, which ended in June. Los Angeles-based 3D Live was the 3D technology provider for the shows, whose three-dimensional center screen was flanked by the 2D video screens the band has used before, in some cases with the content from the two overlapping. What made the application of 3D tricky, however, was the need for the audience to wear anaglyphic glasses to get the 3D effect. Aaron K. Craig, Primus’ lighting designer, says the band had to carry as many as 30,000 pairs of plastic glasses with them on their tour of 1,500-seat clubs and 3,000-seat theaters.
—From “The Biz” by Dan Daley, PLSN, Aug. 2013
Programming very large arrays of lighting fixtures is an enormous task for any programmer. Many productions are creating huge walls of LED products, with some tours out right now that have arrays of more than 400 three-celled fixtures! That’s more than 1,200 pixels of control, just for a single array. When preparing to program an array, you must first determine the needs of the production and then select the best tools for the job. You may find that combining technologies provides the best option, or you may select one method of control and run with it. Either way, it is important for you to understand the possibilities and always be prepared to hand-program very large arrays of lighting fixtures.
—From “Feeding the Machines” by Brad Schiller, PLSN, Aug. 2013
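To put the pixel counts above in perspective, the arithmetic below works out the DMX footprint of such an array, assuming each cell is a plain RGB pixel at three channels; real fixtures often add master intensity or control channels, which would push the totals higher.

```python
import math

# Rough channel-count math for a large LED array, assuming each
# cell is a simple RGB pixel (3 DMX channels per cell).

FIXTURES = 400
CELLS_PER_FIXTURE = 3
CHANNELS_PER_CELL = 3       # R, G, B
DMX_UNIVERSE_SIZE = 512     # channels per DMX universe

pixels = FIXTURES * CELLS_PER_FIXTURE                 # 1,200 pixels
channels = pixels * CHANNELS_PER_CELL                 # 3,600 channels
universes = math.ceil(channels / DMX_UNIVERSE_SIZE)   # 8 universes
```

Eight full universes for one array is exactly why these rigs are often driven over Art-Net or similar Ethernet transports rather than individual DMX lines.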
For Foster the People’s performance at the Firefly Festival, June 21-23 in Dover, DE, production designer Trevor Stirlin Burk collaborated with the Los Angeles Contemporary Dance company, and their dancers — not mechanized automation — turned the seven three-sided towers (periactoids) supporting lighting and video elements. Although the largest of the towers weighed more than three tons, they were attached to milled aluminum bases on ball bearings and equipped with outrigger handles so that the dancers could rotate the units. “To have the dancers control the towers was important to me. The organic imperfection of having seven performers controlling those huge towers in unison was something special,” Stirlin Burk said.
—From “Designer Watch” by Debi Moen, PLSN, Aug. 2013
With the advent of the upstage video wall at concerts, lighting systems have had to fly higher, span wider, gain massive lumens and require fleets of trucks. But this summer, for one client, we’re moving arena shows outside with a scaled-back rig. I still have three trusses of movers and a splattering of strobes. But upstage, I have a wall of voodoo. Basically, it’s four vertical rows of Elation Razors splattered with some Sharpys and strobes. I have taken my big-ass light show and hacked half the fixtures and a guy off it. But with torms from Daric Bassan at Upstaging — they’re 15 feet tall, hold 15 fixtures each, and couldn’t be more than two feet wide or they would block my video wall and scenery — the system looks large.
—Nook Schoenfeld, from “LD-at-Large,” PLSN, Aug. 2013
With the rise in popularity of electronic dance music (EDM), DJs are now touring with elaborate visual productions. The majority of these DJ acts feature visuals that are programmed using a media server but not operated using a lighting console. Instead, many visual jockeys (VJs) are looking to other types of communication protocols for the ultimate in control. One such protocol, no longer used in lighting as often as it once was, is MIDI. Combine this interface protocol with a touchscreen GUI, and you’ve got a powerful control interface with virtually unlimited configurations. Two of the more recent MIDI/touchscreen interfaces making nightly appearances at your local EDM venue are TouchOSC, developed for iPhone, iPad and iPod Touch, and SmithsonMartin’s Emulator KS-1974. While MIDI isn’t new, software developers are continually inventing new and innovative ways to use it for lighting and video.
—For more details, check out Vickie Claiborne’s “Video Digerati” column, PLSN, July, 2013
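The column above pairs MIDI with touchscreen control. As a hedged sketch of the plumbing underneath (not any particular product’s API), the snippet below decodes a standard three-byte MIDI note-on message and scales its 7-bit velocity up to the 8-bit range DMX uses, which is one common way a controller pad gets mapped to an intensity level.

```python
# Sketch of mapping a MIDI note-on to a lighting level. MIDI values
# are 7-bit (0-127); DMX levels are 8-bit (0-255), so the velocity
# must be scaled up. The mapping itself is illustrative.

def parse_note_on(message):
    """Decode a 3-byte MIDI note-on: status byte, note number, velocity."""
    status, note, velocity = message
    if status & 0xF0 != 0x90:
        raise ValueError("not a note-on message")
    channel = status & 0x0F           # MIDI channels 0-15
    return channel, note, velocity

def velocity_to_dmx(velocity):
    """Scale a 7-bit MIDI velocity to an 8-bit DMX level."""
    return round(velocity * 255 / 127)

# Middle C (note 60) at full velocity on MIDI channel 1:
ch, note, vel = parse_note_on(bytes([0x90, 60, 127]))
level = velocity_to_dmx(vel)          # 255
```

Products like the ones named in the column layer a configurable GUI on top of exactly this kind of message traffic.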
It is important for programmers to understand the various methods available to create looks and to learn the benefits of each. Then you will be able to program amazing looks that go beyond the normal pre-programmed selections built into many consoles. There’s a look I like to call “kicks,” which involves the beams from automated fixtures pointing toward the audience and continually traveling in an upward fashion. Only the upward sweeps of light are visible, with no indication of the lights moving back down to their starting position. The lights do not all do this at the same time, but rather appear to be randomly and continually moving upward. Before lighting consoles had effects, programmers would need to create a complex, multi-step chase for the upward movement of the fixtures and to reset the fixtures to their starting position. Console effects allow you to automate functions and create repeating movement. Today, consoles offer advanced effects tools. You can tell the console to apply an intensity effect that is enabled only for a specific portion of the effect duration. With three or four button presses, you can easily create what used to take many steps. Whichever method you use is ultimately up to you. I find that I often program via different methods depending upon the production’s requirements and the amount of time available.
—From “Feeding the Machines” by Brad Schiller, PLSN, July, 2013
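The “kicks” look described above has a simple structure: each fixture sweeps upward on a repeating ramp, offset by a random phase, with intensity gated on only for part of the sweep so the snap back down is never seen. The sketch below models that logic in plain code; the names and the 0.0-1.0 value ranges are illustrative, not any console’s syntax.

```python
import random

# Sketch of a "kicks"-style effect: per-fixture random phase, a
# sawtooth tilt ramp, and intensity gated off before the reset.

CYCLE = 2.0        # seconds per upward sweep
VISIBLE = 0.8      # intensity stays on for the first 80% of the sweep

random.seed(7)     # fixed seed so the random offsets are repeatable
phases = {f: random.random() for f in range(1, 13)}   # 12 fixtures

def kicks(fixture, t):
    """Return (tilt, intensity) for one fixture at time t, both 0.0-1.0."""
    pos = ((t / CYCLE) + phases[fixture]) % 1.0   # position within the sweep
    tilt = pos                                    # ramp from low to high
    intensity = 1.0 if pos < VISIBLE else 0.0     # black out before resetting
    return tilt, intensity
```

Calling `kicks(fixture, t)` every frame yields beams that appear only on the way up, staggered across the rig — the same result the column describes achieving with a console’s tilt and gated-intensity effects.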