This is a very big project I am undertaking for the Crab Devil art collective in Tampa. It’s an immersive exhibit that fills a 40′×8′×8′ shipping container with rings of lights on every surface.

Check out the awesome video they posted for the Peninsularium here

The video above is an early panel prototype, just to demonstrate the sensor arrangement, the math to project sensor data onto a grid of rings, the audio configuration and capabilities, and a little logic to make it pretty.

I am currently developing the panel software on a sheet of black MDF that LiveWork CNC’ed for me to perfectly house all the components. I’ll post a video of the new prototypes soon. Just this week I made a major change to the hardware and read functions which will improve the isolation between the different sensors. At the moment I’m focused on being able to clearly read and propagate a current across a panel, and next I will be working on passing it on to the next panel. The communications protocol will take some development, but it should not be quite as surprising and elusive as getting good readings from all the sensors involved in reading a gesture.

When the software reaches the point that the panels can auto-update in a chain, we’ll start building almost a hundred of them to fill up the shipping container from floor to ceiling.

I created these props for a local production of the Little Mermaid musical.

The trident head was made by shaping a styrofoam trident form, coating it with epoxy, and dissolving the styrofoam with acetone. I shaped dense WS2812 strips, adhered to one another facing opposite directions, into each fork of the trident. The three two-sided strips terminate near the hub.

In the staff itself, there are likewise three individual strips, stuck to a steel tube à la the green plastic-coated tree stakes from a garden center. All three run from the top to the bottom, with polyfill fiber diffusing the light. A USB on-the-go power stick inside the top provides strong portable power, and an Arduino and plugs get stuffed between the trident head and the staff. It’s all self-contained.

It looks like it changes from a regular gold trident to a magically charged one that is going to cause some crazy effects! I misted the whole thing in gold mirror paint just until it was opaque in daylight or indoor light, so the LEDs still shine through brightly without needing total darkness on set. It’s essentially the same technique as a scrim or a one-way mirror.

I wrote the software/art myself, using only basic hardware libraries like NeoPixel. It’s particle-based: one effect is a rushing fire, and the other is a bubbling, colorful swirl. Both are 3D, taking advantage of the arrangement of the strips around the staff in a cylindrical way.
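
To illustrate the cylindrical particle idea (a hedged sketch, not the actual trident code – the falloff constants and function name are my assumptions), each strip covers one angle of a cylinder, and a particle lights nearby pixels with a distance falloff:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

const int kStrips = 3;  // strips glued around the steel tube

// Brightness contribution (0..255) of a particle at (height, angle)
// to the physical pixel `pixel` on strip `strip`.
uint8_t particleBrightness(float pHeight, float pAngleDeg, int strip, int pixel) {
  float stripAngle = strip * (360.0f / kStrips);
  // shortest angular distance around the cylinder, in degrees
  float dAng = std::fabs(std::fmod(pAngleDeg - stripAngle + 540.0f, 360.0f) - 180.0f);
  float dH = pHeight - (float)pixel;
  // simple radial falloff; the 30-degree and 2-pixel scales are tuning guesses
  float dist2 = (dAng / 30.0f) * (dAng / 30.0f) + (dH / 2.0f) * (dH / 2.0f);
  return (uint8_t)std::clamp(255.0f / (1.0f + dist2), 0.0f, 255.0f);
}
```

Because the angular distance wraps around, a particle moving past the "last" strip flows smoothly back onto the first, which is what sells the 3D effect.
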
The trident is controlled with a touch interface designed to look like a copper emblem at the head of the staff. It stays gold until you touch it. A quick touch cancels; leaving it alone builds and holds the effect as long as you want, until you touch it again to let the destruction sequence run until it is back to plain gold.
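
That touch behavior can be sketched as a small state machine (an illustrative reconstruction, not the real Arduino code – the 3-second build and 2-second destruction timings are assumptions):

```cpp
#include <cstdint>

enum class TridentState { Gold, Building, Holding, Destructing };

struct TouchFsm {
  TridentState state = TridentState::Gold;
  uint32_t stateStartMs = 0;
  bool wasTouched = false;

  // Call every loop with the current touch reading and millis().
  void update(bool touched, uint32_t nowMs) {
    bool touchBegan = touched && !wasTouched;
    wasTouched = touched;
    switch (state) {
      case TridentState::Gold:
        if (touchBegan) { state = TridentState::Building; stateStartMs = nowMs; }
        break;
      case TridentState::Building:
        if (touchBegan) state = TridentState::Gold;  // quick touch cancels
        else if (nowMs - stateStartMs >= 3000) state = TridentState::Holding;
        break;
      case TridentState::Holding:
        if (touchBegan) { state = TridentState::Destructing; stateStartMs = nowMs; }
        break;
      case TridentState::Destructing:
        if (nowMs - stateStartMs >= 2000) state = TridentState::Gold;
        break;
    }
  }
};
```
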
Ursula’s shell is designed to respond to music with different colors. Unfortunately the color range doesn’t show on camera quite as nicely as it does in person, but you can certainly get the idea. The shell has very powerful 10W RGB COB LEDs controlled by a Teensy 3.2 that listens to the frequency of audio from a microphone, all hidden inside the shell. It runs on four little 3V lithium camera cells that fold into a spiral plane and fit inside the shell.
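
One simple way to map a detected frequency to a color, in the spirit of the shell (the actual Teensy code uses its audio facilities; this particular mapping and its range are assumptions), is a log-scale frequency-to-hue function, so low notes read red and high notes read violet:

```cpp
#include <cmath>
#include <cstdint>

// Map the loudest detected frequency onto a hue angle, 0 (red)
// through 300 (violet), spaced logarithmically like musical pitch.
uint16_t frequencyToHueDeg(float hz) {
  const float lowHz = 80.0f, highHz = 5000.0f;  // assumed musical range
  if (hz <= lowHz) return 0;
  if (hz >= highHz) return 300;
  float t = std::log(hz / lowHz) / std::log(highHz / lowHz);  // 0..1
  return (uint16_t)(t * 300.0f);
}
```
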

For Sleepy Hollow, we needed a scary steed for the Headless Horseman to ride, for the grand finale scene of the children’s play.  The script suggests putting a child on a dressed up ladder, stationary in the corner of the stage, revealed in a flash of light.  Obviously, that was just not good enough for me.

I wanted to build a horse from a rolling platform, to be operated by one adult inside and a child on the horse’s back.  I wanted the head to be articulated to some degree to create some realism, but was willing to forgo realistic legs in favor of ghostly gliding for this show.

I watched some videos of the Handspring Puppet Company’s War Horse puppet, which I found incredibly inspiring, but also much too complicated for me to build in a month and much too difficult to be operated by me and one child.  I really liked the way the ears were articulated, and the way the War Horse combined real horse movements with artistic abstraction.  I found my own simpler compromise between the two for this piece.

I started with some simple mock-ups using Popsicle sticks and rivets to get the basic movements right.  I wanted a natural bobbing movement for the walking/galloping, an up-and-down nodding movement for whinnying, and a side-to-side motion to turn the head to look at Ichabod Crane during the chase.

horse head diagram.png

I built a skeleton from lauan, with one pipe inside another for the steering/nodding mechanism.  The initial design used a wooden dowel, but the wood split during tests, so I replaced it with an aluminum pipe.  In the final version of the mechanism, I used Chicago screws as axles for the pivot points in the lauan.

In addition to the head movements, I built an articulated ear mechanism, which was to be operated by the rider.  The ears, constructed from cardstock sandwiched between sheets of black craft felt, cut and folded into ear shape, were mounted on small cardboard tubes normally used for housing mason bees, which pivot inside of PVC pipe sleeves.  The mechanism is operated from the base of the neck by a lever that spans the neck lauan.  A little wool applied to the inside and outside in correct proportions finished the effect.

ear diagram.png

The eyes were sculpted by heating transparent Christmas ornament halves and squeezing them into eye shapes, then lightly spraypainting them black inside and bordering them with a halved corrugated plastic tube.  The deep red LEDs are mounted on large heatsinks behind those shells, and wired to a very simple knob-operated dimmer circuit.

I used paper mache techniques to construct a foundation for the nose, which I later coated with a layer of water-based spray foam, followed by black latex paint.  A couple spurts of lacquer added wetness to the nostrils and mouth.  The hair on the whole head was made from roving wool, which I applied directly to the wire and paper mache foundation in layers, using a spray adhesive, pressing and hair-pulling technique to build layers.  Following is a progression of the skinning of the horse…


We plan to re-use the foundation of this horse for our summer production of Pippi Longstocking; of course it will be re-skinned with white wool, happier and blunter teeth, and no glowing red demon eyes.  I might add some legs for fun, although that depends on how busy I am, as the summer shows have a very short timeline and the horse is not a main character.


This is a really exciting invention I came up with for Sleepy Hollow.  I wanted the pumpkin head to talk – like really talk, and I also wanted it to explode into pieces when the horseman dramatically throws it across the stage at Ichabod.

I wanted to use projection to create the animation on a destructible pumpkin, but I knew it would be impossible to perfectly line up the position of the pumpkin or even the horse with a pre-set projection location.  I came up with a rough idea that I could conceal a simple and very durable infrared LED on a battery in the pumpkin somewhere, and I could write a program that would move the pumpkin’s projected face around in the projection area based on the position of the IR LED.

I needed an infrared wavelength far enough from visible light that it wouldn’t get too much interference from stage lights and warm bodies, and a matching camera lens filter that would block out all the normal visible light to make it easier for my program to pinpoint the tracker.  I settled on a 950nm IR wavelength and bought a bag of LEDs and a filter from Amazon.  The LEDs had a forward voltage that made them a perfect fit for a single AA battery, which simplified the electronics that were going to take a beating every night.  I had 10 LEDs to work with, knowing I’d probably destroy or lose a couple in testing.

I started by prototyping the projection and tracking.  My early tests involved a different LED wavelength and camera; I ended up getting a higher speed camera and higher frequency IR setup later, but from the first test I could tell that with some tweaking my idea would work, and it gave me what I needed to start developing the software to track the LED.  I used Visual C# and a free webcam library called AForge.NET, which grabs an image from the camera every frame, and I bought the cheapest LED projector I could find that seemed to have the range I needed.  The final camera was 40fps at 800×600 resolution, a compromise between performance and low cost.  My first tests projected my own face onto a paper plate across my bedroom – my dog was not a fan of this; sorry there’s no video of that.
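
The per-frame tracking step itself is simple.  The show used C#, but the core idea can be sketched language-agnostically (here in C++; the names and threshold are assumptions): with the 950nm filter on the lens, the LED is by far the brightest thing in frame, so the centroid of all above-threshold pixels gives the tracker position:

```cpp
#include <cstddef>
#include <cstdint>

struct Point { float x, y; bool found; };

// Average the coordinates of every pixel at or above `threshold`
// in a grayscale frame to locate the IR LED.
Point findTracker(const uint8_t* gray, int width, int height,
                  uint8_t threshold) {
  long sumX = 0, sumY = 0, count = 0;
  for (int y = 0; y < height; ++y)
    for (int x = 0; x < width; ++x)
      if (gray[(size_t)y * width + x] >= threshold) {
        sumX += x; sumY += y; ++count;
      }
  if (count == 0) return {0.0f, 0.0f, false};  // LED occluded or out of frame
  return {(float)sumX / count, (float)sumY / count, true};
}
```
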

Next, I needed to figure out the pumpkin itself, because the kids were going to have to start working with it early on.  I liked the idea of using a real pumpkin, but there were safety and cleanup concerns with having a kid throw a real pumpkin across the stage.  I didn’t want to settle for a plastic pumpkin, because I wanted it to look like a real pumpkin exploding on stage, so I carved a pumpkin out of layered styrofoam, cut it into 5 jagged jigsaw-puzzle pieces, and applied the “guts” to the insides with spray foam and paint-soaked yarn.  I drove steel drywall screws into the cut edges of the pieces and superglued small neodymium magnets to one side of each screw pair so that the pieces would stick to one another magnetically but easily break apart on impact.  Finally I applied a killer paint job and brought it down to the theater for a test.

There were some obvious issues with the test.  The IR LED was very directional, so if I didn’t aim the front of the pumpkin right at the camera I’d lose the signal.  I solved this by adding a big “diffused lens” of hot glue over the LED so that the light became a bulb instead of a beam.

The positioning of the camera/projector was somewhat tricky to calibrate, so I added code to my program to make it easier to adjust the boundaries of the projection space relative to the camera boundaries.  I also needed it to be a little more forgiving about loss of signal, continuing to project to the same location slightly longer before assuming the pumpkin has been thrown, which was a simple software tweak.
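
The simple scaling calibration might look like this sketch (the real program is C#; the struct and names here are assumptions): the operator adjusts a camera-space rectangle corresponding to the full projector area, and tracked points are linearly remapped into projector coordinates:

```cpp
// Camera-space calibration rectangle mapped onto the projector output.
struct Rect { float x, y, w, h; };

// Linearly remap a tracked camera point into projector coordinates.
void camToProjector(float cx, float cy, Rect camArea,
                    float projW, float projH, float& px, float& py) {
  px = (cx - camArea.x) / camArea.w * projW;
  py = (cy - camArea.y) / camArea.h * projH;
  // The loss-of-signal grace period lives outside this function: keep
  // projecting at the last good (px, py) for a few frames before
  // assuming the pumpkin has been thrown.
}
```
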

tracker screenshot

The last step was to make a real jack-o’-lantern face that could speak the correct line on cue.  That presented a bit of a conundrum.

If I had been a 3D graphics designer I might have rendered a CGI jack-o’-lantern face speaking the words; in fact I did some experiments with Adobe’s Characterizer, and even tried drawing frames for a few different face shapes, similar to the way cartoons are made.  I wasn’t happy with any of those experiments.  I settled on applying some grease paint to my own face and using video processing effects in Adobe Premiere to mask out all but the elements that I wanted to project.

Here is the video clip of my painted face that I started with:


And here is the finished version after applying processing in Adobe Premiere:

horseman talking

I found that an animated GIF could be moved around the screen much faster than a full-blown video file, so the clip was converted to GIF format from Premiere, and the audio was separately processed in Audacity to make it much scarier:

To get the flaming video, I first inverted my face video above, to give me white eyes, nose, and mouth, and a black face, played with the brightness and contrast quite a bit to eliminate as much grayscale and human facial features as possible, masked out the area around my face and added masked brightness/contrast layers to the inside of my mouth and eyes.  Once I had the correct black and white line-art animated face, I added flames by using youtube footage of flames with “screen” opacity mixing, and a video of a flaming marshmallow with partial transparency in each eye to achieve the licking flames.

The audio was processed in Audacity with a frequency shift, reversing the sound, applying a reverb (echo), and then reversing it forwards again.  If you pause the audio player at the beginning of the word “Ichabod”, it’s very easy to unpause it when the face starts to mouth that word and enjoy the combination of video and audio.

Here’s another clip from a better position (in the audience) taken after the show:


In the future, the same tracking could be used to project anything onto anything whose position is not absolutely known; I could even do the face processing in real time rather than from an animated GIF.  If I were to ever commercialize this application, or use it in a situation with more advanced requirements, I’d tweak the software to include self-calibration and quadrilateral mapping instead of the simple scaling it uses now.  The software could be augmented to work with multiple cameras and projectors to cover a larger area or more angles.  Also, I might try to find a more powerful projector; this cheap one worked for a simple lantern face, but to do something more I’d like a more powerful one.



This was a fun prop made for Chitty Chitty Bang Bang.  Caractacus Potts invented a haircut machine to make some money at the fair so he could buy Chitty from the junkman.  The machine is supposed to malfunction and start smoking as it burns off Sid’s hair!

This machine had to be safe for a kid to wear on their head and fairly indestructible, and I had to throw it together on a very short timeline.

This machine is powered by 4 AA batteries, stepped down to two USB outputs which feed the two USB-powered personal ultrasonic misters.  The misters are designed to be set into a bottle for continuous use, so I had to seal up the filter cases to act as self-contained water reservoirs, and I hacked the misters’ push buttons into wires I could trigger with an Arduino Nano so that both misters would go off at the same time after a set delay.  The rest of the hat is just spraypaint and spare parts – a very, very simple prop.


The GCP Junior Stars production of Seussical required some really bold lighting, so a special DMX-controlled lighting rig was developed to allow six 4-foot segments of flats to be lit individually by pairs of RGB LED strips.  This could have been accomplished with commercial theatrical lighting equipment, but the budget would not allow for it, and the learning and future customization opportunities add a lot of value to the rig.  This rig went through some field repairs on a couple of occasions, so it’s a little beat up in this post-production picture.


In addition to the wall lights, a scrim technique was prototyped and developed for the show, so that children could appear and disappear in a whimsical way.  This early prototype test video with Isaac in it is pretty funny:

Besides the electronic lights, the bathtub, the “drooping” tree effect, and Gertrude’s measuring-tape-derived pop-out tail feathers were handmade for this show.  It was a lot of fun!

Doreen Horn created the gorgeous Seuss-inspired backdrop for Whoville.

This is just a small project.  It was originally intended to be part of the Seussical show, but many other things took priority, and by the time these were finished, it would have been hard on the kids to add them to the show.

Nevertheless, I am excited to have these and plan to find a good use for them, perhaps they will end up in Headless Horseman?

They are made from a pair of strong stepper motors, controlled individually by Arduino Nano microcontrollers interfaced with DMX plugs, so a light board can be used to set the speed, forward or reverse.  Each does a great job of pushing or pulling a long piece of fishing line, and if both are moved in opposite directions, an object suspended between them can be made to rise and fall in addition to moving back and forth.
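
The DMX-value-to-motion mapping could be as simple as this sketch (the exact scheme in my code may differ; treating 128 as stop and the speed scaling are assumptions): DMX sends one byte per channel, so values below center run the motor in reverse and values above run it forward, with speed proportional to the distance from center:

```cpp
#include <cstdint>
#include <cstdlib>

struct Motion { int direction; uint8_t speed; };  // direction: -1, 0, +1

// Translate one received DMX byte into a stepper direction and speed.
Motion dmxToMotion(uint8_t dmxValue) {
  int delta = (int)dmxValue - 128;
  if (delta == 0) return {0, 0};
  int speed = std::abs(delta) * 2;  // scale 1..128 up to 0..255
  if (speed > 255) speed = 255;
  return {delta > 0 ? 1 : -1, (uint8_t)speed};
}
```
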

One lesson learned here since my last experiments with motor controllers was how to get nearly silent operation out of them by altering the microcontroller’s PWM frequency.
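
The reason this works: on an ATmega328-class Arduino timer in phase-correct PWM mode, the carrier frequency is the clock divided by the prescaler times 510, so the default prescaler of 64 gives an audible ~490 Hz whine, while a prescaler of 1 pushes the carrier above hearing range.  A sketch of the math (register details vary by board and timer):

```cpp
#include <cstdint>

// PWM carrier frequency for a 16 MHz ATmega328 timer in
// phase-correct mode: f_pwm = 16 MHz / (prescaler * 510).
float pwmFrequencyHz(uint16_t prescaler) {
  return 16000000.0f / (prescaler * 510.0f);
}
```
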

These controllers have 8 DIP switches on the side that can be used to set any DMX channel between 0 and 255.  This isn’t strictly needed, since I could hard-code any channel I want in the program, but I thought it was a nice touch – a nod to commercial theatrical equipment conventions.
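
Reading the switches is just assembling a byte, one bit per switch (the pin wiring and function name are assumptions):

```cpp
#include <cstdint>

// Combine the 8 DIP switch states into a channel number,
// with switch 1 as the least significant bit.
uint16_t dipToChannel(const bool switches[8]) {
  uint16_t channel = 0;
  for (int i = 0; i < 8; ++i)
    if (switches[i]) channel |= (uint16_t)1 << i;
  return channel;
}
```
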