LV Tech Hackathon – Captain Planet!



This past weekend I brought about 10 students to the 2013 Lehigh Valley Tech Hackathon. The event brings together designers, developers, and engineers with the goal of creating a “product” by the end of the weekend.  On Friday everyone was able to pitch an idea and we formed teams.  I ended up on a team with 4 of my Kutztown students (Kyle Desimone, Jerry Cavill, Kory Walton, and Trevor Dilg) and 2 friends, Ryan Hickey & Donna Chastain.  Our concept was absurd: we wanted to create an interactive application and physical objects around the theme of the brilliant 90’s cartoon Captain Planet. The gist of the cartoon is that 5 Planeteers have power rings, and when they come together they summon Captain Planet, who according to Wikipedia has the following powers:

  • Weather and climate manipulation
  • Ability to generate and control earth, fire, wind and water
  • Near invincibility
  • Mullet
  • Invisibility
  • Telepathy
  • Flight
  • Super strength

Team Captain Planet:


To make our Captain Planet concept come to life, we decided to build a series of ridiculous elements:

  • custom 3D printed rings fitted with RGB LEDs controlled by an Arduino
  • a capacitive touch ring powered by a Makey Makey
  • a servo-controlled Captain Planet head that moved up and down
  • a website that tracked the captain
  • a text message calling tree that alerts users when the captain is activated (powered by Nexmo, which is awesome by the way)
  • a Processing sketch that ties everything together

So first, the rings. They were modeled in LightWave 3D and hollowed out to leave space for the LED, which sticks out through an opening in the top.  The file was converted to an STL and printed on a 3D printer at the Northampton Fab Lab. Download the STL and print your own Captain Planet ring!



The next step in the process was to wire the RGB LEDs to an Arduino that would power and control the lights when triggered. The sketch waits for serial data (sent from Processing), parses it, and lights the appropriate LEDs with our selected colors (and blink sequences). Download the Arduino sketch.
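The parsing side of that sketch boils down to mapping a command byte from the serial port to a ring color. Here’s a minimal sketch of the idea in plain C++ (the command bytes, colors, and names below are illustrative placeholders, not the exact protocol from our Arduino sketch):

```cpp
#include <cstdint>

// One RGB color, one channel per LED leg.
struct Rgb { uint8_t r, g, b; };

// Hypothetical mapping: each Planeteer is addressed by a single
// serial byte and gets a fixed color on their ring's LED.
// 'e' = earth, 'f' = fire, 'w' = wind, 'a' = water, 'h' = heart.
// Unknown bytes turn the LED off.
Rgb colorForCommand(char cmd) {
    switch (cmd) {
        case 'e': return {0, 255, 0};      // earth: green
        case 'f': return {255, 0, 0};      // fire: red
        case 'w': return {200, 200, 200};  // wind: white
        case 'a': return {0, 0, 255};      // water: blue
        case 'h': return {255, 64, 64};    // heart: pink
        default:  return {0, 0, 0};        // off
    }
}

// On the Arduino itself, loop() would read one byte at a time and
// push the three channels out with analogWrite(), roughly:
//   if (Serial.available() > 0) {
//       Rgb c = colorForCommand(Serial.read());
//       analogWrite(R_PIN, c.r);
//       analogWrite(G_PIN, c.g);
//       analogWrite(B_PIN, c.b);
//   }
```

Keeping the mapping in one pure function like this makes it easy to test the protocol off-device before wiring anything up.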











Using some hobby servo motors, we were also able to make a cardboard Captain Planet and control his head motion, but during the 40 or 50 tests of the process we burnt out the servos, leaving a lifeless Captain Planet mascot full of broken electronics:


The final step was to bring it all together in Processing. The Makey Makey read physical input and sent it to a computer as keyboard input, so we mapped different inputs to keys and had Processing listen for those keys.  We wrapped a wire around the front of each ring so it could detect the presence of another person when touched (because humans conduct electricity):


The next piece was the Processing sketch (download it), which waited for “keyboard input” from the Makey Makey. When it received input, it triggered a video to play (corresponding to the appropriate Planeteer), sent a message to the Arduino to light the appropriate LEDs, and then waited for the next input.  When all 5 Planeteers had been activated, the full one-minute-long Captain Planet theme song & video was displayed, and all the rings pulsed random light colors.  After the video ended, all the users in the system received a text message alert and data was added to the Captain Planet Locator, which is programmed with PHP and a simple MySQL database.
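At its core, that sketch is just a small state machine over the five Planeteers. Stripped of the video and serial plumbing, the logic looks roughly like this, ported to plain C++ for illustration (the key bindings here are placeholders, not the actual keys our Makey Makey sent):

```cpp
#include <set>

// Track which of the five Planeteers have been activated by
// "key presses", and report when all five are in and Captain
// Planet should be summoned.
class PlaneteerTracker {
public:
    // Assumed bindings: 'e' earth, 'f' fire, 'w' wind,
    // 'a' water, 'h' heart.
    bool isPlaneteerKey(char key) const {
        return key == 'e' || key == 'f' || key == 'w' ||
               key == 'a' || key == 'h';
    }

    // Record an activation. Returns true when this key press
    // completes the set of five (time to play the theme song,
    // pulse the rings, and fire off the text alerts).
    bool activate(char key) {
        if (!isPlaneteerKey(key)) return false;  // ignore stray input
        active.insert(key);                      // re-pressing is harmless
        return active.size() == 5;
    }

    // Start over after the summoning video ends.
    void reset() { active.clear(); }

private:
    std::set<char> active;
};
```

In the real sketch this sits inside Processing’s keyPressed() handler; the set-based bookkeeping means holding or re-triggering the same ring never double-counts a Planeteer.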


The whole event was a ton of fun, and we were all out of our comfort zones for the whole project.  We knew from the beginning that the project was ridiculous, but we had a great time making it.  We didn’t win (which is total BS), but the winning team totally deserved it, so we weren’t bitter (great job Joe Fritz and team!).



Processing at LV Tech

I’m a little late posting this, but earlier this month I led a Processing lecture at Lehigh Valley Tech. I went over the basics of what Processing can create and some simple programming examples, including a drawing app that reacts to audio input and a generative art piece that infinitely draws circles.

I’ve posted the presentation PDF and sample files.

In related news, I am teaching a “Creative Coding” class this summer in Processing at Kutztown. The class is open to students and the community.

Tech of Kolam – Part 2

To follow up on my last post about the animated kolam piece, I wanted to include some details on the AppleScript I used to launch the movie and full-screen it.  As I mentioned before, the script is admittedly a little hacky, but it certainly got the job done.  Here’s the AppleScript, which was called in the last line of the loop in my shell script (from the last post):

tell application "QuickTime Player"
   -- kill any running QuickTime instance before reopening
   quit
end tell

delay 1

do shell script "open -F ~/Documents/kolam-drawings/final.mpg"

tell application "QuickTime Player"
   -- bring QuickTime forward so the keystrokes land on it
   activate

   delay 1

   tell application "System Events"
      keystroke "l" using {command down, option down}
      keystroke "f" using {command down, control down}
   end tell

   play document 1
end tell


So, here’s what’s going on.  We already have a new MPG generated from the shell script, so we first kill QuickTime and explicitly open the file from the command line.  The do shell script “open -F ~/Documents/kolam-drawings/final.mpg” line seemed to be the best way to open QuickTime without retaining any old preferences; specifically, the -F flag opens it fresh.  The reason this matters is that in Lion, QuickTime (all Apple apps, really) tries to remember your previous settings: if you previously had a video open, it will open it again, and if you had the video set to loop, it will set that again.  So if you run a script that opens a file and turns on looping, the looping will be toggled on and off every time.  Opening it fresh forces a clean state every time.

Once the video is open, I literally send key commands to loop and full-screen the video (the keystroke commands).  From what I can tell, the QuickTime app is fully scriptable, but I was unable to reliably access the loop and full-screen settings through its built-in scripting commands.  Finally, the video is played and loops until the shell script calls the AppleScript again, which kills the video and starts the process all over again.  If anyone knows a better way to do this, please email me; I’d be happy to post variations on this page.