Wednesday, March 10, 2021

Phenakistoscopes, zoetropes, and a stepper

I've been a fan of Tim Minchin ever since I first heard White Wine in the Sun. Last year, he released a music video for Leaving LA, a scathing take on his experiences in Los Angeles. It's a great song, but the music video is amazing. Animator Tee Ken Ng filmed live-action video, printed each frame, and cut out figures to place on a zoetrope. You can see the laborious process in the behind-the-scenes video:


I really liked seeing the focus on hand-cut figures and printed frames, rather than rotoscoping in After Effects or automating the process, as Tee Ken mentions. But I started thinking about how this process could be automated, or at least simplified, for middle school students.

I broke it down into steps. The first is to film the live action. For beginners, it would make sense to lock off the camera on a tripod and keep the action on a flat plane, so figures stay relatively similar in scale. This prevents scaling issues in the cutouts if figures move towards the camera, and stops parts from being cut off at the edge of the frame.

Next, extract each frame as a separate image file. Adobe Premiere can do this by exporting the video as a PNG sequence. However, we don't give students Adobe CC licenses, so an even easier route is an online animated GIF maker. ezgif.com lets you upload a video file, choose the frames per second, and "split" it into individual GIF frames. It even packages them as a simple ZIP file:


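Out of curiosity, the splitting step can also be done locally in a few lines of Python. This is a minimal sketch assuming the Pillow imaging library is installed (`pip install Pillow`); the file and folder names are just examples:

```python
import os
from PIL import Image, ImageSequence  # assumes Pillow is installed

def split_gif(gif_path, out_dir="frames"):
    """Save each frame of an animated GIF as a numbered PNG."""
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    with Image.open(gif_path) as gif:
        # ImageSequence.Iterator walks every frame of the animation
        for i, frame in enumerate(ImageSequence.Iterator(gif)):
            path = os.path.join(out_dir, f"frame_{i:03d}.png")
            frame.convert("RGBA").save(path)
            paths.append(path)
    return paths
```

The zero-padded numbering keeps the frames in order when they're sorted by filename, which matters once they're queued up for printing.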
Tee Ken printed each frame and hand-cut the outlines with a blade. However, I imagine that with the right background and defined outlines you could automate this cutting process on a Cricut or a laser cutter. I haven't tested this yet, but filming against a chroma key would allow for transparent backgrounds that simplify the outline trace. Since you just need the outside line for cutting, a batch process or action could colour the line and produce vectors ready to be stacked and organized on a cutting bed. See biological shape analysis for intriguing uses of AI in Python to further automate shape outlines.
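As a rough sketch of that chroma-key idea, the first step would be knocking the green background out to transparency before any outline tracing. This again assumes Pillow and a fairly clean green screen; the threshold value is a guess that would need tuning on real footage:

```python
from PIL import Image  # assumes Pillow is installed

def key_out_green(img, threshold=100):
    """Return an RGBA copy with green-dominant pixels made transparent."""
    rgba = img.convert("RGBA")
    keyed = [
        # a pixel counts as "green screen" when its green channel
        # dominates both red and blue by more than the threshold
        (r, g, b, 0) if g - max(r, b) > threshold else (r, g, b, a)
        for (r, g, b, a) in rgba.getdata()
    ]
    rgba.putdata(keyed)
    return rgba
```

The resulting transparent PNGs would then be ready for an outline trace in whatever vector tool feeds the cutter.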

Cutting is step three, so step four would be to mount each cutout on a zoetrope or rotating platform. What makes the persistence-of-vision effect work is that the platform or disc spins at a constant, exact speed. That speed is synced to the frame rate of the camera, with the shutter speed adjusted accordingly to reduce flicker or strobing. In a perfect environment you could build a motor that runs at one exact speed (like a record player or CD drive motor) and instruct students to only use one type of template, e.g. 48 images on one disc. In reality, a bit of variety is nice, so an adjustable motor would be ideal.
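The sync between disc speed and frame rate is simple arithmetic: a disc carrying N evenly spaced figures has to spin at F/N revolutions per second for F figures to pass the viewing point each second. A quick sketch (the 24 fps and 12-figure numbers are purely illustrative, not from my setup):

```python
def disc_rpm(frames_per_second, figures_on_disc):
    """RPM at which each of the evenly spaced figures passes a
    fixed viewing point at the target frame rate."""
    return frames_per_second / figures_on_disc * 60

# 12 figures shown at 24 fps -> 2 revolutions per second, i.e. 120 RPM
print(disc_rpm(24, 12))
```

This is why a single shared template would make a fixed-speed motor workable: every disc in the class would need the same RPM.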

I still had a NEMA 17 stepper motor (1.7 inch x 1.7 inch faceplate) from an old Printrbot 3D printer, so I made a quick setup controlled by an Arduino Uno, a 10k potentiometer, and an EasyDriver. I took a chance and powered the stepper off the Arduino's Vin pin; so far it's worked okay. The Fritzing breadboard diagram is from Schmalz Haus:


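The control logic boils down to reading the potentiometer and mapping it to a delay between step pulses. Sketched here in Python for clarity (the 200–2000 µs range is an assumption rather than my actual tuning; on the Arduino this is the job of analogRead() and map()):

```python
def pot_to_step_delay_us(pot_reading, min_us=200, max_us=2000):
    """Map a 10-bit potentiometer reading (0-1023) to the microsecond
    delay between step pulses; a higher reading means a shorter delay
    and therefore a faster spin."""
    pot_reading = max(0, min(1023, pot_reading))  # clamp to ADC range
    return max_us - (max_us - min_us) * pot_reading // 1023
```

The main loop would then toggle the EasyDriver's STEP pin once per delay interval, so turning the knob changes the disc speed smoothly.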
I'm pretty excited about the possibilities here, including having students create their own motor assemblies. The easiest would be a micro:bit with a DC or stepper controller running at a constant speed. There are lots of possibilities, even building the whole contraption out of cardboard. To be honest, I'm probably not going to continue this for a few days. Staring at the rotations too long makes me quite dizzy and gives me a headache: