We’ve seen a seemingly endless array of amazing Kinect hacks over the last few months, from superhero generators to obstacle-avoiding quadcopters. However, it was only a matter of time before someone came up with a hack so inane and irrelevant that it would bring shame to the entire hobby. That time is now, and that someone is me. I bring to you: gestural 3D printing! Using the Kinect to track your hand, you can draw one layer at a time, with the printer following your every move. Pushing your hand forward extrudes plastic, while pulling it back starts a new layer. Who needs difficult and confusing CAD software when you can just directly draw the object you want to print?
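The mapping from hand to printer is about as simple as it sounds. Here’s a rough sketch of the idea in Python, not the actual gestureprinter.py: hand X/Y drives the print head, and the hand’s depth decides whether to extrude, travel, or lift to the next layer. The bed size, thresholds, and feed rates below are placeholders.

```python
# Sketch of the gesture-to-G-code mapping. All numbers here are assumptions,
# not values from gestureprinter.py.

LAYER_HEIGHT = 0.4      # mm per layer (assumed)
PUSH_THRESHOLD = 0.55   # normalized hand depth; closer than this = extrude
PULL_THRESHOLD = 0.75   # farther than this = start a new layer

layer = 0
extruded = 0.0

def hand_to_gcode(hand_x, hand_y, hand_z):
    """Convert a normalized (0..1) hand position into one line of G-code."""
    global layer, extruded
    # Map the Kinect's normalized coordinates onto a hypothetical 100x100 mm bed.
    x = hand_x * 100.0
    y = hand_y * 100.0
    if hand_z < PUSH_THRESHOLD:
        # Hand pushed forward: move while extruding.
        extruded += 0.1
        return "G1 X%.2f Y%.2f E%.3f F1800" % (x, y, extruded)
    elif hand_z > PULL_THRESHOLD:
        # Hand pulled back: lift to the next layer.
        layer += 1
        return "G1 Z%.2f F600" % (layer * LAYER_HEIGHT)
    else:
        # Neutral zone: travel move without extruding.
        return "G0 X%.2f Y%.2f F3000" % (x, y)
```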
Really though, you can only get through 4 or 5 layers before your arm feels like it’s going to fall off, and the resulting object will look like a stringy blob of plastic vomit. The source is in the FaceCube GitHub repository. I don’t recommend actually using it, but if for some reason you want to, the dependencies are mind-bogglingly complex. You’ll need to install OpenNI and NITE to start with; this guide at Keyboardmods is helpful. You’ll also need my branch of OSCeleton, which improves on hand tracking. With the Kinect hooked up, you can run ./osceleton -n -f to start hand tracking and serve the data over Open Sound Control. You can then run the gestureprinter.py script, which requires pyOSC, pygame, and the RepRapArduinoSerialSender script from Skeinforge, which is also included in the FaceCube repository. Of course, you’ll also need both a Kinect and a 3D printer that understands the G-code that RepRap firmwares use. The script is set up for my printer specifically, but it should be straightforward to tweak for others if you dare.
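If you’re curious what talking to OSCeleton looks like from Python, here’s a minimal pyOSC listener sketch. It assumes OSCeleton is sending to its default localhost:7110 and that hand positions arrive as /joint-style messages (name, user id, x, y, z); the hand-tracking branch may use a different address or argument layout, so check gestureprinter.py for the real thing.

```python
# Minimal pyOSC listener for OSCeleton output. The OSC address and message
# layout below are assumptions; adapt them to whatever your OSCeleton build sends.
from OSC import OSCServer

def handle_hand(addr, tags, data, client_addr):
    name, user, x, y, z = data
    print("hand at x=%.3f y=%.3f z=%.3f" % (x, y, z))
    # gestureprinter.py would turn (x, y, z) into G-code here and hand it
    # off to RepRapArduinoSerialSender for streaming to the printer.

server = OSCServer(('127.0.0.1', 7110))   # OSCeleton's default port
server.addMsgHandler('/joint', handle_hand)
server.serve_forever()                    # Ctrl-C to stop
```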