Maker Ant Farm: Minecraft Skin Generation with a Kinect

Since my seemingly fragile 3D printer had never left my desk before and even in prime condition could only print an object every 10 minutes or so, I decided that I needed a backup project for the Bay Area Maker Faire last month.  I conscripted Will to help me out on a purely software, Kinect-based project.  After downscoping our ideas several times as the Faire weekend approached, we eventually settled on generating Minecraft player skins of visitors.  The printer ended up working fine (and more reliably than the software-only project), but the Minecraft “Maker Ant Farm” was more of a crowd pleaser.

A visitor would stand in front of the Kinect and enter the field-goal/psi calibration pose.  We used OpenNI and NITE to find their pose and segment them out of the background for a preview display.  Using OpenCV, we mapped body parts to the corresponding sections of the Minecraft skin texture.  Since we could only see the front and parts of the sides of a person, we just made up what the back would look like based on the front.  This was of course imprecise and resulted in heads that often looked like they had massive bald spots.  Rather than trying to write some kind of intelligent texture fill algorithm on a short schedule, we just gave all of the skins yellow hard hats (not blonde hair, contrary to popular opinion).

After generating the skin, we loaded it back onto ShnitzelKiller’s player rig in Panda3D.  I had planned on writing full skeletal tracking for the rig, but ran out of time and settled on just having it follow the position and rotation of the user and perform an animated walk.  After walking around a bit watching a low-res version of themselves, the user could enter a Twitter handle or email address to keep the skin.  The blocky doppelgänger was then dropped onto a Minecraft server instance we had running, as a bot that did simple things like walk around in circles or drown.
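For the curious, the texture-mapping step mentioned above is mostly bookkeeping: crop each body part out of the segmented camera image, shrink it, and paste it into that part’s rectangle in the 64×32 classic skin texture.  Here’s a minimal sketch with OpenCV and NumPy; the skin coordinates are the standard classic layout, but the crop rectangle and file names are made up for illustration:

    import cv2
    import numpy as np

    # Classic 64x32 Minecraft skin layout: (x, y, w, h) of each front face.
    SKIN_REGIONS = {
        'head':      (8, 8, 8, 8),
        'torso':     (20, 20, 8, 12),
        'right_arm': (44, 20, 4, 12),
        'right_leg': (4, 20, 4, 12),
    }

    def paste_part(skin, part_img, region):
        """Shrink a cropped body part down into its slot in the skin."""
        x, y, w, h = region
        skin[y:y + h, x:x + w] = cv2.resize(part_img, (w, h),
                                            interpolation=cv2.INTER_AREA)

    skin = np.zeros((32, 64, 3), np.uint8)      # blank 64x32 texture
    frame = cv2.imread('segmented_user.png')    # hypothetical segmented capture
    head_crop = frame[40:140, 200:300]          # made-up joint-based crop
    paste_part(skin, head_crop, SKIN_REGIONS['head'])
    cv2.imwrite('skin.png', skin)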

Minecraft Skin
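Getting the generated skin onto a rig in Panda3D is pleasantly short.  A sketch of the idea, using a hypothetical player.egg model and walk animation in place of ShnitzelKiller’s actual rig:

    from direct.actor.Actor import Actor
    from direct.showbase.ShowBase import ShowBase

    class SkinPreview(ShowBase):
        def __init__(self):
            ShowBase.__init__(self)
            # Hypothetical model/animation files standing in for the rig.
            self.player = Actor('player.egg', {'walk': 'player-walk.egg'})
            self.player.reparentTo(self.render)
            # Override the rig's texture with the freshly generated skin.
            tex = self.loader.loadTexture('skin.png')
            self.player.setTexture(tex, 1)
            self.player.loop('walk')

    SkinPreview().run()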

Despite some crashiness in NITE and the extremely short timeframe we wrote the project in, it ended up working reasonably well.  Thanks to Minecraft’s low-resolution style and implied insistence on imagination, the players avoid looking like the ghastly zombies in Kinect Me.  You can see examples of some of the generated skins on @MakerAntFarm.  I hate not releasing code, but I almost hate releasing this code more.  It is very likely the worst I have ever hacked together, and I can’t help but suspect it will be held against me at some point.  Nonetheless, for the greater good, it’s up on GitHub.  There are vague instructions on how one might use it in the README.  Good luck, and I’m sorry.


Gestural Printing: Jumping the Shark on Kinect Hacks

We’ve seen a seemingly endless array of amazing Kinect hacks over the last few months, from superhero generators to obstacle-avoiding quadcopters.  However, it was only a matter of time before someone came up with a hack so inane and irrelevant that it would bring shame to the entire hobby.  That time is now, and that someone is me.  I bring to you: gestural 3D printing!  Using the Kinect to track your hand, you can draw one layer at a time, with the printer following your every move.  Pushing your hand forward extrudes plastic, while pulling it back starts a new layer.  Who needs difficult and confusing CAD software when you can just directly draw the object you want to print?
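If the mechanics aren’t obvious: hand X/Y drives the nozzle X/Y, pushing past a depth threshold extrudes, and pulling back past another one lifts Z by a layer.  A sketch of that little state machine follows; the thresholds, scaling, and send callback are placeholder values for illustration, not what the real script uses:

    LAYER_HEIGHT = 0.4          # mm, made-up value
    PUSH_Z, PULL_Z = 0.4, 0.8   # normalized hand-depth thresholds, made up

    class GesturePrinter:
        def __init__(self, send):
            self.send = send    # callable that writes one G-code line
            self.z = LAYER_HEIGHT
            self.e = 0.0        # cumulative extrusion
            self.pulled = False

        def on_hand(self, hx, hy, hz):
            """Map a normalized (0..1) hand position to a printer move."""
            if hz > PULL_Z:                  # pulled back: step one layer up
                if not self.pulled:
                    self.pulled = True
                    self.z += LAYER_HEIGHT
                    self.send('G1 Z%.2f F200' % self.z)
                return
            self.pulled = False
            x, y = hx * 180.0, hy * 180.0    # scale to bed coordinates
            if hz < PUSH_Z:                  # pushed forward: extrude
                self.e += 0.05               # fake extrusion per move
                self.send('G1 X%.2f Y%.2f E%.3f F1200' % (x, y, self.e))
            else:                            # neutral zone: travel move
                self.send('G1 X%.2f Y%.2f F3000' % (x, y))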

Really though, you can only get through 4 or 5 layers before your arm feels like it’s going to fall off, and the resulting object will look like a stringy blob of plastic vomit.

The source is in the FaceCube GitHub repository.  I don’t recommend actually using it, but if for some reason you want to, the dependencies are mind-bogglingly complex.  You’ll need to install OpenNI and NITE to start with; this guide at Keyboardmods is helpful.  You’ll also need my branch of OSCeleton, which improves its hand tracking.  With the Kinect hooked up, you can run ./osceleton -n -f to start hand tracking and serve the data over Open Sound Control.  You can then run the gestureprinter.py script, which requires pyOSC, pygame, and the RepRapArduinoSerialSender script from Skeinforge, which is also in the FaceCube repository.  Of course, you’ll also need both a Kinect and a 3D printer that is compatible with the G-code that RepRap firmwares use.  The script is set up for my printer specifically, but it should be straightforward to tweak for others if you dare.
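To get a feel for the data flow, here’s a stripped-down pyOSC listener.  OSCeleton serves on UDP port 7110 by default; the exact OSC address and argument layout of the hand messages depend on the branch, so treat the '/joint' address, the 'r_hand' name, and the unpacking below as assumptions to check against the source:

    from OSC import OSCServer   # pyOSC

    server = OSCServer(('127.0.0.1', 7110))   # OSCeleton's default port

    def on_joint(addr, tags, data, client):
        # Assumed layout: joint name, user id, then normalized x/y/z.
        name, user, x, y, z = data
        if name == 'r_hand':
            print('hand at %.2f %.2f %.2f' % (x, y, z))

    server.addMsgHandler('/joint', on_joint)
    server.serve_forever()   # Ctrl-C to stop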

Gestural Print
