Thingidiff: Visualizing 3D Model Diffs with Thingiview.js

Thingidiff

Thingiverse is an enormous resource for mostly open source, ready-to-print 3D objects.  It conveniently has both a built-in understanding of derivative objects and a web-based 3D object viewer.  It has no mechanism for combining the two, though, so comparing any two objects is a matter of either visual guesswork or downloading the files and comparing them against each other.

Thingidiff is my fork of Thingiview.js that allows web-based visual comparisons between related 3D objects.  Colors and opacities can be set for faces that are the same in both objects or unique to one or the other.  The obvious use cases for this are showing a diff between a derived object and its original or showing the differences between revisions of a work in progress.  Both of these cases are in evidence on the example page.  Between this being my first project in Three.js (or JavaScript at all) and the currently mercurial state of WebGL support in web browsers, there are probably going to be nits, bugs, or even outright computer-exploding failures in your experience.  I’m interested in bug reports though, if you would be willing to drop them on the github project’s Issues page.  The code itself is nearby.
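For illustration, the comparison itself boils down to canonicalizing the triangles of the two meshes and splitting them into shared and unique sets, which then get their own colors and opacities.  The sketch below is just that idea in Python with made-up data, not the actual JavaScript in the fork.

# Sketch of the face-diff idea: classify each triangle as shared or unique by
# comparing canonicalized face sets from two meshes.  Thingidiff does this on
# Thingiview.js geometry in JavaScript; this is only the concept.

def canonical(face, tol=1e-4):
    """Round vertices to a tolerance and sort them so face ordering and tiny
    floating point differences don't defeat the comparison."""
    verts = [tuple(round(c / tol) for c in v) for v in face]
    return tuple(sorted(verts))

def diff_faces(mesh_a, mesh_b):
    """Split two triangle lists into (shared, only_a, only_b)."""
    set_a = {canonical(f) for f in mesh_a}
    set_b = {canonical(f) for f in mesh_b}
    shared = [f for f in mesh_a if canonical(f) in set_b]
    only_a = [f for f in mesh_a if canonical(f) not in set_b]
    only_b = [f for f in mesh_b if canonical(f) not in set_a]
    return shared, only_a, only_b

# Example: two "meshes" that share one triangle.
tri1 = ((0, 0, 0), (1, 0, 0), (0, 1, 0))
tri2 = ((0, 0, 0), (1, 0, 0), (0, 0, 1))
tri3 = ((1, 1, 1), (2, 1, 1), (1, 2, 1))

shared, only_a, only_b = diff_faces([tri1, tri2], [tri1, tri3])
# Render shared faces in one color/opacity and the unique ones in others.
print(len(shared), len(only_a), len(only_b))  # 1 1 1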

Satellites on a Snow Globe Spherical Display

Continuing on the theme of interesting things to do with an interactive globe, I added a live satellite tracking mode to sosg.  The program polls a local PREDICT server for location and visibility information about the satellites being tracked by it.  It draws the name of each satellite and the path it is following in red.  It also draws a little icon that turns green when the satellite is visible overhead. There is enough civilization in close proximity to my apartment that I can’t actually see one pass by, but it is nice to know they are there above me.  As before, the ISC licensed code is available on github at git://github.com/nrpatel/SnowGlobe.git.
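sosg itself is C, but the polling step is simple enough to sketch: PREDICT’s server mode answers short text commands over UDP (port 1210 by default), and the tracking mode just asks it about each satellite on every update.  The satellite name below is a placeholder and the reply is left unparsed; check the PREDICT documentation for the exact response fields.

# Minimal sketch of polling a local PREDICT server over UDP.
import socket

PREDICT_ADDR = ("localhost", 1210)

def query(command, timeout=1.0):
    """Send one command to the PREDICT UDP server and return its reply."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(command.encode("ascii"), PREDICT_ADDR)
    reply, _ = sock.recvfrom(4096)
    sock.close()
    return reply.decode("ascii", "replace")

# List the satellites PREDICT is tracking, then ask about one of them.  The
# reply contains the position and visibility block used to draw the globe.
print(query("GET_LIST"))
print(query("GET_SAT ISS"))   # "ISS" is a placeholder satellite name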

Science on a Snow Globe Spherical Display

Science On a Sphere is a NOAA project typically using four projectors to display planetary data on a six foot diameter sphere.  As a federal agency, NOAA publishes data that is not copyrightable.  These public domain datasets are pretty impressive, ranging from plate tectonics to solar storms.  They are also insanely high resolution, with mp4 videos and jpg images at 2048×1024 and 4096×2048.

To shrink this four projector, five computer, high resolution science center exhibit down to a picoprojector, old laptop, bathroom lighting fixture setup, I had to move beyond my unoptimized python scripts to SDL, OpenGL, libvlc, and GLSL.  I wrote a program called sosg, Science On a Snow Globe, which reads in images and videos and displays them in the correct format for Snow Globe.  Doing the equirectangular to fisheye transform in a fragment shader is extremely lightweight, even with GMA graphics.  Using libvlc makes video decoding quite performant as well, despite the resolution.
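As a rough sketch of what the fragment shader computes per pixel (written in Python for readability): take each output pixel’s distance from the center of the projected circle, treat it as an angle through the lens, convert that to a latitude and longitude, and sample the equirectangular frame there.  This assumes an ideal equidistant fisheye with the lens imagined at the center of the sphere, ignoring the lens-offset correction from the original Snow Globe calibration, so it is a model of the math rather than the shader itself.

import numpy as np

def fisheye_lookup(out_size, src_w, src_h):
    """Return (x, y) sample coordinates into an equirectangular image for
    every pixel of a square out_size x out_size fisheye frame."""
    ys, xs = np.mgrid[0:out_size, 0:out_size]
    # Normalized offsets from the center of the projected circle.
    u = (xs - out_size / 2.0) / (out_size / 2.0)
    v = (ys - out_size / 2.0) / (out_size / 2.0)
    r = np.sqrt(u * u + v * v)            # 0 at the center, 1 at the rim
    theta = np.arctan2(v, u)              # azimuth around the pole
    lat = (1.0 - r) * (np.pi / 2.0)       # center = pole, rim = equator
    lon = theta
    # Equirectangular: longitude maps linearly to x, latitude to y.
    src_x = (lon / (2 * np.pi) + 0.5) * (src_w - 1)
    src_y = (0.5 - lat / np.pi) * (src_h - 1)
    valid = r <= 1.0                      # pixels inside the image circle
    return src_x, src_y, valid

src_x, src_y, valid = fisheye_lookup(768, 2048, 1024)
# Each output frame is then just a gather: out[y, x] = frame[src_y, src_x].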

The program is the closest I’ve written to “shippable” in recent memory, but there are some rough spots.   I ran into a bottleneck trying to decode high resolution jpgs in real time, so currently sosg does not support the image slideshows that the full size SOS does.  It also doesn’t attempt to read in .sos playlist information.  Basically, it is not an SOS replacement, just a cheap and cheerful way to view planetary data on your desktop. Unlike the original, it is also available under a permissive license and can be cloned directly from git://github.com/nrpatel/SnowGlobe.git.

RepRap Controlled Time-Lapse Photography

While capturing the time-lapse last week, John and I ran into two irritating issues.  The first is that the moving platform carries the object being printed in and out of the camera’s focal plane, which makes for a jarring video.  The second is that because the interval between photos is constant, some large, slow layers get multiple shots while several consecutive quick layers can be skipped entirely.  The solution to both of these is to have the printer dynamically trigger the camera remotely.

I wrote a Skeinforge photograph plugin that inserts a new G-code command, M240, which tells the printer to trigger a photograph.  The module offers three modes.  End of Layer, as demonstrated by Yoda above, is the simplest.  It takes one picture at the start of the first layer and then another at the end of each layer of the print, resolving only the second of the aforementioned issues.  Corner of Layer takes a picture at the minimum Y,X of each layer.  Least Change between Layers tries to take shots that are as close as possible to each other from layer to layer.  I had the most visually interesting results with the last setting, as shown in the Flower print up top.  The module can be downloaded from github, and installation instructions are included within its text.
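The End of Layer behavior is easy to picture as a plain G-code pass, shown below as a standalone sketch rather than the plugin itself: whenever a new layer starts, the previous one just ended, so emit an M240.  It assumes the Skeinforge-style “(<layer>” comments that mark the start of each layer.

def add_photographs(lines):
    out = []
    saw_layer = False
    for line in lines:
        if line.startswith("(<layer>"):
            # A new layer is starting, so the previous one just finished.
            out.append("M240 ; trigger camera")
            saw_layer = True
        out.append(line)
    if saw_layer:
        out.append("M240 ; final shot at the end of the print")
    return out

# A tiny stand-in for sliced G-code.
gcode = [
    "(<layer> 0.4 )",
    "G1 X10 Y10 E1",
    "(<layer> 0.8 )",
    "G1 X20 Y10 E2",
]
print("\n".join(add_photographs(gcode)))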

Infrared Trigger

The other half of the control scheme is triggering the camera from the RepRap.  Since I didn’t want to risk coupling my T2i directly to the printer, I went for emulating a Canon RC-1 remote, which has been thoroughly reverse engineered.  The hardware is simply an 850 nm infrared LED in series with a 180 ohm resistor connected to one of the I/O pins on the Arduino Mega.  I chose pin 23 because I could solder to it without pulling my RAMPS board off.  The software side is equally simple: I forked the excellent Sprinter firmware to respond to M240 by sending the correct pulse over the IR LED.  My fork is on github, but the diff that adds M240 support is the interesting bit.
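For anyone curious what that pulse looks like, the commonly reported reverse engineering describes two short bursts of roughly 32.7 kHz carrier, 16 cycles each, separated by a gap that encodes immediate versus delayed shutter.  The sketch below just prints that schedule using those reported numbers; they are not something I have re-measured, so verify them against the write-ups (or a scope) before trusting them.

CARRIER_HZ = 32700.0          # approximate carrier frequency
CYCLES_PER_BURST = 16
GAP_IMMEDIATE_S = 7.33e-3     # reported gap for an immediate shutter release
GAP_DELAYED_S = 5.36e-3       # reported gap for the 2 second delay mode

def rc1_schedule(delayed=False):
    """Return a list of (state, seconds) steps describing the IR output."""
    burst = [("carrier_on", CYCLES_PER_BURST / CARRIER_HZ)]
    gap = [("off", GAP_DELAYED_S if delayed else GAP_IMMEDIATE_S)]
    return burst + gap + burst

for state, seconds in rc1_schedule():
    print(f"{state:10s} {seconds * 1e6:8.1f} us")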

Time-Lapse of a RepRap Print

John visited recently and suggested that we bring another photographic production to the world: this time, a time-lapse of the RepRap printing out an interesting looking object.  After some frustrating attempts to install the Canon EOS Utility, we just used an intervalometer directly on my T2i with the Magic Lantern firmware.  In case you want to try it out and to save me a lot of Googling in the future, here are the mencoder parameters to generate a sanely sized video from high resolution stills.

mencoder -ovc lavc -lavcopts vcodec=mjpeg -mf fps=10:type=jpg -vf scale=960:720 'mf://*.JPG' -o timelapse.avi

Depending on which project gets swapped into my next free time slot, I may have another post soon exploring an extension on this that John and I discussed.

Snow Globe: Part One, Cheap DIY Spherical Projection

Earth in Snow Globe

Since reading Snow Crash, I’ve been drawn to the idea of having my own personal Earth.  Because I’m stuck in reality and the virtual version of it is always 5 years away, I’m building a physical artifact that approximates the idea: an interactive spherical display.  This is of course something that exists and can likely be found at your local science center.  The ones they use are typically 30-100″ in diameter and cost enough that they don’t have prices publicly listed.  Snow Globe is my 8″ diameter version that costs around $200 to build if you didn’t buy a Microvision SHOWWX for $600 when they launched like I did.

Lens mount

The basic design here is to shoot a picoprojector through a 180° fisheye lens into a frosted glass globe.  The projector is a SHOWWX since I already have one, but it likely works better than any of the non-laser alternatives anyway, since its laser projection avoids having to keep the image focused on the curved surface of the sphere.  Microvision also publishes some useful specs, and if you ask nicely, they’ll email you a .STL model of their projector.  The lens is an Opteka fisheye designed to be attached to handheld camcorders.  It is by far the cheapest 180° lens I could find with a large enough opening to project through.  The globe, as in my last dome based project, is for use on lighting fixtures.  This time I bought one from the local hardware store for $6 instead of taking the one in my bathroom.

I’ve had a lot of fun recently copying keys and people, but my objective in building a 3D printer was to make it easier to do projects like this one.  Designing a model in OpenSCAD, printing it, tweaking it, and repeating as necessary is much simpler than any other fabrication technique I’m capable of.  In this case, I printed a mount that attaches the lens to the correct spot in front of the projector at a 12.15° angle to center the projected image.  I also printed brackets to attach the globe to the lens/projector mount.  The whole thing is sitting on a GorillaPod until I get around to building something more permanent.

Snow Globe

Actually calibrating a projector with slight pincushion through a $25 lens into a bathroom fixture, attached together with some guesswork and a 3D printer, is well beyond my linear algebra skill, so I simplified the calibration procedure down to four terms.  For starters, we need to find the radius in pixels of the projected circle and the x and y position of its center.  The more difficult part, which tested my extremely rusty memory of trigonometry, is figuring out how to map the hemisphere coming out of the fisheye lens to the spherical display surface.  For that, we have a single number for the distance from the center of the sphere to the lens, expressed as a ratio of the projected radius.  The math is all available in the code, but the calibration script I wrote is pretty simple to use.  It uses pygame to project longitude lines and latitude color sections as in the image above.  You use the arrow keys to line up the longitude lines correctly to arrive at the x and y position, the plus and minus keys to adjust the radius until it fits the full visible area of the sphere, and 9 and 0 to adjust the lens offset until the latitudes look properly aligned.  What you end up with is close enough to correct to look good, though as you can see in the images, the projector doesn’t quite fit the lens or fill the sphere.  The script saves the calibration information in a pickle file for use elsewhere.
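To make the lens offset term a little more concrete, here is the simplified geometry in Python.  It assumes an ideal equidistant fisheye sitting on the sphere’s axis and uses a made-up dict layout and placeholder numbers for the saved pickle, so it is a model of the idea rather than the exact math in the script.

import math, pickle

def pixel_radius(alpha, offset, radius_px):
    """Pixel radius of a point at colatitude alpha (measured from the top of
    a unit sphere) as seen through a lens a distance `offset` below the
    sphere's center, for an ideal equidistant 180-degree fisheye."""
    x = math.sin(alpha)            # horizontal distance from the axis
    y = math.cos(alpha) + offset   # vertical distance from the lens
    theta = math.atan2(x, y)       # angle seen from the lens
    return radius_px * theta / (math.pi / 2.0)

# The four calibration terms, stored here in a made-up layout with
# placeholder values; the real script's pickle format may differ.
calibration = {"x": 424, "y": 240, "radius": 230, "offset": 0.4}
with open("calibration.pkl", "wb") as f:
    pickle.dump(calibration, f)

# With a positive offset the equator (alpha = pi/2) lands inside the image
# circle, leaving room to project a bit past it.
print(pixel_radius(math.pi / 2, calibration["offset"], calibration["radius"]))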

Projected sphere

Going back to the initial goal, I wrote a script to turn equirectangular projected maps of the Earth into roughly azimuthal equidistant projected images calibrated for a Snow Globe like the one above.  There are plenty of maps of the former projection available freely, like Natural Earth and Blue Marble. Written in python, the script is quite slow, but it serves as a proof of concept.  The script, along with the calibration script and the models for the 3D printed mounts are all available on github.  I’ve finally fully accepted git and no longer see a point in attaching the files to these posts themselves.  I put a Part One in the title to warn you that this blog is going to be all Snow Globe all the time for the foreseeable future.  Up next is writing a faster interface to interactively display to it in real time, and if I think of a good way to do it, touch input is coming after that.

Download from github:
git://github.com/nrpatel/SnowGlobe.git

For Sale: RepRap Parts for Bitcoins

SAE Prusa Mendel RepRap Parts

Bitcoin is exactly the kind of fantastic, real-life science fiction project that I enjoy: a peer-to-peer, anonymous, cryptographically secure currency.  I’m not even an armchair economist, but I suspect the hardest part of starting any new economy is the chicken and egg problem.  Sellers won’t join the market unless there are potential buyers, and buyers won’t join unless there are people selling things they want to buy.  Unfortunately, in the case of Bitcoin, both the chicken and the egg have been eaten by the monster called currency speculation.  It is likely that the majority of actual transactions are between speculators and exchanges, taking advantage of volatility to make a profit in BTC or USD.  Half a paragraph later, I’m still not an economist, but I also suspect that as a larger fraction of the economy goes to goods and services, the currency will stabilize, encouraging more people to use it.  Therefore, I am doing my part in bootstrapping the Bitcoin economy by using a project that loves to bootstrap.

Wade's Extruder and spare parts

I’m selling a set of SAE Prusa Mendel parts printed on the Mendel used in many of my recent projects.  The parts are from the current files in the PrusaMendel git repository, and are printed in PLA.  They are quite clean and strong, but may need a little work with a knife or drill bit.  The Wade’s Extruder and PLA bushings from the repository are also included.  But wait, there’s more!  Between getting misaligned on the trip home from Maker Faire and a torn belt, my printer was in fairly rough shape for a few weeks.  While repairing it, I printed RepRap parts to test the calibration.  I’m including the usable parts printed during that time and some more good spare parts I printed recently; this is the pile on the left in the bottom picture.  The full set of good parts from the top picture and the Wade’s Extruder are in separate bags.

I’m selling this set for the hopefully reasonable price of 5 BTC, shipped USPS Priority Mail to anywhere in the US.  At the exchange rate at this moment, that is roughly $72.50.  It could be $20 or $200 by tomorrow for all I know, but I’m willing to take the risk if you are.  Email me, and we can arrange the transaction. Sold!  There was less interest than I was hoping for, so I probably won’t be doing it again.

Maker Ant Farm: Minecraft Skin Generation with a Kinect

Since my seemingly fragile 3D printer had never left my desk before and even in prime condition could only print an object every 10 minutes or so, I decided that I needed a backup project for the Bay Area Maker Faire last month.  I conscripted Will to help me out on a purely software Kinect based project.  After downscoping our ideas several times as the Faire weekend approached, we eventually settled on generating Minecraft player skins of visitors.  The printer ended up working fine (and more reliably than the software only project), but the Minecraft “Maker Ant Farm” was more of a crowd pleaser.

A visitor would stand in front of the Kinect and enter the field goal/psi calibration pose.  We used OpenNI and NITE to find their pose and segment them out of the background for a preview display.  Using OpenCV, we mapped body parts to the corresponding sections of the Minecraft skin texture.  Since we could only see the fronts and parts of the sides of a person, we just made up what the back would look like based on the front.  This was of course imprecise and resulted in heads that often looked like they had massive bald spots.  Rather than trying to write some kind of intelligent texture fill algorithm on a short schedule, we just gave all of the skins yellow hard hats (not blonde hair, contrary to popular opinion).  After generating the skin, we loaded it back onto ShnitzelKiller’s player rig in Panda3D.  I had planned on writing full skeletal tracking for the rig, but ran out of time and settled on just having it follow the position and rotation of the user and perform an animated walk.  After walking around a bit watching a low res version of him or herself, the user could enter a Twitter handle or email address to keep the skin.  The blocky doppelgänger was then dropped onto a Minecraft server instance we had running, as a bot that did simple things like walk around in circles or drown.
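The texture mapping step is mostly cropping and resizing: each segmented region gets shrunk down to the handful of pixels the classic 64×32 skin layout reserves for that part’s front face.  The sketch below uses my reading of the layout coordinates and placeholder part names, not the actual Maker Ant Farm code, which is messier.

import numpy as np
import cv2

# (x, y, width, height) of the *front* face of each part in the skin texture.
FRONT_REGIONS = {
    "head":      (8,  8,  8,  8),
    "torso":     (20, 20, 8, 12),
    "right_arm": (44, 20, 4, 12),
    "right_leg": (4,  20, 4, 12),
}

def build_skin(part_images):
    """part_images maps part name -> BGR crop segmented from the camera."""
    skin = np.zeros((32, 64, 3), dtype=np.uint8)
    for name, (x, y, w, h) in FRONT_REGIONS.items():
        if name not in part_images:
            continue
        small = cv2.resize(part_images[name], (w, h),
                           interpolation=cv2.INTER_AREA)
        skin[y:y + h, x:x + w] = small
    return skin

# Flat gray crops stand in for real Kinect output here.
fake = {name: np.full((120, 80, 3), 128, dtype=np.uint8)
        for name in FRONT_REGIONS}
cv2.imwrite("skin.png", build_skin(fake))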

Minecraft Skin

Despite some crashiness in NITE and the extremely short timeframe we wrote the project in, it ended up working reasonably well.  Thanks to the low resolution style and implied insistence on imagination in Minecraft, the players avoid looking like the ghastly zombies in Kinect Me.  You can see examples of some of the generated skins on @MakerAntFarm.  I hate not releasing code, but I almost hate releasing this code more.  It is very likely to be the worst I have ever hacked together, and I can’t help but suspect it will be held against me at some point.  Nonetheless, for the greater good, it’s up on github.  There are vague instructions on how one might use it in the README.  Good luck, and I’m sorry.

Physical Keygen: Now for Disc Detainer Locks

ABUS Plus Disc Lock

The Physical Keygen post got reactions, but there was a common claim among many of them that it was just a gimmick because there are more practical ways of getting past basic Schlage and Kwikset pin tumbler locks.  I agree with that, and I’ll also admit that a fair number of my projects are gimmicks, or as a stretch, art.  Schuyler Towne of Open Locksport saw past the gimmick (or art) and into the possibility of printing keys for more interesting locks.

He stopped by recently with a collection of said locks, and over the period of a few hours we determined that keys for disc detainer locks were printable and created a nearly working ABUS Plus key.  He left me a cutaway lock, and over the next week, I refined the model to the point of working straight off the printer.  Despite being a higher security lock than the SC1 or KW1 pin tumblers I was working with before, the key is much easier to print accurately.  The OpenSCAD model is linked below, and like the last files, you simply edit the last line to match the code for your key.

The ABUS Plus and other disc detainer locks are much more common in Europe than the US, but we do have a pretty ubiquitous example around here.  After the Bic Pen debacle in 2004, Kryptonite switched their bicycle U-locks from tubular to disc detainer.  I designed a model off of the key from the Kryptonite Evolution I have, but as of yet, I have not successfully opened the lock with it.  The key is smaller and thinner than the ABUS Plus, causing it to flex too much to effectively turn the last few discs.  I’ve posted the file anyway, in case someone has stronger plastic or an idea to strengthen the model.

Edit: The Kryptonite key works. I tightened my X and Y belts and printed it a bit slower. Apparently some of the blobbing on the corners before was catching on the discs.

Download:
abus_plus.scad
kryptonite.scad

Physical Keygen: Duplicating House Keys on a 3D Printer

3D Printed House Key

It occurred to me recently that I had printed almost nothing actually useful on my RepRap 3D printer, aside from parts to improve on or build more RepRaps.  I am rectifying that with this project.  The goal here is to generate working house keys by inputting the key code of the lock into a parametric OpenSCAD model.  Instead of having to explain to my landlord how I ended up with a wedge of plastic jammed in my front door, I ordered a box of (well) used locks and latches from eBay to experiment on.  Luckily, the lot includes both Kwikset KW1 and Schlage SC1 locks, which are the two most commonly found in the US.  I created an SC1 model to start with, but I’ll probably make a KW1 soon.  Edit: I’ve uploaded the KW1 model now as well.

Key in Lock

Designing the key model was actually pretty straightforward.  I measured a key with a ruler and calipers and created an approximate model of it that is reasonably easy to print.  I then got pin depth specifications and parametrically differenced them out of the model.  To generate new keys, you can just edit the last line of the file and enter in the key code for your key.  If the code isn’t written on the key, you can measure the height of each bit and compare to the numbers in the Root Depth column on the aforementioned pin depth site.  Perhaps more nefariously, you could implement something like SNEAKEY to generate key codes without physically measuring the key.
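To make the “compare to the Root Depth column” step concrete, the arithmetic is just a linear ramp: depth 0 is the shallowest cut, and each higher number removes another fixed increment at a fixed spacing along the blade.  The sketch below uses commonly published Schlage numbers, but treat them as assumptions and check them against the actual depth and spacing chart before cutting or printing anything.

SHALLOWEST_ROOT = 0.335   # inches, root depth of cut number 0
DEPTH_INCREMENT = 0.015   # inches removed per depth number
FIRST_CUT = 0.231         # inches from the shoulder to the first cut center
CUT_SPACING = 0.156       # inches between cut centers

def bitting(key_code):
    """Return a list of (distance_from_shoulder, root_depth) per cut."""
    cuts = []
    for i, ch in enumerate(key_code):
        depth_number = int(ch)
        root = SHALLOWEST_ROOT - depth_number * DEPTH_INCREMENT
        cuts.append((FIRST_CUT + i * CUT_SPACING, root))
    return cuts

# The example key code from this post:
for position, depth in bitting("33172"):
    print(f"cut at {position:.3f} in, root depth {depth:.3f} in")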

You’ll of course need OpenSCAD to edit the .scad file and generate an STL to print out, unless your key just happens to be 33172 like the example STL posted below.  If it is, you can unlock the doorknob currently sitting on my desk.  As a small, precise object, this is a great test of how accurate your Skeinforge settings are.  You may need to adjust some thicknesses or the built in pin depth fudge factor to get it working properly with your printer.  The pictures above show the key being used on a disconnected lock cylinder, but I found it was also strong enough to turn a deadbolt.  If your lock needs a lot of force to turn, you may need to cut a space into the key to use a torsion wrench with it.

Download:
sc1.scad
sc1.stl
kw1.scad
kw1.stl