Augmented Reality for 3D Printing

My roommate’s struggles designing his first 3D-printed part gave me the idea to write an augmented reality viewer that lets you preview and interact with STL models in the real world without having to commit an object to plastic. This is actually sort of an update on part of a project I did for a Computational Photography course three years ago, but not terrible looking this time. I used the ArUco library to track the fiducial markers, largely because there is a JavaScript version if I ever want to make it web-based. The program, which I uncreatively named arstl, reads in ASCII and binary STL files and displays them on top of the tracked marker. Right now, it uses a pretty basic OpenGL shader for a shiny plastic look, but I plan on making a more convincingly plastic one with bump mapping and subsurface scattering soon. As usual, the code is up on GitHub under an ISC license. The STL parsing part of it is in the public domain, in case anyone finds it useful.
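
For the curious, the shiny plastic look is basically just Blinn-Phong shading with a tight specular highlight. Here is a minimal sketch of that kind of fragment shader, with made-up uniform and varying names rather than whatever arstl actually uses:

    // Minimal "shiny plastic" Blinn-Phong fragment shader sketch.
    // Not the shader arstl ships; the uniform/varying names are invented here.
    varying vec3 v_normal;      // interpolated surface normal, eye space
    varying vec3 v_position;    // fragment position, eye space

    uniform vec3 u_light_dir;   // directional light direction, eye space
    uniform vec3 u_base_color;  // the plastic's color

    void main()
    {
        vec3 n = normalize(v_normal);
        vec3 l = normalize(u_light_dir);
        vec3 v = normalize(-v_position);   // camera sits at the origin in eye space
        vec3 h = normalize(l + v);         // Blinn-Phong half vector

        float diff = max(dot(n, l), 0.0);
        float spec = pow(max(dot(n, h), 0.0), 64.0);  // tight highlight reads as shiny plastic

        vec3 color = u_base_color * (0.1 + 0.9 * diff) + vec3(spec);
        gl_FragColor = vec4(color, 1.0);
    }

Bump mapping would perturb the normal before the lighting math, which is part of why it is the obvious next step toward a more convincing plastic.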


Science on a Snow Globe Spherical Display

Science On a Sphere is a NOAA project that typically uses four projectors to display planetary data on a six-foot-diameter sphere.  As a federal agency, NOAA publishes data that is not copyrightable.  These public domain datasets are pretty impressive, ranging from plate tectonics to solar storms.  They are also insanely high resolution, with MP4 videos and JPEG images at 2048×1024 and 4096×2048.

To shrink this four-projector, five-computer, high-resolution science center exhibit down to a picoprojector, old laptop, and bathroom lighting fixture setup, I had to move beyond my unoptimized Python scripts to SDL, OpenGL, libvlc, and GLSL.  I wrote a program called sosg, Science On a Snow Globe, which reads in images and videos and displays them in the correct format for Snow Globe.  Doing the equirectangular-to-fisheye transform in a fragment shader is extremely lightweight, even with GMA graphics.  Using libvlc makes video decoding quite performant as well, despite the resolution.
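
The fragment shader essentially runs the projection backwards: for every pixel of the fisheye output, it figures out which latitude and longitude that pixel corresponds to, then samples the equirectangular frame there. A rough sketch of the idea (not sosg’s exact shader; the uniform names and the fisheye convention here are assumptions):

    // Equirectangular-to-fisheye lookup, sketched as a fragment shader.
    // Not sosg's actual shader; names and the exact mapping are assumed.
    uniform sampler2D u_frame;   // equirectangular image or video frame

    varying vec2 v_texcoord;     // 0..1 across the square output viewport

    void main()
    {
        // Re-center the output coordinates and measure distance from the middle.
        vec2 p = v_texcoord * 2.0 - 1.0;
        float r = length(p);
        if (r > 1.0) {
            gl_FragColor = vec4(0.0);        // outside the projected circle
            return;
        }

        // Radius maps to latitude (one pole at the center of the circle,
        // the other at its edge); the angle around the center maps to longitude.
        float pi  = 3.14159265;
        float lat = r * pi;
        float lon = atan(p.y, p.x);

        vec2 uv = vec2(lon / (2.0 * pi) + 0.5, lat / pi);
        gl_FragColor = texture2D(u_frame, uv);
    }

Since this is one texture lookup and a little trigonometry per pixel, it stays cheap even on integrated graphics.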

The program is the closest I’ve written to “shippable” in recent memory, but there are some rough spots.  I ran into a bottleneck trying to decode high-resolution JPEGs in real time, so currently sosg does not support the image slideshows that the full-size SOS does.  It also doesn’t attempt to read in .sos playlist information.  Basically, it is not an SOS replacement, just a cheap and cheerful way to view planetary data on your desktop.  Unlike the original, it is also available under a permissive license and can be cloned directly from git://github.com/nrpatel/SnowGlobe.git.


Cheap 3D in OpenGL with a ChromaDepth GLSL Shader


I have probably stated in the past that I don’t do 3D.  As of a few months ago, this is no longer accurate.  Between Deep Sheep, a computer graphics course, and a computer animation course, I have grown an additional dimension.  This dimension is now bearing dimensional fruit.

After watching Coraline in 3D over spring break, I became obsessed with the possibilities of 3D.  Of course, as a student, I can’t quite afford fancy glasses or a polarized projector.  My budget is a bit closer to the free red/blue glasses one might find in a particularly excellent box of cereal.  Searching around the internet, and eBay specifically, I came across a seemingly voodoo-like 3D technology I’d never heard of: cardboard glasses with clear-looking plastic sheet lenses that can turn even hand-drawn images 3D.  This technology is ChromaDepth, which makes red objects appear to float in the air while blue objects recede into the background.  Essentially, the lenses use prisms to shift red and blue light by different amounts in each eye, so hue turns into binocular disparity and your brain perceives it as depth.  Creating a ChromaDepth image is then just a matter of coloring objects according to their distance, which is something computers are great at!

Of course, I am far from the first person to apply a computer to this.  Mike Bailey developed a cool solution in OpenGL a decade ago that maps an HSV color-strip texture onto geometry based on its depth in the scene.  The downside there is that objects can no longer have actual textures of their own.  Textures are pretty tricky with ChromaDepth, in that changing the color of an object will throw off its apparent depth.

I wrote vertex and fragment shaders in GLSL that resolve this problem.  The hue of each fragment depends only on its distance from the camera in the scene, with closer objects appearing red and continuing through the spectrum to blue for objects in the distance.  The texture, diffuse lighting, specular lighting, and material properties of the object only set the brightness of that color, giving the illusion of shading and texture while remaining ChromaDepth-safe.  My code is available below.  Sticking with the theme of picking a different license for each work, this is released under the WTFPL.  I can’t release the source of the OpenGL end of it, as it is from a school project.  It should be fairly simple to drop the shaders into anything, though.  You will need ChromaDepth glasses to see the effect, which you can get on eBay or elsewhere for under $3 each.
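
The gist of the fragment shader is short enough to sketch here.  This is a stripped-down illustration of the approach rather than the downloadable shaders below; the uniform names, depth range, and lighting terms are all placeholders:

    // Simplified ChromaDepth-style fragment shader sketch.
    // The real shaders are linked below; everything named here is a placeholder.
    uniform sampler2D u_texture;
    uniform float u_near;        // eye-space distance mapped to red
    uniform float u_far;         // eye-space distance mapped to blue

    varying vec3 v_position;     // eye-space position from the vertex shader
    varying vec2 v_texcoord;
    varying float v_lighting;    // combined diffuse/specular term from the vertex shader

    // Full-saturation hue to RGB: 0.0 is red, 1/3 is green, 2/3 is blue.
    vec3 hue_to_rgb(float h)
    {
        float r = abs(h * 6.0 - 3.0) - 1.0;
        float g = 2.0 - abs(h * 6.0 - 2.0);
        float b = 2.0 - abs(h * 6.0 - 4.0);
        return clamp(vec3(r, g, b), 0.0, 1.0);
    }

    void main()
    {
        // Hue depends only on distance from the camera: near is red, far is blue.
        float d = clamp((length(v_position) - u_near) / (u_far - u_near), 0.0, 1.0);
        vec3 hue = hue_to_rgb(d * 2.0 / 3.0);

        // Texture and lighting only scale the brightness, so the hue (and
        // therefore the perceived depth) is never thrown off.
        float value = dot(texture2D(u_texture, v_texcoord).rgb, vec3(0.299, 0.587, 0.114));
        gl_FragColor = vec4(hue * value * v_lighting, 1.0);
    }

The vertex shader just has to pass along the eye-space position plus whatever lighting terms you want folded into the brightness.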

Download:
ChromaDepth GLSL Vertex Shader
ChromaDepth GLSL Fragment Shader
