The exterior of the F-1 was meticulously photographed and then mapped with a structured light scanning rig, which uses a projector to paint a pattern of stripes onto the surface being scanned. Mounted beside the projector are two cameras, each recording how the pattern falls on the surface. For every exposure, the projector and cameras capture sixteen different stripe patterns.
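To illustrate how a sequence of stripe patterns turns into geometry: if the sixteen patterns form a Gray-code sequence (a common choice in structured light scanning, assumed here for illustration rather than confirmed by the scanner's maker), then thresholding each capture gives one bit per pattern, and the bits decode into the projector stripe index each camera pixel saw. A minimal NumPy sketch:

```python
import numpy as np

def decode_gray_stripes(captures, thresh=0.5):
    """Decode a stack of binary stripe captures into per-pixel stripe indices.

    captures: array of shape (n_patterns, H, W), grayscale in [0, 1],
    one image per projected pattern, coarsest stripe first.
    Returns an (H, W) integer map of the projector stripe index seen
    by each camera pixel.
    """
    bits = (np.asarray(captures) > thresh).astype(np.uint32)
    n = bits.shape[0]
    # Pack the per-pattern bits into one Gray code per pixel (MSB first).
    gray = np.zeros(bits.shape[1:], dtype=np.uint32)
    for b in bits:
        gray = (gray << 1) | b
    # Convert Gray code to a plain binary stripe index.
    binary = gray.copy()
    shift = 1
    while shift < n:
        binary ^= binary >> shift
        shift *= 2
    return binary
```

Knowing which stripe lit a pixel, plus the calibrated geometry between projector and cameras, is what lets each exposure be turned into a patch of surface points.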
The structured light rig can focus on an area ranging from 65mm across all the way out to 1.5 meters, so getting a surface scan of the entire F-1 engine required a lot of crawling around and manually aiming the scanner rig. It wasn't immediately obvious how a pile of handheld scan images could stay registered with one another—how do you tell the scanner that picture 2 connects to picture 1?
Scanning a small object with the structured light scanner. The alternating horizontal bars in the purple projection are used by the cameras to detect surface detail. Note the positional decals around the object's perimeter.
The answer was both simple and brilliant. “You notice these little targets? These little stickers?” said Shape Fidelity engineer Rob Black, who was demonstrating the equipment for me. Black indicated the small white-on-black circular dots pasted on the test material on the table in front of us. All around us were disassembled F-1 engine pieces, and I noticed that every single component was peppered with the little dots. “We stick these things on by hand, and the scanner sees these targets, so when we move from one position to the next, it can see what’s coming… and stitch everything together. That way, we don’t have to use encoders or robots.”
Each part gets tiny dots hand-applied in what is effectively a random, unique pattern. The structured light software uses this unique layout of dots to stitch together all the pictures of the object being scanned, without requiring the camera to be mounted on a motion-controlled rig.
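The stitching step Black describes can be sketched with the classic Kabsch algorithm: given the same constellation of dot centers measured from two scanner positions, the rigid rotation and translation aligning them falls out of a closed-form SVD solve. This is a generic textbook sketch, not the scanner vendor's actual software:

```python
import numpy as np

def align_scans(src, dst):
    """Rigid transform (R, t) mapping src points onto dst (Kabsch algorithm).

    src, dst: (N, 3) arrays of matched target-dot centers seen from two
    scanner positions. Returns a 3x3 rotation R and 3-vector t such that
    dst ~= src @ R.T + t.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Because the dot layout on each part is unique, matching a handful of dots between overlapping scans is enough to pin down this transform and chain every exposure into one coordinate frame.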
“We took just a regular digital camera and walked around the engine and took photographs,” Black said. “The software took all those photographs and built a 3D coordinate for each of the targets, and what you get is a very sparse data set—it’s basically the X-Y-Z value of the center of these points.”
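Building an X-Y-Z coordinate for each target from ordinary photographs is, at its core, triangulation: once a dot's pixel position is known in two calibrated views, its 3D center comes from a linear solve. A sketch using the standard direct linear transformation (DLT) method, with the camera projection matrices treated as already-known inputs (photogrammetry software also calibrates these, which is omitted here):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one target center from two views.

    P1, P2: 3x4 camera projection matrices for two photographs.
    x1, x2: (u, v) pixel coordinates of the same dot in each photo.
    Returns the X-Y-Z point minimizing the algebraic reprojection error.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]               # null-space vector of A, in homogeneous coords
    return X[:3] / X[3]      # dehomogenize to X-Y-Z
```

Running this over every dot seen in every photo yields exactly the kind of sparse data set Black describes: one 3D point per target center.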
Every Thursday is #3dthursday here at Adafruit! The DIY 3D printing community has passion and dedication for making solid objects from digital models. Recently, we have noticed electronics projects integrated with 3D printed enclosures, brackets, and sculptures, so each Thursday we celebrate and highlight these bold pioneers!
Have you considered building a 3D project around an Arduino or other microcontroller? How about printing a bracket to mount your Raspberry Pi to the back of your HD monitor? And don’t forget the countless LED projects that are possible when you are modeling your projects in 3D!
The Adafruit Learning System has dozens of great tools to get you well on your way to creating incredible works of engineering, interactive art, and design with your 3D printer! If you’ve made a cool project that combines 3D printing and electronics, be sure to let us know, and we’ll feature it here!