
2 Kinects 1 Box – Amazing real time 3D depth camera work



First test of merging the 3D video streams from two Kinect cameras into a single 3D reconstruction. The cameras were placed at an angle of about 90 degrees, aimed at the same spot in 3D space.

The two cameras were calibrated internally (intrinsically) using the method described in the previous video, and externally (with respect to each other) using a flat checkerboard calibration pattern and manual measurements.
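The post does not include the calibration code itself, but the general recipe can be sketched. Below is a minimal, hypothetical Python/OpenCV version: each (already intrinsically calibrated) camera solves for the pose of the shared checkerboard, the two poses are composed into a camera-to-camera transform, and that transform is used to express one Kinect's point cloud in the other's frame. The 9×6 pattern size, 25 mm squares, and function names are illustrative assumptions, not the exact method used in the video.

```python
import cv2
import numpy as np

# Assumed checkerboard geometry; adjust to the pattern actually used.
PATTERN = (9, 6)      # inner corners per row/column
SQUARE = 0.025        # square size in metres

# 3D corner coordinates in the board's own frame.
_objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
_objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

def board_pose(gray, K, dist):
    """4x4 pose of the checkerboard in this camera's frame (T_cam<-board)."""
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        raise RuntimeError("checkerboard not visible")
    _, rvec, tvec = cv2.solvePnP(_objp, corners, K, dist)
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(rvec)
    T[:3, 3] = tvec.ravel()
    return T

def extrinsic_cam1_from_cam2(gray1, K1, dist1, gray2, K2, dist2):
    """Transform mapping points in camera 2's frame into camera 1's frame."""
    T1 = board_pose(gray1, K1, dist1)   # board -> camera 1
    T2 = board_pose(gray2, K2, dist2)   # board -> camera 2
    return T1 @ np.linalg.inv(T2)

def merge_point_clouds(points1, points2, T_1_from_2):
    """Express camera 2's 3D points in camera 1's frame and stack the clouds."""
    pts2_h = np.c_[points2, np.ones(len(points2))]      # homogeneous coords
    pts2_in_1 = (T_1_from_2 @ pts2_h.T).T[:, :3]
    return np.vstack([points1, pts2_in_1])
```

With both Kinects seeing the same board in a single frame, the composed transform is enough to bring every subsequent pair of depth frames into one shared coordinate system.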


Make a robot friend with Adafruit’s CRICKIT – A Creative Robotics & Interactive Construction Kit. It’s an add-on to our popular Circuit Playground Express, FEATHER and other platforms to make and program robots with CircuitPython, MakeCode, and Arduino. Start controlling motors, servos, and solenoids. You also get signal pins, capacitive touch sensors, a NeoPixel driver and amplified speaker output. It complements & extends your boards so you can still use all the goodies on the microcontroller, and now you have a robotics playground as well.

Join 7,500+ makers on Adafruit’s Discord channels and be part of the community! http://adafru.it/discord

CircuitPython in 2018 – Python on Microcontrollers is here!

Have an amazing project to share? Join the SHOW-AND-TELL every Wednesday night at 7:30pm ET on Google+ Hangouts.

Join us every Wednesday night at 8pm ET for Ask an Engineer!

Follow Adafruit on Instagram for top secret new products, behind the scenes and more https://www.instagram.com/adafruit/


Maker Business — Fewer startups, and other collateral damage from the 2018 tariffs

Wearables — Light as a Worbla feather

Electronics — How to make your own magnetic field probe!

Biohacking — The State of DNA Analysis in Three Mindmaps

Python for Microcontrollers — One year of CircuitPython weeklies!

Get the only spam-free daily newsletter about wearables, running a "maker business", electronic tips and more! Subscribe at AdafruitDaily.com!



2 Comments

  1. Awesome, now that there are 2 cameras, why not use 4? Separated by 90° as well.

  2. This is very interesting.

    The synthetic view is generated as seen from a camera that doesn’t actually exist: a virtual camera positioned wherever you like. The scene is built from the 3D knowledge of the environment taken from the two actual cameras.

    I would like to apply these algorithms to one-to-one videoconferencing, building a virtual camera inside the screen to solve the eye-contact problem, so that callers can look into each other’s eyes.

    Can you post a video of a person’s face as seen from the virtual cam, to show how it is rendered? It could also be interesting to change the angle between the two cams to something less than 90 degrees (maybe 45–60 degrees).

    Here is a similar experiment of mine, with an on-screen scene that changes based on the user’s face position.

    http://marco.guardigli.it/2010/01/screen-view-change-basing-on-user-face.html

    Marco ( @mgua )
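A minimal sketch of the virtual-camera rendering Marco describes above, assuming the merged point cloud is already expressed in a common world frame. The function name, pinhole model and painter-style point splatting are illustrative assumptions, not the renderer used in the video.

```python
import numpy as np

def render_virtual_view(points, colors, T_virt_from_world, K, width, height):
    """Project a coloured 3D point cloud into a virtual pinhole camera.

    points: N x 3 array in the merged (world) frame
    colors: N x 3 uint8 RGB values, one per point
    T_virt_from_world: 4x4 pose placing the virtual camera wherever you like
    K: 3x3 intrinsic matrix of the virtual camera
    """
    # Move points into the virtual camera's frame.
    pts_h = np.c_[points, np.ones(len(points))]
    cam = (T_virt_from_world @ pts_h.T).T[:, :3]
    in_front = cam[:, 2] > 0                 # keep points in front of the camera
    cam, col = cam[in_front], colors[in_front]

    # Pinhole projection to pixel coordinates.
    uv = (K @ cam.T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)

    # Painter-style ordering: draw far points first so near ones overwrite them.
    order = np.argsort(-cam[:, 2])
    image = np.zeros((height, width, 3), np.uint8)
    for (u, v), c in zip(uv[order], col[order]):
        if 0 <= u < width and 0 <= v < height:
            image[v, u] = c
    return image
```

Changing T_virt_from_world moves the viewpoint freely, so placing it "behind the screen" is essentially the eye-contact trick proposed in the comment.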
