
The Open Kinect project – THE OK PRIZE – get $3,000 bounty for Kinect for Xbox 360 open source drivers


Hi from team Adafruit! We're doing our first ever "X Prize"-type project: hack the Kinect for Xbox 360 and claim the $2,000 bounty – NOW $3,000!

What is Kinect?

Kinect for Xbox 360, or simply Kinect (originally known by the code name Project Natal (pronounced /nəˈtɒl/ nə-tahl)), is a “controller-free gaming and entertainment experience” by Microsoft for the Xbox 360 video game platform, and may later be supported by PCs via Windows 8. Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without the need to touch a game controller through a natural user interface using gestures, spoken commands, or presented objects and images. The project is aimed at broadening the Xbox 360’s audience beyond its typical gamer base. It will compete with the Wii Remote with Wii MotionPlus and PlayStation Move motion control systems for the Wii and PlayStation 3 home consoles, respectively. Kinect is scheduled to launch worldwide starting with North America in November.

What is the hardware?

The Kinect sensor is a horizontal bar connected to a small base with a motorized pivot, and is designed to be positioned lengthwise below the video display. The device features an “RGB camera, depth sensor and multi-array microphone running proprietary software”, which provides full-body 3D motion capture, facial recognition, and voice recognition capabilities.

According to information supplied to retailers, the Kinect sensor outputs video at a frame rate of 30 Hz, with the RGB video stream at 32-bit color VGA resolution (640×480 pixels), and the monochrome video stream used for depth sensing at 16-bit QVGA resolution (320×240 pixels with 65,536 levels of sensitivity). The Kinect sensor has a practical ranging limit of 1.2–3.5 metres (3.9–11 ft) distance. The sensor has an angular field of view of 57° horizontally and 43° vertically, while the motorized pivot is capable of tilting the sensor as much as 27° either up or down. The microphone array features four microphone capsules, and operates with each channel processing 16-bit audio at a sampling rate of 16 kHz.
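For a rough sense of what those specs mean on the wire, here is a quick back-of-the-envelope sketch. It only multiplies out the published numbers – the actual USB framing, pixel packing, and any compression are unknown until the protocol is reverse engineered.

```c
/* Raw bandwidth implied by the published specs (sketch only --
 * the real wire format is unknown until it is reverse engineered). */
#include <stdio.h>

int main(void) {
    const long rgb_frame   = 640L * 480 * 4;  /* 32-bit color VGA */
    const long depth_frame = 320L * 240 * 2;  /* 16-bit QVGA depth */
    const int  fps = 30;

    printf("RGB:   %ld bytes/frame, %.1f MB/s at %d Hz\n",
           rgb_frame, rgb_frame * fps / 1e6, fps);
    printf("Depth: %ld bytes/frame, %.1f MB/s at %d Hz\n",
           depth_frame, depth_frame * fps / 1e6, fps);
    return 0;
}
```

That works out to roughly 37 MB/s of raw RGB plus about 4.6 MB/s of depth, which is at the edge of what USB 2.0 can sustain in practice – so the device probably does not send uncompressed 32-bit frames, and figuring out what it does send is part of the reverse-engineering job.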

How does it work?
Wired has a great article about it!

[Image: Canesta "how it works" depth-sensing diagram]

Sound cool? Imagine being able to use this off-the-shelf camera for Xbox on Mac, Linux, Windows, embedded systems, robotics, etc. We know Microsoft isn't developing this device for FIRST Robotics, but we could! Let's reverse engineer this together, get the RGB and distance out of it, and make cool stuff! So……

What do we (all) want?
Open source drivers for this cool USB device. The drivers and/or application can run on any operating system, but must be completely documented and under an open source license. To demonstrate the driver you must also write an application with one "window" showing video (640 × 480) and one window showing depth. Upload all of this to GitHub.
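A minimal sketch of what that demo might look like is below. The kinect_* calls are placeholder names for whatever the winning driver ends up exposing – they are not a real API, and the stubs just return blank frames so the skeleton compiles.

```c
/* Skeleton of the requested demo: one 640x480 RGB "window" and one depth
 * "window".  kinect_open/get_rgb/get_depth are hypothetical placeholders,
 * stubbed out here so this compiles; a real driver would fill the buffers
 * with live data from the device. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define RGB_W 640
#define RGB_H 480
#define DEPTH_W 320
#define DEPTH_H 240

static int  kinect_open(void)               { return 0; }
static void kinect_get_rgb(uint8_t *buf)    { memset(buf, 0, RGB_W * RGB_H * 3); }
static void kinect_get_depth(uint16_t *buf) { memset(buf, 0, DEPTH_W * DEPTH_H * 2); }

int main(void) {
    static uint8_t  rgb[RGB_W * RGB_H * 3];
    static uint16_t depth[DEPTH_W * DEPTH_H];

    if (kinect_open() != 0) {
        fprintf(stderr, "no Kinect found\n");
        return 1;
    }
    /* A real demo would hand these buffers to two display windows
     * (SDL, OpenCV, GTK, ...) and loop at 30 fps. */
    kinect_get_rgb(rgb);
    kinect_get_depth(depth);
    printf("grabbed one RGB frame (%dx%d) and one depth frame (%dx%d)\n",
           RGB_W, RGB_H, DEPTH_W, DEPTH_H);
    return 0;
}
```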

How to get the bounty ($3,000 USD)
Anyone around the world can work on this, including Microsoft 🙂 Upload your code, examples and documentation to GitHub. The first person or group to get RGB out with distance values being used wins – you're smart, you know what would be useful for the community out there. All the code needs to be open source and/or public domain. Email us a link to the repository; we and some "other" Kinect for Xbox 360 hackers will check it out – if it's good to go, you'll get the $3,000 bounty!

Update: We’ve increased it to $3,000 – why? We just read this at CNET

But Microsoft isn't taking kindly to the bounty offer. From "Bounty offered for open-source Kinect driver": "Microsoft does not condone the modification of its products," a company spokesperson told CNET. "With Kinect, Microsoft built in numerous hardware and software safeguards designed to reduce the chances of product tampering. Microsoft will continue to make advances in these types of safeguards and work closely with law enforcement and product safety groups to keep Kinect tamper-resistant."



81 Comments

  1. I love you guys for doing this! Depth cameras, sometimes called RGB-D cameras, are extremely useful for robotics. For example, work by Dieter Fox (et al.) has used these sensors to create a system that is basically a Google Street View for indoors.

    These sensors normally cost many thousands of dollars, so the Kinect will be a _big_ deal for roboticists — especially with an open API. Please ping me when you have a winner so that I can spread the word to the professional / academic robotics world.

  2. I wish you guys would make this more like an X-prize and get donations from a number of organizations instead of just doing it solo at adafruit. As a foundation you could solicit prize donations from every DIY electronics or robotics firm out there… I bet willow garage would put up some cash for example.

  3. @james – wish granted. congrats, this is now your job. please make that happen and consider the first $1k from us

  4. Why a prize? You know there are those of us out there working on this stuff for free, we have for years, and you were going to get a driver out of this no matter what. Instead of just pointing them to us and creating a community, now we're going to have to worry about people not distributing whatever USB traces they have (since this is probably gonna be easier with a USB analyzer) so they can net the cash quicker than everyone else, which means this is not going to be a community effort.

    This isn’t a case of everyone working together but still having a bounty like the DS Wifi hack, because it hasn’t been long enough to even gain that sort of attention.

    Anyways, I’m going to put my lack of money where my mouth is. All information I obtain, when I obtain it, will be at:

    http://www.github.com/qdot/libkinect

  5. @James: If you do this, I’ll contribute another $50 to the prize fund.

  6. God damn. You’re my hero, LadyAda.

    Can’t wait for my own army of quadcopters with 3d sensing….

  7. wouldn’t it be great to have a donation pool where others can add their money too, so the bounty can grow further?

  8. @DooMMasteR – that may happen. so far today it's doubled, that's pretty good

  9. Awesome, I'll spread this to all the hackerspaces. It'd be great if we could all work together and crack this. 🙂

  10. Very nice, thanks guys – this will be a game changer (or at least change it from being a game). What would we do with an NUI that can also calculate depth perception? I look forward to the driver and an API so we can all find out.

  11. If you make some kind of open pot, I will put some money in it.

  12. Drivers aside – it seems like the most obvious hack for this will be a movement/sound sensitive security camera. Maybe I’ll pick it up tomorrow and get to work on it… $3000?

  13. Please. Take the money, buy a USB analyzer, analyze along with recording input via another camera, and hand us some dumps to work with. This is NOT GOING TO HAPPEN without hardware analyzers; they are expensive (I'm probably going to have to rent one if someone else doesn't post the needed data), and if you don't help provide the information, why would anyone share with that much cash on the line? Stop throwing money at the situation and start helping the engineering. I /know/ you guys have access to this stuff.

  14. what about using kickstarter to fund this?

  15. @ted – that's an option that can be used later if it's needed…

  16. You guys are Awesome!!!! This is exactly why I love open source hardware.

  17. Encouraging upload of free software to non-free hosting sites is bad. Maybe you should read this:

    http://mako.cc/writing/hill-free_tools.html

  18. I don’t even own a 360 but if I can play with this thing on my pc or android I’ll get one. I wonder how good it could be at 3d scanning.

  19. I still refuse to buy any closed source console.

  20. I'd donate money to the prize amount. I'm sure a lot of others would too. Why don't you set up a donation jar?

  21. It's going to see sales take off like a rocket, then find out that no one uses it with the Xbox.

  22. I, for one, am pumped at the ability to browse the web with hand gestures in the near future. I expect full drivers by tomorrow morning with this capability built in.

  23. Awesome, guys. Ever since MS started talking about Natal, I thought forget video games, I want one of these things for my robot!

  24. Awesome. Where do I donate to the bounty?

  25. it confuses the heck out of me why folks would try to muddy up a piece of hardware just to make it unhackable. antifeatures at work, geez. all it will mean is more folks buying this gizmo, which should only benefit the manufacturer – don't they like money? It's not like you'd return a hacked gizmo if it got broken. good luck to all who try to get this thing accessible for FLOSS folks!

  26. +1 to kickstarter. Make a kickstarter project to raise a prize bounty, and you’ll get serious moola I think.

  27. I kinda agree with Kyle here. If you are interested in an open source driver for it, you could consider sponsoring well-known developers by donating hardware (USB analyzers or the Kinect toy itself) and setting up an information-sharing playground; a wiki on openkinect.org would be OK if there is not one already.

    I helped write the Linux driver for the PlayStation 3 Eye, and I did that only because I happened to have the camera already and people had published their USB traces somewhere.

    My 2c,
    Antonio

  28. I love it – the mouse jerking (hard) on the behemoth's tail. And it got their attention, attaboy ada!!!!!!!!

    you are my hero…

  29. Another vote here for crowd sourced funding. Microsoft’s attitude is clearly in need of correction.

  30. Sounds intriguing. This might be a worthwhile excuse and cause to actually purchase one of these things.

  31. i'm digging my gear out now. 2k is a bit low if you look at the amount of time that needs to be put in. i wonder what m$ implemented to keep it locked down. for whatever it's worth, i'll submit a few dumps; since i'm a college student, i don't think i can place 2 grand in front of school work, but i'll submit whatever i get to help the cause.

  32. linking to this for ya…good luck

  33. Any word on a donation site? I'm no hacker but this is something I will get behind monetarily.

  34. $2k isn’t nearly what is needed to crack this. I don’t think you guys realize who Microsoft works with on securing these.

  35. Donations should be taken to add to the prize money. I don’t know how much publicity this kind of thing will get but if 2000 people each donate one measly dollar then the prize is doubled! I’d gladly throw some money in the pot if I could. I want to see this completed but I’m no programmer.

  36. I don’t think you guys understand. The $2000 isn’t a salary, it’s a prize. Be the first to do it and you’ll get $2K. It’s meant to be an incentive for those who were thinking about doing this, not a wage to get people to quit their day job.

    If you considered hacking Kinect, here’s an incentive to get on it. May the best person/team win.

  37. You are all going to be very disappointed. Everything clever the ‘kinect’ does, it does with proprietary software on the xbox360. There is no depth camera, just an ordinary monochrome webcam reading a (fixed) infra-red pattern projected on the scene. The main purpose of this camera is to rapidly identify a silhouette of a human. Movement is largely identified by using statistical motion vector software (a technique used for years by previous console webcams). The monochrome camera allows the software in the xbox to ‘know’ which small part of the colour camera image to process when searching for motion vectors.

    The projected IR pattern allows crude identification of z-depth motion, but the resolution of the mono-camera should be a clue as to how limited this data is.

    The skeletal tracking is largely an AI system on the xbox360, and is a statistical assumptive algorithm, rather than any absolute measurement. Research into calculating limb position from simple body outline images pre-dates the xbox console by a very long time.

    Interestingly, several years before the kinect, MS gave massive promotion to the Codemasters game ‘You’re in the movies’. This game, using only the standard webcam, identified the silhouettes of the players in front of the camera with a high degree of accuracy, allowing ‘green-screening’ techniques to do background substitution without the need for a green screen. This shows that much of what the kinect does in hardware was already redundant using modern image processing algorithms.

    However, the kinect obviously makes such methods far more robust, at the cost of mechanical complexity (and a big spend by the consumer).

    Had MS been serious about general reading of z-depth, it would have deployed a 3 camera system, with the left and rightmost cameras being discrete and individually positionable, like speakers. These 2 would provide a ‘stereo pair’ of images that would allow depth to be identified at each pixel position by using perspective variation algorithms that could be simply accelerated in hardware. However, setting up 3 discrete boxes would have been a pain for most consumers under 10, or over 20, and lost MS most of its intended audience.

    As an aside, many of us know that the company MS allied with to do the ‘depth’ system was supposedly just about to launch a cheap z-depth camera for the PC at the time. However, I’m sure that camera was intended for objects way closer than 7 feet away, where the sharpness of the projected IR grid pattern would have returned much better information. Can the kinect work with objects placed much closer? Probably not if one wishes to use the colour camera as well. Then there is the focus issue of the optics, and the ‘sharpness’ of the IR pattern.

    The bottom line is that the kinect is not like the wii-remotes and sony-moves giving us access to remarkably cheap and robust combinations of gyroscopes, accelerometers, and cameras. For visual processing, kinect is 90%+ a software system on the xbox360 side. Kinect is a bunch of simple hardware choices designed to assist the software. And what does that software drive? Largely a load of silly, imprecise casual games for people that can’t even bring themselves to take gaming seriously. That should inform you about the likely engineering choices, and their usefulness in other areas.

    Believe me, motion tracking studios won’t be replacing their multi camera setups, and body-suits, with anything like the tech in the kinect.

    PS like everyone here, I hope the kinect is turned into another ‘open’ USB device for everyone to exploit, as soon as possible. It is just that for visual image processing, it is already cheaper to buy multiple high quality USB cameras, and infra-red LEDs, and experiment with readily available open-source software. This just was not true of the tech in the wii-remote.

  38. At least the calculation of the depth map will happen on the device itself. I think the PrimeSense chip (see iFixit teardown (1)) calculates the depth map – what else would it be good for?! This calculation can be cumbersome and take quite some processing time. Furthermore, the CPU load on the xbox is low during kinect usage, which also indicates that some more computer vision is implemented in the hardware part (2). Additionally, the depth calculation using projected IR light is more robust than just using a pair of stereo cameras. A longer baseline (distance between the cameras of a stereo pair), as suggested by mark, would increase the accuracy of such a system, but at the same time create a lot of mismatches in the calculation of the depth map, which would reduce its quality. Commercially available stereo cameras (e.g. (3)) usually use a much shorter baseline and cost quite some dollars/euros (much more than 150€).

    In the end i want to say: It is definitely worth having a look at the kinect hardware. I hope someone really hacks that thing. Of course I would spend some bucks on that 🙂

    (1) http://www.ifixit.com/Teardown/Microsoft-Kinect-Teardown/4066/
    (2) http://hothardware.com/News/Microsoft-Kinect-CPU-Usage-Down-to-Single-Digits-/
    (3) http://www.ptgrey.com/products/bumblebee2/index.asp

    ps: i love the resistor test on the bottom 🙂
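    To put the baseline trade-off in numbers: for a stereo (or projector/camera) pair the usual relation is Z = f · B / d, with f the focal length in pixels, B the baseline and d the disparity in pixels. The sketch below uses purely illustrative values, not measured Kinect parameters.

    ```c
    /* Illustrative numbers only -- not measured Kinect parameters.
     * Depth from a stereo / projector-camera pair: Z = f * B / d. */
    #include <stdio.h>

    int main(void) {
        const double f = 580.0;  /* focal length in pixels (assumed) */
        const double B = 0.075;  /* baseline in metres (assumed) */

        for (double d = 10.0; d <= 40.0; d += 10.0)  /* disparity in pixels */
            printf("disparity %4.1f px -> depth %.2f m\n", d, f * B / d);

        /* A longer baseline B spreads the same depth range over more pixels
         * of disparity (better precision), but makes matching between the
         * two views harder -- the trade-off described above. */
        return 0;
    }
    ```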

  39. I created USB drivers and have the Xbox NUI Motor, Xbox NUI Camera, and Xbox NUI Audio devices connected to my PC. I can communicate with them for standard USB protocol stuff (get descriptors, enumerate interfaces, etc.) and can see the pipes that I need to get data on. The Kinect itself just shows a flashing green light, which means it is waiting for some commands to start going.

    I need the USB protocol dumps in order to make further progress, but don’t have access to a USB protocol analyzer. If anyone has one please share it. I’ve tried brute-forcing a few likely commands but most of the incorrect commands cause device resets. Contact me on twitter if you have info: @joshblake.
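    For anyone who wants to reproduce the enumeration step described above, a minimal libusb-1.0 sketch looks roughly like this. 0x045E is Microsoft's USB vendor ID; the Kinect-specific product IDs aren't hard-coded here because they're unverified – check what your own unit reports.

    ```c
    /* Minimal libusb-1.0 sketch: list Microsoft-vendor devices on the bus.
     * Build with: gcc scan.c -o scan -lusb-1.0 */
    #include <stdio.h>
    #include <libusb-1.0/libusb.h>

    int main(void) {
        libusb_context *ctx = NULL;
        libusb_device **devs;

        if (libusb_init(&ctx) < 0)
            return 1;

        ssize_t n = libusb_get_device_list(ctx, &devs);
        for (ssize_t i = 0; i < n; i++) {
            struct libusb_device_descriptor d;
            if (libusb_get_device_descriptor(devs[i], &d) < 0)
                continue;
            if (d.idVendor == 0x045E)  /* Microsoft */
                printf("found 045E:%04X -- possible Kinect device\n",
                       (unsigned)d.idProduct);
        }

        libusb_free_device_list(devs, 1);
        libusb_exit(ctx);
        return 0;
    }
    ```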

  40. If you set up donations I’ll donate $100 to the bounty. Fck Microsoft.

  41. Looks like folks would benefit from looking here:

    http://www.primesense.com/?p=535

    Seems PrimeSense is providing the base reference design to MS for the RGBD technology.

  42. How about spending this money (you’ll need way more) to create your own 3D sensor? Should be easier than reverse engineering the most complicated vision tech out there… I don’t think you can move far just by cracking the camera. Ask MS for Windows drivers 🙂

  43. @mark what’s that big old DSP doing there then?

  44. The PrimeSense IC connects to the host via built in USB. Its functions include adding a depth overlay to a colour image using a second IR camera and IR projector and presumably correcting that overlay for parallax error caused by using separate cameras. The promotional material on their website suggests it also does skeletal mapping using “highly parallel computational logic”, allowing the host system to do gesture recognition.

    The PrimeSense reference design is USB powered so must fit within the USB power budget. Natal/Kinect has a USB hub and a cooling fan so there’s obviously a lot more going on in there.

    The PrimeSense chip has only two microphone inputs but Kinect has four microphones. Perhaps they are used to steer the camera towards the player using DSP which is happening on the host system.

  45. I’d like to add $100 to the prize fund.

  46. I wholly support this project. I find this to be a great issue to stand up against the manufacturers of proprietary hardware. It’s almost awesome enough to run around shouting “Hack the Planet” over.

  47. If you want to open this up to other people throwing in the pot I would be happy to pitch in. Kinect would be fun to play with outside of the Xbox.

  48. From what I’ve researched, it works in a really simple way. Have a look here: http://www.youtube.com/watch?v=nvvQJxgykcU
    The projector projects tiny dots, and the sensor measures the dots’ size. That explains the low 320×240 resolution. The sensor’s native resolution is probably a lot higher.
    Anyway, that doesn’t really help with anything as long as the camera itself delivers everything already well formed. It’s just a question of sniffing the USB port, probably.

  49. “You are all going to be very disappointed. Everything clever the ‘kinect’ does, it does with proprietary software on the xbox360. There is no depth camera, just an ordinary monochrome webcam reading a (fixed) infra-red pattern projected on the scene.”

    Well, a lot of people in the know beg to differ. Current ToF cameras used for robotics cost > $5000 and don’t even have half the resolution of the Kinect’s sensor. Even if the sensor data is noisy, this device will be extremely competitive with sensors that normally cost 40x as much.
    You’ll probably want to take a look at this:
    http://www.hizook.com/blog/2010/03/28/low-cost-depth-cameras-aka-ranging-cameras-or-rgb-d-cameras-emerge-2010

  50. The ZCam [1] initially was going to become a low-cost USB device. It is a low-quality time-of-flight camera. I was a bit disappointed when they made an exclusive deal with Microsoft.

    [1] http://en.wikipedia.org/wiki/ZCam
