
September 27, 2016 at 2:00 am

IBM and MIT partner up to create AI that understands sight and sound the way we do #makereducation


Exciting new project from IBM & MIT!

Via TechCrunch:

When you see or hear something happen, you can instantly describe it: “a girl in a blue shirt caught a ball thrown by a baseball player,” or “a dog runs along the beach.” It’s a simple task for us, but an immensely hard one for computers — fortunately, IBM and MIT are partnering up to see what they can do about making it a little easier.

The new IBM-MIT Laboratory for Brain-inspired Multimedia Machine Comprehension — we’ll just call it BM3C — is a multi-year collaboration between the two organizations that will be looking specifically at the problem of computer vision and audition.

Read more.


Each Tuesday is EducationTuesday here at Adafruit! Be sure to check out our posts about educators and all things STEM. Adafruit supports our educators and loves to spread the good word about educational STEM innovations!


Check out all the Circuit Playground episodes, our new kids' show, and subscribe!

Have an amazing project to share? Join the SHOW-AND-TELL every Wednesday night at 7:30pm ET on Google+ Hangouts.

Join us every Wednesday night at 8pm ET for Ask an Engineer!

Learn resistor values with Mho's Resistance, or get "Circuit Playground," the best electronics calculator for engineers, from Adafruit's Apps!



