My first encounter with Google Glass came on a Saturday in June, when I showed up at the Glass Explorers’ “Basecamp,” a sunny spread atop the Chelsea Market. My tech sherpa, a bright-eyed young woman, set me up with a mimosa as we perused the various shades of Glass frames, each named for a color that occurs in nature: cotton, shale, charcoal, sky, and tangerine. I went for shale, which happens to be the preference of Glass Explorers in San Francisco. (New Yorkers, naturally, go for bleak charcoal.) I was told that I was one of the first few hundred Explorers in the city, which made me feel like some third-rate Shackleton embarked on my own Nimrod Expedition into the neon ice. The lightweight titanium frames were fitted over my nose, a button was pressed near my right ear, and the small screen, or optical head-mounted display, flickered to pinkish life. I was told how to talk to my new friend, each command initiated with the somewhat resigned “O.K., Glass.” In deference to Eunice and Lenny, I started off with two simple instructions, picked up by a microphone that sits just above my right eye, at the tip of my eyebrow.
“O.K., Glass. Google translate ‘hamburger’ into Korean.”
“Haembeogeo,” a gentle, vowel-rich voice announced after a few seconds of searching, as both English and Hangul script appeared on the display above my right eye. Since there are no earbuds to plug into Glass, audio is conveyed through a “bone conduction transducer.” In effect, this means that a tiny speaker vibrates against the bone behind my right ear, replicating sound. The result is eerie, as if someone is whispering directly into a hole bored into your cranium, but also deeply futuristic. You can imagine a time when different parts of our bodies are adapted for different needs. If a bone can hear sound, why can’t my fingertips smell the bacon strips they’re about to grab?
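If you're curious what that spoken query roughly boils down to once the voice recognition is done, here is a minimal sketch of a comparable lookup in plain Python using Google's Cloud Translation client library. This is an illustration, not what Glass actually runs behind the scenes: the `google-cloud-translate` package, the credentials it requires, and the choice of backend are all assumptions on our part.

```python
# Rough approximation of "Google translate 'hamburger' into Korean"
# using the google-cloud-translate package (pip install google-cloud-translate).
# Requires your own Google Cloud credentials; not the actual Glass backend.
from google.cloud import translate_v2 as translate

client = translate.Client()
result = client.translate("hamburger", target_language="ko")

print(result["input"])           # hamburger
print(result["translatedText"])  # 햄버거  (romanized: "haembeogeo")
```

The spoken answer and the Hangul on the display correspond to the romanization and the translated text, respectively; Glass just layers speech recognition on the front and the bone-conduction audio on the back.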