Well, more like an “un-wearable” I guess! Via Mashable and MIT Technology Review.
Those of us who need glasses to see a TV or laptop screen clearly could ditch the eyewear thanks to a display technology that corrects vision problems.
The technology combines algorithms that alter an image based on a person’s glasses prescription with a light filter set in front of the display. The algorithm adjusts the light from each individual pixel so that, when fed through a tiny hole in the plastic filter, rays of light reach the retina in a way that re-creates a sharp image. Researchers say the idea is to anticipate how your eyes will naturally distort whatever’s onscreen (something glasses or contacts normally correct) and adjust it beforehand so that what you see appears clear.
Brian A. Barsky, a University of California, Berkeley, computer science professor and affiliate professor of optometry and vision science who coauthored a paper on it, says it’s like undoing what the optics in your eyes are about to do. The technology is being developed in collaboration with researchers at MIT and Microsoft.
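To get a feel for what “undoing” means here, the toy sketch below pre-corrects an image with a plain Wiener-style inverse filter, treating the eye as a simple disk-shaped blur. This is only a rough illustration of the principle, not the algorithm in the paper (which works jointly with the light-field hardware), and every function name, kernel size, and parameter in it is made up for the demo.

```python
import numpy as np

def defocus_psf(size, radius):
    """Crude disk-shaped blur kernel standing in for an out-of-focus eye.
    (A real prescription would map to a more detailed point-spread function.)"""
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
    psf = (x**2 + y**2 <= radius**2).astype(float)
    return psf / psf.sum()

def precorrect(image, psf, noise_power=0.01):
    """Wiener-style inverse filter: pre-distort the image so that blurring it
    with `psf` (the eye's optics) lands close to the original image."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    # Divide out the blur in the frequency domain, regularized so frequencies
    # the blur nearly destroys don't blow up.
    W = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * W))

def blur(img, psf):
    """Simulate what the eye does: convolve the image with its blur kernel."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=img.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))

# Toy check: a grayscale test image, a simulated eye blur, and the pre-corrected version.
rng = np.random.default_rng(0)
image = rng.random((256, 256))
psf = defocus_psf(size=256, radius=6)
corrected = precorrect(image, psf)

plain_error = np.abs(blur(image, psf) - image).mean()
corrected_error = np.abs(blur(corrected, psf) - image).mean()
print(f"blurred as-is:          mean error {plain_error:.4f}")
print(f"pre-corrected + blurred: mean error {corrected_error:.4f}")
```

Blurring the pre-corrected image with the same kernel should land noticeably closer to the original than blurring the untouched image does, which is the effect the display is after.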
In addition to making it easier for people with simple vision problems to use all kinds of displays without glasses, the technique may help those with more serious vision problems caused by physical defects that can’t be corrected with glasses or contacts, researchers say. This includes spherical aberration, which causes different parts of the lens to refract light differently.
While similar methods have been tried before, the new approach produces a sharper, higher-contrast image. The paper, written by Barsky and his coauthors, will be presented at the annual International Conference and Exhibition on Computer Graphics and Interactive Techniques, also known as Siggraph, in Vancouver, Canada, in August.
For the paper, researchers took images of things like a rainbow-colored hot-air balloon and a detail of a Vincent van Gogh self-portrait and applied algorithms that warped each image to compensate for a specific eye condition. They then showed the images on an iPod Touch whose display they had covered with an acrylic slab topped by a plastic screen pierced with thousands of tiny, evenly spaced holes.
Gordon Wetzstein, who coauthored the paper while a research scientist at MIT’s Media Lab, says the screen allows a regular two-dimensional display to work as what’s known as a “light field display.” This means the screen controls the way individual light rays emanate from the display, leading to a sharper image without degrading contrast.
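The pinhole mask is what turns the iPod’s flat screen into a light field display: each pixel sitting behind a hole sends its light out in a direction fixed by the pixel-to-hole offset and the thickness of the acrylic gap. The snippet below sketches that geometry in one dimension; the pixel pitch, gap, and hole spacing are assumed round numbers for illustration, not the prototype’s actual dimensions.

```python
import numpy as np

# Rough 1-D geometry of a pinhole-mask light field display.
# All dimensions below are assumptions, not the prototype's real specs.
pixel_pitch_mm = 0.078    # roughly a 326-ppi display
gap_mm = 6.0              # thickness of the acrylic spacer (assumed)
hole_spacing_mm = 0.39    # one hole per 5 pixels (assumed)

pixel_x = np.arange(0, 50) * pixel_pitch_mm                      # pixel positions
hole_x = np.round(pixel_x / hole_spacing_mm) * hole_spacing_mm   # nearest hole

# Direction each pixel's light takes after passing through its hole,
# measured from the display normal.
angles_deg = np.degrees(np.arctan2(hole_x - pixel_x, gap_mm))

for px, ang in list(zip(pixel_x, angles_deg))[:6]:
    print(f"pixel at {px:5.3f} mm -> ray angle {ang:+.2f} degrees")
```

Different pixels under the same hole therefore emit light in different directions, which is what lets the software steer pre-corrected rays toward the right spots on the retina instead of just painting a flat image.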
The researchers tested out their device by using a Canon DSLR camera with the focus set to simulate vision problems like farsightedness.
Every Wednesday is Wearable Wednesday here at Adafruit! We’re bringing you the blinkiest, most fashionable, innovative, and useful wearables from around the web and in our own original projects featuring our wearable Arduino-compatible platform, FLORA. Be sure to post up your wearables projects in the forums or send us a link and you might be featured here on Wearable Wednesday!