
Facebook Lab Reveals Direction Of AR Smartglasses


Facebook Reality Labs (FRL) has taken the wraps off pioneering work on the AI and haptics needed for all-day, everyday wearable AR glasses. To be practical, a number of technologies would need to converge in the device: 5G, AI, an advanced optics system, and a new kind of operating system and user interface for this wearable computer. FRL has neither a deadline nor a product launch date. They are on the cutting edge of exploring what's really possible. Like university researchers, they are publishing their findings in peer-reviewed scientific journals rather than keeping them secret. For now, a consumer product is an ideal, not even a prototype.

FRL presented their concept of the “intelligent click,” a series of gestures, some large, some nearly unconscious nerve impulses, detected by a wristband. This would communicate intent to the operating AI, which would know and anticipate what the user needs before the user knows they need it. FRL says its goal is a “human centered interface,” which will use preferences and surroundings to infer intent, creating an “ultra low friction” computing experience. “What we’re trying to do with neural interfaces is to let you control the machine directly, using the output of the peripheral nervous system — specifically the nerves outside the brain that animate your hand and finger muscles,” says FRL Director of Neuromotor Interfaces Thomas Reardon, who joined the FRL team when Facebook acquired CTRL-labs in 2019.

Reardon explained that FRL focuses on the wrist because of its proximity to the hand, with its myriad gestures and nerves. Commands travel down the arm from the brain, creating an electromyography (EMG) signal that a wrist sensor can read. A wristband can also house some of the computer’s components, such as batteries, antennas, and sensors. The signals at the wrist are so clear that EMG can detect finger motion with millimeter accuracy. Someday it may even be possible to sense just the intention to move a finger.
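To make the idea concrete, here is a minimal, hypothetical sketch of the kind of processing such a wristband might perform: rectify a raw EMG trace, smooth it into an envelope with a moving average, and flag a movement "intent" when the envelope crosses a resting-baseline threshold. The function names, window size, and threshold are illustrative assumptions, not FRL's actual pipeline.

```python
# Hypothetical EMG intent detection (illustrative only, not FRL's method):
# rectify the signal, smooth it with a moving average, and compare the
# resulting envelope against a resting-baseline threshold.

def emg_envelope(samples, window=8):
    """Rectify the signal and smooth it with a trailing moving average."""
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        env.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return env

def detect_intent(samples, threshold=0.5, window=8):
    """Return True if the smoothed EMG envelope ever crosses the threshold."""
    return any(v > threshold for v in emg_envelope(samples, window))

# Synthetic traces: low-amplitude noise (rest) vs. a high-amplitude burst
# in the middle of the trace (a deliberate finger movement).
rest = [0.05 * (-1) ** i for i in range(100)]
burst = rest[:40] + [1.2 * (-1) ** i for i in range(20)] + rest[60:]

print(detect_intent(rest))   # → False: noise stays under the threshold
print(detect_intent(burst))  # → True: the burst pushes the envelope over it
```

A real system would of course classify which finger moved and with what force, likely with a learned model over multiple electrode channels, rather than a single threshold.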

The researchers at FRL emphasized that they are guided by a vision they cannot achieve by themselves, and only through a collective, collaborative effort can the goal of invisible computing be realized. “Understanding and solving the full extent of ethical issues requires society-level engagement,” says FRL Research Science Director Sean Keller. “We simply won’t get there by ourselves, so we aren’t attempting to do so. As we invent new technologies, we are committed to sharing our learnings with the community and engaging in open discussion to address concerns.”

“That’s why we support and encourage our researchers to publish their work in peer-reviewed journals — and why we’re telling this story today. We believe that far before any of this technology ever becomes part of a consumer product, there are many discussions to have openly and transparently about what the future of HCI can and should look like.” “We think deeply about how our technologies can positively and negatively impact society, so we drive our research and development in a highly principled fashion,” says Keller, “with transparency and intellectual honesty at the very core of what we do and what we build.”
