Google's Project Glass: You ain't seen nothin' yet

The augmented-reality Project Glass provides a peek into Google's ambitions with artificial intelligence and how AI software could further shake up mobile computing.
by Jason Trott
Google's Project Glass demo is certainly the coolest hardware demo so far this year. Behind the scenes is something equally intriguing: artificial-intelligence software.
The augmented-reality glasses, which Google co-founder Sergey Brin was spotted wearing yesterday, created a huge buzz Wednesday when Google released a video showing, from the wearer's perspective, how they could be used.
In the video, the small screen on the glasses flashes information right on cue, allowing the wearer to set up meetings with friends, get directions in the city, find a book in a store, and even videoconference with a friend. The device itself is a wrap-around frame with no lenses and a small screen mounted above the right eye.
For the most part, the augmented-reality glasses do what a person could do with a smartphone, such as look up information and socialize. But the demo also shows glimpses of an artificial-intelligence (AI) system working behind the scenes. It's that AI system that could make mobile devices, including wearable computers, far more powerful and let them take on more complex tasks, according to one expert.
"The new thing that Google was showing was the interaction model using new hardware, rather than truly showing the potential of such a device," said Lars Hard, the chief technology officer of AI software company Expertmaker. "AI can actually enhance and improve different decision situations."