As it stands, Google Glass doesn't have a simple way of cataloging real-world items -- you have to snap a picture and make a note afterward. That may get much easier if Google implements a newly granted US patent, however. The technique uses a wearable display's camera to detect hand gestures made in front of objects. Make a heart shape and you'll "like" what's in front of you; frame something with your fingers and you'll select it. There's no certainty that Glass will ever support these commands, but they're certainly intuitive. If nothing else, they could lead to a new, very literal take on Google Goggles.
http://www.engadget.com/2013/10/15/g...heart-objects/