Google Glass's word on the street now easier to read
Jason Orlovsky and colleagues at Osaka University in Japan have developed a text display algorithm that places the current message – a tweet, your location or your walking speed, say – on the darkest region in view at any given moment and in a readable colour.
This is done using the headset's camera, which plots a constantly changing heat map of viable on-screen reading locations. The algorithm can also split a message across two small dark regions on either side of your field of view. "Twitter feeds or text messages could be placed throughout the environment in a logical manner, much like signs are placed on either side of a street," the developers say.
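The article doesn't include the authors' code, but the core idea — scan the camera frame for the darkest region and pick a text colour that contrasts with it — can be sketched roughly like this. The block-scan approach, function names, and thresholds below are my own illustration, not the Osaka team's algorithm:

```python
import numpy as np

def darkest_region(frame, block=32):
    """Find the darkest square block in a grayscale frame.

    frame: 2-D array of luminance values in [0, 255].
    block: side length of the candidate text-placement region.
    Returns ((row, col) of the block's top-left corner, its mean luminance).
    """
    h, w = frame.shape
    best_mean, best_pos = None, (0, 0)
    # Slide a block-sized window over the frame and keep the dimmest one.
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            mean = frame[r:r + block, c:c + block].mean()
            if best_mean is None or mean < best_mean:
                best_mean, best_pos = mean, (r, c)
    return best_pos, best_mean

def text_colour(region_mean):
    # Light text on a dark background, dark text on a light one.
    return (255, 255, 255) if region_mean < 128 else (0, 0, 0)

# Example: a mostly bright synthetic frame with a dark patch
# in the lower-right corner, where the text would be drawn.
frame = np.full((128, 128), 200.0)
frame[96:128, 96:128] = 20.0
pos, mean = darkest_region(frame)
print(pos, text_colour(mean))
```

A real headset would rerun this on every camera frame (the "constantly changing heat map" the article describes) and smooth the result over time so the text doesn't jump around as the wearer moves.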
The team presented their work last month at the Intelligent User Interfaces conference in Santa Monica, California. With the launch of an early version of Glass due in the next month, such software is likely to be in demand.
Re: Google Glass's word on the street now easier to read
http://www.youtube.com/watch?v=V6Tsrg_EQMw What is the Google Glass screen made of? Crystal?