Details of Google's Project Glass revealed in FCC report
New details of Google's forthcoming augmented reality headset have emerged in documents published by a US regulator.
The documents describe video playing on the device alongside audio routed to a "vibrating element".
The description tallies with a patent filing suggesting it plays sound via "bone-conduction" tech rather than earbuds.
Developers are due to receive a test edition of the headset later this year.
Google has already begun holding hands-on events for selected software writers in San Francisco and New York ahead of the release.
It has previously said it intended to sell the eyewear to consumers before the end of 2014.
Wearable tech
Google is not the only firm betting on the appeal of head-up display units.
Motorola Solutions announced its HC1 headset computer system in October. The voice and gesture-controlled rig is targeted at maintenance engineers, the emergency services and other organisations.
Oakley recently launched Airwave - ski goggles with built-in sensors that display information on an integrated screen about the owner's speed, the size of their jumps and the music they are listening to.
Several companies also showed off prototype "smart glasses" at the recent Consumer Electronics Show in Las Vegas.
They included Vuzix's Android-powered M100 smartphone display system featuring a small screen and video camera which is due for release before the end of 2013.
Microsoft has also filed a patent for a set of digital glasses that overlay information on top of the user's view, which it suggested could be used at sports matches and theatres.
Analysts at Juniper Research have predicted the international market for smart eyewear and other mobile wearable devices could be worth more than $1.5bn (£950m) by 2014, up from $800m this year.
But in truth it is still unclear how big the appetite for head-worn computers will be, and Google's Project Glass - which benefits from the company's strong tech brand - is expected to play a critical role in determining whether the sector succeeds or fails.
Vibrating audio
The Federal Communications Commission papers offer a few new hints about how the search giant's gadget will work.
They describe data being sent to the small screen display via wi-fi and Bluetooth using a radio unit manufactured by Broadcom.
The equipment is also said to be able to store video files internally and can be recharged by plugging a power connector into the computing unit on the right-hand arm of the glasses' frame.
However, the most arresting detail is the suggestion that audio is provided without the user needing to wear headphones which might disturb how they hear ambient sounds.
Last week a Google patent filing entitled Wearable Computing Device with Indirect Bone-Conduction Speaker was published.
It described how an element on the frame could be made to vibrate in order to send sound to a user's inner ear via their skull.
The roots of the innovation date back at least to the 18th Century, when the composer Ludwig van Beethoven - who suffered from hearing loss - listened to his compositions by placing a rod between his piano and his head to transmit the vibrations.
The technology was later developed to help the military monitor communications at the same time as being aware of surrounding noise. It has also been used by some hearing aids, and headsets designed for swimmers and cyclists.
Reviews suggest the audio quality offered can be decent but not as good as traditional headphones.
A spokeswoman for Google said she was unable to provide further comment at this time, but more details may emerge at the firm's I/O conference in May.
Last year Google's co-founder Sergey Brin made headlines by showing off prototypes in a stunt that saw skydivers provide live video pictures from the devices as they plummeted towards the venue of its developers' conference.
Concept videos have also shown the device bringing up maps, weather reports and video chats - but it is not clear whether this will be possible with the first generation.