Real-time augmented reality has often been cited as a weakness of Google Glass, with many commentators claiming that Glass is ill-suited for AR. However, +Brandyn White and +Andrew Miller of OpenGlass have achieved just that. Using their technology, images from Google Glass can be overlaid with additional information, such as the ratings of the restaurant right in front of you. OpenGlass had previously demonstrated this as a proof of concept using still images; it now works in real time.
Interactive augmented reality is achieved by streaming images from the Glass camera to a server. Each image is modified there based on user input and sent back to Glass for display through the Mirror API. By relying solely on the Mirror API, OpenGlass aims to remain within Google's Terms of Service and avoids rooting the device.
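To make the round trip concrete, here is a minimal sketch of that flow in Python. All names (`annotate_frame`, `build_timeline_item`) are illustrative, not the actual OpenGlass code, and the network and Mirror API calls are stubbed so only the data flow is shown; a real client would insert the card with an authenticated `timeline.insert` request.

```python
def annotate_frame(frame_bytes, annotations):
    """Server-side step: overlay metadata (e.g. restaurant ratings)
    onto the camera frame. Stubbed here as simple tagging rather than
    real image compositing."""
    return {"image": frame_bytes, "overlay": annotations}

def build_timeline_item(processed):
    """Build a Mirror API timeline item body. A real client would
    attach the processed image as a multipart media upload; here we
    only carry the overlay text."""
    return {
        "text": ", ".join(processed["overlay"]),
        "menuItems": [{"action": "DELETE"}],
    }

# Placeholder for JPEG bytes captured from the Glass camera.
frame = b"\xff\xd8..."
processed = annotate_frame(frame, ["Cafe Roma: 4.5 stars"])
card = build_timeline_item(processed)
print(card["text"])
```

The key design point is that Glass itself runs no custom code: the heavy lifting happens on the server, and the result arrives as an ordinary timeline card.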
This can be used, for example, to play tic-tac-toe with a remote friend who has a similar tic-tac-toe board in front of them but is connected to you online, as demonstrated in the video. In the video there is a delay of about four seconds, caused by server-side image processing as well as internal Google Glass latency. OpenGlass has brought this down to three seconds using a local server and plans to reduce it further. Ultimately, a smartphone could act as the server, enabling real-time augmented reality on Glass.
The OpenGlass open-source library is available on GitHub for Glass Explorers to play with today. It was created by PhD students Brandyn White and Andrew Miller, who have extensive experience in computer vision and have previously demonstrated an app that uses Google Glass to identify objects for visually impaired users.
This is an impressive achievement for a first-generation augmented reality app on Google Glass, and we believe the future for Glass AR is very rosy indeed.