Over the last couple of years, a variety of solutions for realizing augmented user interfaces on mobile devices have evolved. Still, one problem remains: no matter how flawless the design, augmentations are still only a mere overlay on the camera image.
Tango (formerly known as "Project Tango") by Google's ATAP group introduces a hardware component for a next generation of Android smartphones that enables new use cases through additional sensors, multiple cameras, and a powerful software library. Tango's key features of motion tracking, area learning, and depth perception are a perfect fit for augmented reality and enable a new level of immersion.
As the physical environment of the device becomes quantifiable with centimeter-level precision, augmentations can be presented at a real location in reality's three-dimensional space and truly embed into their surroundings. The manifold new possibilities are illustrated using the use case of indoor navigation. Indoor positioning with a precision improved by multiple orders of magnitude, together with Tango's area learning features, offers an innovative alternative to existing indoor navigation solutions and makes their infrastructural preconditions (iBeacon grids, Wi-Fi) obsolete.
These features are not restricted to mere navigation: with Firebase as a backend storing the building information model, real-time usage statistics can be generated and points of interest can be augmented in the application.
The session illustrates these improvements using the example of a classic two-dimensional indoor map navigation and an augmented reality user interface in which, thanks to the Tango hardware, routes embed seamlessly into reality.