Social Augmented Reality (Augmented Reality 2.0)
Augmented Reality (AR) was first demonstrated in the 1960s, but only recently have technologies emerged that make it easy to deploy AR applications to many users. Camera-equipped cell phones with significant processing power and graphics abilities provide an inexpensive and versatile platform for AR applications, while the social networking technology of Web 2.0 provides a large-scale infrastructure for collaboratively producing and distributing geo-referenced AR content. This combination of widely used mobile hardware and Web 2.0 software allows the development of a new type of AR platform that can be used on a global scale. In this project we define the term Social AR (also referred to as Augmented Reality 2.0) as a concept that allows Augmented Reality to be deployed on a massive scale, both in the number of users and in the working volume. Furthermore, we believe that in a social augmented reality environment users can actively participate in the content creation process, creating or updating AR content at specific locations. The following subprojects are first steps towards realizing an environment based on the concept of Social AR/AR 2.0.
Summary: This book chapter introduces and outlines the concept of Augmented Reality 2.0. We show how recent trends from Web 2.0 will carry over into the field of Augmented Reality, forming new kinds of applications such as Augmented Reality for social networking, and analyze how application development and content authoring need to change for such applications.
Publication: Augmented Reality 2.0, Dieter Schmalstieg, Tobias Langlotz, Mark Billinghurst, Virtual Realities, Dagstuhl seminar proceedings (eds. Sabine Coquillart, Guido Brunnett, Greg Welch), 2010 [PAPER]
Summary: We investigate the potential of in-situ creation of AR annotations of the environment directly on the mobile phone. Previous authoring or annotation tools were mostly bound to desktop computers, or could only operate at the accuracy of the employed mobile sensors. Our tracking approach allows creating annotations in place and storing them in a self-descriptive way on a server to allow later re-identification. We use GPS information for efficient indexing, but identify the label positions using template matching against the panoramic map (see our work on panorama mapping and tracking). This approach yields accurate and robust registration of annotations with the environment, even when seen from a slightly different position than where the annotation was created. The system can be used in large-scale indoor and outdoor scenarios and offers an accurate mapping of the annotations to physical objects.
Publication: Online Creation of Panoramic Augmented Reality Annotations on Mobile Phones, Tobias Langlotz, Daniel Wagner, Alessandro Mulloni, Dieter Schmalstieg, Accepted for IEEE Pervasive Computing [PAPER]
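The template-matching step described above can be illustrated with a small sketch. Assuming the panoramic map and the stored annotation patch are grayscale arrays, a normalized cross-correlation search (a simplified stand-in for the actual matching pipeline, which the paper describes) re-locates the anchor position in the panorama:

```python
import numpy as np

def ncc_match(panorama: np.ndarray, template: np.ndarray):
    """Slide the template over the panorama and return the (row, col)
    position with the best normalized cross-correlation score.
    Brute-force for clarity; a real implementation would use an
    optimized matcher."""
    th, tw = template.shape
    ph, pw = panorama.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ph - th + 1):
        for c in range(pw - tw + 1):
            patch = panorama[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:
                continue
            score = float((p * t).sum() / denom)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy "panoramic map"; cut out a patch to act as the stored annotation anchor.
rng = np.random.default_rng(42)
pano = rng.integers(0, 256, size=(64, 256)).astype(np.float64)
anchor_row, anchor_col = 20, 100
tmpl = pano[anchor_row:anchor_row + 16, anchor_col:anchor_col + 16].copy()

pos, score = ncc_match(pano, tmpl)
print(pos, round(score, 3))  # the stored patch is re-found at (20, 100), score ~1.0
```

Because NCC normalizes out brightness and contrast per patch, the match survives moderate illumination changes, which is what makes the stored anchor re-identifiable later on.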
ROBUST DETECTION AND TRACKING OF ANNOTATIONS (2011)
Summary: We present an enhanced approach for registering and tracking annotation anchor points, which is suitable for current-generation mobile phones and can also successfully deal with the wide variety of viewing conditions encountered in real-life outdoor use. The approach is based on on-the-fly generation of panoramic images by sweeping the camera over the scene. The panoramas are then used for stable orientation tracking while the user is performing purely rotational movements. This basic approach is improved by several new techniques for the re-detection and tracking of anchor points. For re-detection, specifically after temporal variations, we first compute a panoramic image with extended dynamic range, which can better represent varying illumination conditions. The panorama is then searched for known anchor points while orientation tracking continues uninterrupted. We then use information from an internal orientation sensor to prime an active search scheme for the anchor points, which improves matching results. Finally, global consistency is enhanced by statistical estimation of a global rotation that minimizes the overall position error of anchor points when transforming them from the source panorama in which they were created to the current view represented by a new panorama. Once the anchor points are re-detected, we track the user's movement using a novel 3-degree-of-freedom orientation tracking approach that combines vision tracking with the absolute orientation from inertial and magnetic sensors. We tested our system using an AR campus guide as an example application and provide detailed results for an off-the-shelf smartphone. Results show that the re-detection rate is improved by a factor of 2 compared to previous work and reaches almost 90% for a wide variety of test cases, while still running at interactive frame rates.
Publication: Robust detection and tracking of annotations for outdoor augmented reality browsing, Tobias Langlotz, Claus Degendorfer, Alessandro Mulloni, Gerhard Schall, Gerhard Reitmayr, Dieter Schmalstieg, Accepted for Computers & Graphics special issue on Mobile Augmented Reality, 2011 [PAPER]
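The fusion of vision tracking with absolute orientation from inertial and magnetic sensors can be sketched, in heavily simplified form, as a complementary filter on a single heading angle. The paper fuses full 3-degree-of-freedom orientation; this one-angle version is only an illustration of the principle:

```python
import math

def fuse_heading(vision_heading, sensor_heading, alpha=0.98):
    """Blend a smooth but drifting vision heading with a noisy but
    drift-free compass heading (both in radians). Averaging on the unit
    circle handles the wrap-around at +/- pi correctly."""
    x = alpha * math.cos(vision_heading) + (1 - alpha) * math.cos(sensor_heading)
    y = alpha * math.sin(vision_heading) + (1 - alpha) * math.sin(sensor_heading)
    return math.atan2(y, x)

# Vision tracking has drifted by 0.1 rad; the compass reads the true heading 0.0.
fused = fuse_heading(0.1, 0.0)

# Applied every frame, the small per-frame correction pulls a drifted
# estimate back toward the absolute sensor reading.
heading = 0.5  # drifted vision estimate; true heading is 0.0
for _ in range(200):
    heading = fuse_heading(heading, 0.0)
print(round(fused, 3), round(heading, 4))
```

The high vision weight (alpha) keeps the frame-to-frame output as smooth as the vision tracker, while the small sensor weight removes the slow drift that pure vision-based orientation tracking accumulates.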
IN-SITU AUTHORING FOR MOBILE AUGMENTED REALITY (2010)
Summary: We present a novel system allowing in-situ content creation for mobile Augmented Reality in unprepared environments. The system targets smartphones and therefore allows spontaneous authoring while in place. We describe two different scenarios, which depend on the size of the working environment and consequently use different tracking techniques: a natural-feature-based approach for planar targets is used for small working spaces, whereas for larger working environments, such as outdoor scenarios, panorama-based orientation tracking is deployed. Both are integrated into one system, allowing the user to create content with the same interaction techniques, applying a set of simple yet powerful modeling functions. The resulting Augmented Reality content can be shared with other users via a dedicated content server or kept in a private inventory for later use.
Publication: Sketching up the world: In-situ authoring for mobile Augmented Reality, Tobias Langlotz, Stefan Mooslechner, Stefanie Zollmann, Claus Degendorfer, Dieter Schmalstieg, Accepted for Springer Personal and Ubiquitous Computing 2011
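A content server like the one mentioned above could index annotations by GPS position so that clients only download content near their current location. The class and grid-cell scheme below are hypothetical assumptions for illustration; the publications do not specify the server's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    label: str
    lat: float
    lon: float
    payload: dict = field(default_factory=dict)  # e.g. geometry, anchor patch

class ContentServer:
    """Toy in-memory store that buckets annotations on a coarse GPS grid,
    so a query only touches the cells around the client's location."""
    CELL = 0.01  # grid cell size in degrees (roughly 1 km)

    def __init__(self):
        self._index = {}

    def _cell(self, lat, lon):
        return (round(lat / self.CELL), round(lon / self.CELL))

    def publish(self, ann):
        self._index.setdefault(self._cell(ann.lat, ann.lon), []).append(ann)

    def nearby(self, lat, lon):
        cr, cc = self._cell(lat, lon)
        found = []
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                found.extend(self._index.get((cr + dr, cc + dc), []))
        return found

server = ContentServer()
server.publish(Annotation("cafe sign", 47.0707, 15.4395))  # near Graz
server.publish(Annotation("far away", 48.2082, 16.3738))   # Vienna
result = [a.label for a in server.nearby(47.0705, 15.4400)]
print(result)  # → ['cafe sign']
```

Coarse GPS indexing of this kind matches the workflow described in the annotation subprojects above: GPS narrows the candidate set cheaply, and precise placement is then resolved on the device by template matching against the panoramic map.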
Copyright (c) 2011 Graz University of Technology