
Analysis and Discussion of the Literature Review Results (2019-03-07)

Tagged as: blog, results, ar, taxonomies
Group: G

Last week we finished the analysis of our literature review results and handed in our paper. This means that we completed all five phases of a literature review according to Brocke et al. (2009). The last phase comprises the interpretation of the results, the identification of lessons learned, and possible future activities (Brocke et al., 2009).

This blog article will show you the most important results, limitations and conclusions of our research. We will only give a short overview of the main aspects in these categories; further information can be found in our paper.

Topics of Publications

In addition to the general results that we already presented in the last blog article, we analyzed the main topics of the publications. For this purpose, approaches from Dey et al. (2016), Billinghurst et al. (2015), Carmigniani et al. (2011), Papagiannakis et al. (2008) and Van Krevelen and Poelman (2010) were used and slightly adapted (cf. Table 1). We added the topics Nature for nature-related publications, Application Development for presentations of supporting tools for the development of AR applications, and System and Technology for publications that present technological algorithms and models. Most publications were assigned to the topic System and Technology (49, 36.30%), followed by Education and Application Development (14 each, 10.37%) and Health & Medicine (13, 9.63%).

Table 1: Publication Topics adapted from Dey et al. (2016), Billinghurst et al. (2015), Carmigniani et al. (2011), Papagiannakis et al. (2008) and Van Krevelen and Poelman (2010)
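
As a small plausibility check, the shares above can be recomputed from the raw counts. The following minimal Python sketch does this; note that the total of 135 analyzed publications is inferred from the reported percentages (49/135 ≈ 36.30%) and is not stated explicitly in this article.

    # Recompute the reported topic shares from the counts given above.
    # The total of 135 publications is inferred from the percentages
    # (49 / 135 ~ 36.30%); it is not stated explicitly in this article.
    TOTAL_PUBLICATIONS = 135  # inferred, see note above
    topic_counts = {
        "System and Technology": 49,
        "Education": 14,
        "Application Development": 14,
        "Health & Medicine": 13,
    }
    for topic, count in topic_counts.items():
        print(f"{topic}: {count} ({100 * count / TOTAL_PUBLICATIONS:.2f}%)")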

Results of Technological Analysis

Classes of AR Displays

Based on our results, we added the subcategory Multiple to our technological taxonomy to cover combinations of multiple display types in one application. Table 2 shows the distribution of classes of AR displays. With 127 occurrences (70.17% of all display usages), visual displays were the preferred type and remain the most popular display type among AR publications. Olfactory and gustatory displays were not used in any of the analyzed publications. 13.26% of all display usages were combinations of multiple displays, or multimodal combinations if different senses were involved (Schmalstieg and Höllerer, 2016).

Table 2: Distribution of AR Displays based on our Taxonomy

The following combinations of displays occurred:

  • monitor-based & optical see-through: 4 (2.21%)
  • monitor-based & monitor-based: 3 (1.66%)
  • monitor-based & video see-through: 2 (1.10%)
  • video see-through & optical see-through: 1 (0.55%)
  • spatial & optical see-through: 1 (0.55%)
  • spatial & monitor-based: 1 (0.55%)

The following multimodal displays occurred:

  • video see-through & aural: 3 (1.66%)
  • monitor-based & aural & haptic: 2 (1.10%)
  • optical see-through & aural: 2 (1.10%)
  • spatial & haptic: 1 (0.55%)
  • optical see-through & haptic: 1 (0.55%)
  • closed-view & aural: 1 (0.55%)
  • monitor-based & aural: 1 (0.55%)
  • optical see-through & haptic & monitor-based: 1 (0.55%)
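
These counts can be tallied to reproduce the 13.26% share of combined and multimodal display usages reported above. A minimal Python sketch follows; the total of 181 display usages is again inferred from the reported percentages (127/181 ≈ 70.17%, 1/181 ≈ 0.55%) rather than stated explicitly here.

    # Tally the display combinations and multimodal displays listed above.
    # The total of 181 display usages is inferred from the percentages
    # (127 / 181 ~ 70.17%); it is not stated explicitly in this article.
    TOTAL_DISPLAY_USAGES = 181  # inferred, see note above
    combinations = [4, 3, 2, 1, 1, 1]      # counts from the first list
    multimodal = [3, 2, 2, 1, 1, 1, 1, 1]  # counts from the second list
    combined_total = sum(combinations) + sum(multimodal)  # 24
    print(f"{combined_total} of {TOTAL_DISPLAY_USAGES} display usages "
          f"({100 * combined_total / TOTAL_DISPLAY_USAGES:.2f}%)")  # 13.26%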

AR Display Positioning

Head-mounted (44, 33.08%) and handheld (41, 31.3%) displays were the most preferred display positions. Figure 2 shows an increase in the use of head-mounted displays since 2015 and a decrease in handheld displays from 2016 to 2017. The least preferred positions were body-attached displays with 3 occurrences (2.26%) and near-eye displays with no occurrences.

Figure 2: AR Display Positioning based on our Taxonomy

AR Tracking Methods and Environment

Based on our results, the technological taxonomy was expanded by the sensor types EMG (Electromyography) and EEG (Electroencephalogram). Table 3 shows the distribution of tracking methods. Optical tracking is the most preferred method of all tracking systems, with 44 occurrences (23.40%) for 3D structured light tracking and 30 occurrences (15.96%) for outside-looking-in tracking in our analysis. GPS is the preferred mobile tracking system (5.32%). Stationary tracking systems, including mechanical, electromagnetic and ultrasonic ones, were never used in the analyzed publications. Regarding the environment of usage, most applications are used indoors (103, 76.29%), in contrast to outdoor environments (10, 7.41%) or both environments (17, 12.59%).

Table 3: Distribution of Tracking Methods based on our Taxonomy

Interfaces

Based on our results, the technological taxonomy was adapted by adding Desktop Computer UI, subdivided into Mouse and Keyboard, to the class 2D UI. The class 3D UI was expanded by the subcategories Presenter, Controller, Handle, Stylus and Handheld Clicker. Hand Motions and Face Recognition were added to Natural UI. Furthermore, the category Combinations was divided into the subcategories Multi-View UI and Multiple Interfaces, the latter consisting of the subcategories Multiple Classes and Same Class. Finally, the class Adaptive was expanded by the subcategory EMG (Electromyography) Interface.
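
To make these additions easier to follow, they can be sketched as a nested Python mapping. This model is purely illustrative: the nesting reflects our reading of the description above, and only the added (sub)categories are listed, not the full taxonomy.

    # Illustrative model of the interface-taxonomy additions described above.
    # Only the added (sub)categories are listed, not the full taxonomy.
    interface_taxonomy_additions = {
        "2D UI": {"Desktop Computer UI": ["Mouse", "Keyboard"]},
        "3D UI": ["Presenter", "Controller", "Handle", "Stylus",
                  "Handheld Clicker"],
        "Natural UI": ["Hand Motions", "Face Recognition"],
        "Combinations": {
            "Multi-View UI": [],
            "Multiple Interfaces": ["Multiple Classes", "Same Class"],
        },
        "Adaptive": ["EMG (Electromyography) Interface"],
    }
    for ui_class, additions in interface_taxonomy_additions.items():
        print(f"{ui_class}: {additions}")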

Figure 3: Types of AR User Interfaces (aggregated from 2015 - 2017) based on our Taxonomy

Touch-based (25, 18.52%) and body-motion-based input modalities (24, 17.78%) were the most preferred ones (see Figure 3). Tangible User Interfaces represented only 10.37% of all interfaces. Augmented Paper and Multi-View UI did not occur in any publication.

24 (17.78%) of all interfaces represented combinations of multiple types at once. Due to issues in defining the terms Hybrid UI and Multimodal UI, we decided to replace both with the class Multiple Interfaces, subdivided into Multiple Classes and Same Class, based on our results. The subcategory Multiple Classes contains interfaces from at least two of the eight main interface classes described in section 3 (10 occurrences), whereas the subcategory Same Class contains multiple interfaces from a single class (7 occurrences). The combination of touch and body motion (7, 5.18%) occurred most often, whereas other combinations occurred only once or twice in the analysis.
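
The rule behind this subdivision can be stated compactly, as in the following sketch. It is illustrative only: the function name and the example class labels are ours, not taken from the paper.

    # Classify a combination of interfaces by the rule described above:
    # "Multiple Classes" if the interfaces span at least two of the eight
    # main interface classes, "Same Class" if they all come from one class.
    def classify_combination(main_classes):
        if len(main_classes) < 2:
            return "not a combination"
        return ("Multiple Classes" if len(set(main_classes)) >= 2
                else "Same Class")

    # Touch & body motion (the most frequent combination) spans two main
    # classes; the labels below are illustrative.
    print(classify_combination(["Touch UI", "Body Motion UI"]))  # Multiple Classes
    print(classify_combination(["3D UI", "3D UI"]))              # Same Class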