J. Edward Swan II

Spatial Consistency Perception in Optical and Video See-Through Head-Mounted Augmentations

Alexander Plopski, Kenneth R. Moser, Kiyoshi Kiyokawa, J. Edward Swan II, and Haruo Takemura. Spatial Consistency Perception in Optical and Video See-Through Head-Mounted Augmentations. In Poster Abstracts, IEEE International Conference on Virtual Reality (IEEE VR 2016), pp. 265–266, Mar 2016. DOI: 10.1109/VR.2016.7504755.

Download

[PDF] 

Abstract

Correct spatial alignment is an essential requirement for convincing augmented reality experiences. Registration error, caused by a variety of systematic, environmental, and user influences, decreases the realism and utility of head-mounted display AR applications. Focus is often given to rigorous calibration and prediction methods seeking to entirely remove misalignment error between virtual and real content. Unfortunately, producing perfect registration is often simply not possible. Our goal is to quantify the sensitivity of users to registration error in these systems and to identify acceptability thresholds at which users can no longer distinguish between the spatial positioning of virtual and real objects. We simulate both video see-through and optical see-through environments using a projector system and experimentally measure user perception of virtual content misalignment. Our results indicate that users are less sensitive to rotational errors overall and that translational accuracy is less important in optical see-through systems than in video see-through systems.
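
To make the notion of an acceptability threshold concrete, the sketch below shows one standard psychophysical way such a threshold could be estimated: fitting a logistic psychometric function to yes/no "misaligned" responses and reading off the 75% detection point. This is an illustrative example only, not the authors' procedure; the offset levels, response proportions, and 75% criterion are hypothetical.

# Illustrative sketch (not from the paper): estimate a misalignment
# detection threshold by fitting a logistic psychometric function to
# hypothetical yes/no responses.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, alpha, beta):
    # Logistic function: probability of reporting a misalignment of size x.
    # alpha is the 50% point (location), beta the slope parameter.
    return 1.0 / (1.0 + np.exp(-(x - alpha) / beta))

# Hypothetical stimulus levels (translational offset in mm) and the
# proportion of "misaligned" responses observed at each level.
offsets_mm = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])
p_detected = np.array([0.05, 0.10, 0.30, 0.60, 0.90, 1.00])

# Fit the two-parameter logistic to the observed proportions.
(alpha, beta), _ = curve_fit(psychometric, offsets_mm, p_detected, p0=[4.0, 2.0])

# Define the threshold as the offset detected on 75% of trials.
threshold_75 = alpha + beta * np.log(0.75 / 0.25)
print(f"50% point: {alpha:.2f} mm, 75% threshold: {threshold_75:.2f} mm")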

BibTeX

@InProceedings{IEEEVR16-scp, 
  author =      {Alexander Plopski and Kenneth R. Moser and Kiyoshi Kiyokawa and 
                 J. Edward {Swan~II} and Haruo Takemura},
  title =       {Spatial Consistency Perception in Optical and Video
                 See-Through Head-Mounted Augmentations}, 
  booktitle =   {Poster Abstracts, IEEE International Conference on Virtual Reality 
                 (IEEE VR 2016)}, 
  location =    {Greenville, South Carolina, USA}, 
  date =        {March 19--23}, 
  month =       {Mar}, 
  year =        2016, 
  pages =       {265--266}, 
  note =        {DOI: 10.1109/VR.2016.7504755.}, 
  abstract =    { 
Correct spatial alignment is an essential requirement for convincing
augmented reality experiences. Registration error, caused by a variety
of systematic, environmental, and user influences, decreases the
realism and utility of head-mounted display AR applications. Focus is
often given to rigorous calibration and prediction methods seeking to
entirely remove misalignment error between virtual and real
content. Unfortunately, producing perfect registration is often simply
not possible. Our goal is to quantify the sensitivity of users to
registration error in these systems and to identify acceptability
thresholds at which users can no longer distinguish between the
spatial positioning of virtual and real objects. We simulate both
video see-through and optical see-through environments using a
projector system and experimentally measure user perception of virtual
content misalignment. Our results indicate that users are less
sensitive to rotational errors overall and that translational
accuracy is less important in optical see-through systems than in
video see-through systems.
}, 
}