In collaborative tasks, it is often important for users to understand their collaborator’s gaze direction or gaze target. Using an augmented reality (AR) display, a ray representing the collaborator’s gaze can convey such information. In wide-area AR, however, a simplistic virtual ray may be ambiguous at large distances, due to the lack of occlusion cues when a model of the environment is unavailable. We describe two novel visualization techniques designed to improve gaze ray effectiveness by facilitating visual matching between rays and targets and by providing spatial cues that help users understand ray orientation.

With Double Ray, a distant collaborator casts two rays simultaneously, each aimed at a different geometric feature of the target object. The insight is that requiring multiple rays to align with distinct parts of the same target makes the visual matching condition stricter than in the single-ray case.
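The underlying geometry can be sketched as follows in a minimal Python/NumPy illustration; the function names, the tolerance threshold, and the point-to-ray matching test are our own assumptions, not the paper's implementation:

```python
import numpy as np

def make_gaze_rays(eye_pos, feature_a, feature_b):
    """Build the two Double Ray rays: one from the collaborator's eye
    toward each of two distinct geometric features on the target."""
    origin = np.asarray(eye_pos, dtype=float)
    rays = []
    for feature in (feature_a, feature_b):
        direction = np.asarray(feature, dtype=float) - origin
        rays.append((origin, direction / np.linalg.norm(direction)))
    return rays

def matches_target(rays, feature_points, tol=0.05):
    """A candidate object is a plausible target only if every ray passes
    within tol of one of its feature points -- the stricter matching
    condition Double Ray imposes compared with a single ray."""
    def dist_to_ray(origin, direction, point):
        v = np.asarray(point, dtype=float) - origin
        t = max(np.dot(v, direction), 0.0)        # clamp points behind the ray origin
        return np.linalg.norm(v - t * direction)  # perpendicular distance to the ray
    return all(
        any(dist_to_ray(o, d, p) < tol for p in feature_points)
        for (o, d) in rays
    )

# Example: two rays from eye height toward two features of a distant object.
rays = make_gaze_rays([0, 1.7, 0], [5, 2, 10], [5, 1, 10])
print(matches_target(rays, [[5, 2, 10], [5, 1, 10]]))  # True: both rays match
```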

In the Parallel Bars technique, we generate multiple virtual bars, all parallel to the gaze ray and derived from its orientation. In this way, the user can both directly perceive the orientation of the gaze ray and estimate the distance to it.
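A minimal sketch of how such bars might be generated, again in Python/NumPy; the circular arrangement and the bar count, radius, and length parameters are illustrative assumptions rather than values from the paper:

```python
import numpy as np

def parallel_bars(origin, direction, n_bars=4, radius=0.3, length=10.0):
    """Place n_bars line segments parallel to the gaze ray, arranged on a
    circle of the given radius around it, so the ray's orientation can be
    read from the bars and distances to the ray judged against them."""
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    # Build an orthonormal basis (u, v) in the plane perpendicular to the ray.
    helper = np.array([0.0, 1.0, 0.0]) if abs(d[1]) < 0.9 else np.array([1.0, 0.0, 0.0])
    u = np.cross(d, helper)
    u /= np.linalg.norm(u)
    v = np.cross(d, u)
    bars = []
    for k in range(n_bars):
        angle = 2.0 * np.pi * k / n_bars
        offset = radius * (np.cos(angle) * u + np.sin(angle) * v)
        start = o + offset
        bars.append((start, start + length * d))  # each segment is parallel to the ray
    return bars
```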

Publication:

  • Li, Y., Lu, F., Lages, W. S., & Bowman, D. (2019, October). Gaze Direction Visualization Techniques for Collaborative Wide-Area Model-Free Augmented Reality. In Symposium on Spatial User Interaction (p. 11). ACM. https://doi.org/10.1145/3357251.3357583

Info:

  • Team: Yuan Li, Feiyu Lu, Wallace Lages, and Doug Bowman
  • Sponsored by: Office of Naval Research