A tutorial: Analyzing eye and head movements in virtual reality.

  • Additional Information
    • Source:
      Publisher: Springer
      Country of Publication: United States
      NLM ID: 101244316
      Publication Model: Print-Electronic
      Cited Medium: Internet
      ISSN: 1554-3528 (Electronic)
      Linking ISSN: 1554-351X
      NLM ISO Abbreviation: Behav Res Methods
      Subsets: MEDLINE
    • Publication Information:
      Publication: 2010- : New York : Springer
      Original Publication: Austin, Tex. : Psychonomic Society, c2005-
    • Abstract:
      This tutorial provides instruction on how to use the eye-tracking technology built into virtual reality (VR) headsets, emphasizing the analysis of head and eye movement data when an observer is situated at the center of an omnidirectional environment. We begin with a brief description of how VR eye movement research differs from previous forms of eye movement research, and we identify some outstanding gaps in the current literature. We then introduce the basic methodology used to collect VR eye movement data, both in general and with regard to the specific data we collected to illustrate different analytical approaches. We continue with an introduction to the foundational ideas of data analysis in VR, including frames of reference, how to map eye and head position, and event detection. Next, we introduce core head and eye data analyses that focus on determining where the head and eyes are directed. We then expand on what has been presented, introducing several novel spatial, spatio-temporal, and temporal head-eye data analysis techniques. We conclude with a reflection on what has been presented and on how the techniques introduced in this tutorial provide the scaffolding for extensions to more complex and dynamic VR environments.
      (© 2024. The Psychonomic Society, Inc.)
    • Grant Information:
      AK: RGPIN-2022-03079, Natural Sciences and Engineering Research Council of Canada; NCA: Postdoctoral Fellowship, Natural Sciences and Engineering Research Council of Canada
    • Contributed Indexing:
      Keywords: Eye movements; Head movements; Head–eyes relationship; Spatial analysis; Temporal analysis; Virtual reality
    • Publication Date:
      Date Created: 2024-08-08; Date Completed: 2024-10-31; Latest Revision: 2024-10-31
    • DOI:
      10.3758/s13428-024-02482-5
    • Accession Number:
      39117987