SABER: Spatial Attention, Brain, Extended Reality
Tom Bullock, Emily Machniak, You-Jin Kim, Radha Kumaran, Justin Kasowski, Apurv Varshney, Julia Ram, Melissa M. Hernandez, Stina Johansson, Neil M. Dundon, Tobias Höllerer, Barry Giesbrecht:
SABER: Spatial Attention, Brain, Extended Reality. In: IEEE Conference on Virtual Reality and 3D User Interfaces, IEEE VR 2026
Presented: IEEE VR 2026 - [Brain Sensing and Stimulation Session] March 25, 2026, Daegu, Korea
▪️ DOI: 10.1109/VR67842.2026.00042 ▪️ arXiv: 10.48550/arXiv.2603.24830
Abstract
Tracking moving objects is a critical skill for many everyday tasks, such as crossing a busy street, driving a car, or catching a ball. Attention is a key cognitive function that supports object tracking; however, our understanding of the brain mechanisms that support attention is almost exclusively based on evidence from tasks that present stable objects at fixed locations. Accounts of multiple object tracking are also limited because they are largely based on behavioral data alone and involve tracking objects in a 2D plane. Consequently, the neural mechanisms that enable moment-by-moment tracking of goal-relevant objects remain poorly understood. To address this knowledge gap, we developed SABER (Spatial Attention, Brain, Extended Reality), a new framework for studying the behavioral and neural dynamics of attention to objects moving in 3D. Participants (n=32) completed variants of a task inspired by the popular virtual reality (VR) game Beat Saber, where they used virtual sabers to strike stationary and moving color-defined target spheres while we recorded electroencephalography (EEG). We first established that standard univariate EEG metrics, which are typically used to study spatial attention to static objects presented on 2D screens, can generalize effectively to an immersive VR context involving both static and dynamic 3D stimuli. We then used a computational modeling approach to reconstruct moment-by-moment attention to the locations of stationary and moving objects from oscillatory brain activity, demonstrating the feasibility of precisely tracking attention in 3D space. These results validate SABER and provide a foundation for future research that is critical not only for understanding how attention works in the physical world, but is also directly relevant to the development of better VR applications.
The insights gained here can potentially inform the design of more intuitive interfaces, effective training simulations, and immersive experiences optimized for the human attention system.
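The abstract describes tracking attention from oscillatory (alpha-band) activity. As a minimal illustrative sketch, not code from the paper, the standard alpha lateralization analysis band-passes the EEG at 8–13 Hz, extracts instantaneous power via the Hilbert transform, and contrasts electrodes contralateral versus ipsilateral to the attended location. The function name, filter settings, and synthetic data below are all assumptions for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_lateralization(eeg, fs, contra_idx, ipsi_idx):
    # Illustrative sketch (not the paper's pipeline): band-pass 8-13 Hz,
    # Hilbert power per sample, then a normalized lateralization index.
    # Positive values indicate alpha suppression at contralateral sites,
    # the usual signature of a covert attention shift.
    b, a = butter(4, [8, 13], btype="bandpass", fs=fs)
    power = np.abs(hilbert(filtfilt(b, a, eeg, axis=-1), axis=-1)) ** 2
    contra = power[contra_idx].mean(axis=0)
    ipsi = power[ipsi_idx].mean(axis=0)
    return (ipsi - contra) / (ipsi + contra)

# Synthetic check: the ipsilateral electrode carries stronger 10 Hz alpha,
# as expected when attention is directed to the contralateral hemifield.
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(1)
contra_ch = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
ipsi_ch = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
eeg = np.stack([contra_ch, ipsi_ch])
li = alpha_lateralization(eeg, fs, contra_idx=[0], ipsi_idx=[1])
```

The index is computed per sample, so it supports the moment-by-moment analyses the abstract describes; in practice it would be averaged over trials and baseline-corrected.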
Research Contributions
Univariate EEG metrics used for 2D spatial attention generalize effectively to immersive, dynamic 3D virtual reality environments.
Alpha oscillations (8–13 Hz) accurately track the moment-by-moment 3D location and trajectory of moving, goal-relevant objects.
Inverted Encoding Models successfully reconstruct spatially selective neural responses to objects moving through expansive 3D virtual space.
Attentional orienting and alpha lateralization are significantly delayed when targets are surrounded by distractors versus appearing alone.
Looming distractors capture attention near interception points, reducing the spatial selectivity of the brain’s representation of the target.
EEG data remain robust against noise artifacts in physically active tasks, including motion and sweating, while participants are standing.
The framework demonstrates that the brain’s alpha-band activity continuously updates priority maps during naturalistic attentional pursuit in 3D space.
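The inverted encoding model (IEM) mentioned above can be sketched in a few lines. This is a generic illustration of the technique, not the paper's implementation: spatial "channels" with raised-cosine tuning tile the attended locations, electrode weights are estimated by least squares on training trials, and the model is inverted on held-out data to reconstruct a spatially selective channel response. All dimensions, the basis shape, and the synthetic data are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_chan, n_elec, n_trials = 8, 20, 160

# Raised-cosine spatial channels tiling 360 degrees (illustrative basis)
centers = np.linspace(0, 2 * np.pi, n_chan, endpoint=False)

def channel_responses(theta):
    # Idealized tuning of each channel to attended angle(s) theta
    d = theta[None, :] - centers[:, None]
    return np.maximum(np.cos(d / 2.0), 0.0) ** (n_chan - 1)

# Simulated training set: each trial attends one of 8 locations;
# electrode "alpha power" is a noisy linear mix of channel responses
theta_train = centers[rng.integers(0, n_chan, n_trials)]
C_train = channel_responses(theta_train)                      # (n_chan, n_trials)
W_true = rng.normal(size=(n_elec, n_chan))
B_train = W_true @ C_train + 0.05 * rng.standard_normal((n_elec, n_trials))

# Train: solve B = W C for the electrode weight matrix W
W = np.linalg.lstsq(C_train.T, B_train.T, rcond=None)[0].T    # (n_elec, n_chan)

# Invert on a held-out trial attending location 3: recover channel responses
B_test = W_true @ channel_responses(centers[3:4])
C_hat = np.linalg.lstsq(W, B_test, rcond=None)[0]
peak = int(np.argmax(C_hat[:, 0]))
```

Applied to a sliding window of alpha power, the reconstructed channel response peak provides the moment-by-moment estimate of the attended 3D location that the framework reports.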
Citation IEEE Format
[1] T. Bullock, E. Machniak, Y.-J. Kim, R. Kumaran, J. Kasowski, A. Varshney, J. Ram, M. M. Hernandez, S. Johansson, N. M. Dundon, T. Höllerer, and B. Giesbrecht, "SABER: Spatial Attention, Brain, Extended Reality," in Proc. 2026 IEEE Conf. Virtual Reality 3D User Interfaces (VR), 2026, pp. 195–205. doi: 10.1109/VR67842.2026.00042.
Citation APA Format
Bullock, T., Machniak, E., Kim, Y.-J., Kumaran, R., Kasowski, J., Varshney, A., Ram, J., Hernandez, M. M., Johansson, S., Dundon, N. M., Höllerer, T., & Giesbrecht, B. (2026). SABER: Spatial Attention, Brain, Extended Reality. 2026 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 195–205. https://doi.org/10.1109/VR67842.2026.00042
BibTeX
@INPROCEEDINGS{11457531,
author={Bullock, Tom and Machniak, Emily and Kim, You-Jin and Kumaran, Radha and Kasowski, Justin and Varshney, Apurv and Ram, Julia and Hernandez, Melissa M. and Johansson, Stina and Dundon, Neil M. and Höllerer, Tobias and Giesbrecht, Barry},
booktitle={2026 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)},
title={SABER: Spatial Attention, Brain, Extended Reality},
year={2026},
volume={},
number={},
pages={195-205},
keywords={Virtual Reality;VR;Attention;EEG;Object Tracking;Computational Models;Brain Oscillations},
doi={10.1109/VR67842.2026.00042}}