When Senses Collide: Investigating Modality Congruence and Interference Between Task and Notification in Augmented Reality
Abstract
As augmented reality (AR) technologies become more integrated into everyday tasks, designing notification systems that minimize disruption while enhancing user awareness is increasingly important. This study investigates how crossmodal interference between notification modality (visual, auditory, tactile) and primary task modality (likewise visual, auditory, or tactile) affects user perception and performance in AR environments. In a controlled user study ($N=36$), participants performed modality-specific pattern recognition tasks while responding to spatial directional notifications delivered via different sensory channels. By analyzing notification awareness time, reaction time, task accuracy, and subjective measures such as cognitive load and user preference, the study reveals how congruence or mismatch between task and notification modalities can either facilitate or hinder attention redirection and multitasking efficiency. The findings inform the design of adaptive AR notification systems that are less intrusive and better aligned with users' perceptual and cognitive states.