Are You Empathizing with Me? Exploring External Expressions of Empathy in Interpersonal VR Communication

Published in ISMAR 2025 (#33 of 229 papers)

Abstract

Empathy is central to social interaction, yet how it is externally expressed in virtual reality (VR) communication remains underexplored. In this study, we examined how directionality-aware cues of empathy, such as mimicry, eye contact, and body proximity, relate to cognitive and emotional empathy. We designed high- and low-empathy scenarios and recruited participants with acting experience to ensure clear emotional expressions. Our findings indicate that facial mimicry patterns differ by empathy type: cognitive empathy involves subtle, speech-related muscle movements, whereas emotional empathy is associated with more intense affective expressions. Interestingly, we also found that while facial expressions and lower-body mimicry tend to emerge unconsciously, upper-body mimicry occurs more consciously, suggesting distinct pathways of empathic embodiment. We also observed that vocal intensity mimicry and pitch variability serve as important indicators of empathy, and a consistent hand approach is closely linked to empathy. Additionally, emotional empathy fosters longer eye contact, whereas cognitive empathy stabilizes gaze and head movements. Finally, we constructed machine learning models to predict empathy from these external expressions. Our best classifier achieved an accuracy of 0.756 for cognitive empathy and 0.704 for emotional empathy, indicating the feasibility of objective assessment. These findings provide a deeper understanding of how empathy is manifested in VR communication and support the development of empathy-aware virtual agents and training systems.
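The abstract reports classifiers that predict cognitive and emotional empathy from external cues (mimicry, gaze, vocal features, hand approach). The paper's actual features, model family, and evaluation protocol are not specified here, so the following is only a minimal sketch of that kind of pipeline, using synthetic data, assumed feature names, and a random-forest classifier chosen for illustration.

```python
# Illustrative sketch (NOT the paper's pipeline): classify high- vs
# low-empathy interaction segments from hypothetical behavioral cues.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # hypothetical number of interaction segments

# Assumed cue features: facial mimicry, vocal-intensity mimicry,
# pitch variability, hand-approach consistency, eye-contact duration.
X = rng.normal(size=(n, 5))
# Synthetic binary high/low-empathy labels, weakly tied to the cues
# purely so the classifier has signal to learn.
y = (X[:, 0] + 0.5 * X[:, 4] + rng.normal(scale=0.8, size=n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.3f}")
```

Cross-validated accuracy on held-out segments, as sketched here, is the kind of metric behind the reported 0.756 (cognitive) and 0.704 (emotional) figures.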
