Abstract
We present a pseudo-haptic technique that modulates perceived weight in full-body virtual reality by jointly scaling virtual-hand motion and the user's viewpoint. Unlike prior work that manipulates the hand alone, our method leverages visual self-motion cues, extending pseudo-haptics beyond seated interactions. A threshold study isolated perceptually salient gains, and a within-subjects lifting experiment ($N=23$) crossed the two factors. Both manipulations significantly modulated perceived weight ratings ($p<.001$), and their effects combined additively. However, presence ratings declined specifically under conditions designed to produce strong lightness illusions, namely when the virtual hand moved more than its real-world counterpart while the virtual viewpoint moved less ($p_{\text{Holm}}<.05$). In contrast, when the virtual hand moved less than or matched the real hand's motion, manipulating the virtual viewpoint had little impact, and presence remained stable. Hand scaling thus serves as a low-cost primary cue, and viewpoint manipulation as a complementary channel, for modulating perceived weight without sacrificing presence. The technique provides actionable guidance for VR training, rehabilitation, and exergames that demand convincing sensations of physical effort.
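To make the manipulation concrete, the sketch below shows one plausible form of the joint gain scaling the abstract describes: per-frame hand and head displacements from a lift-onset pose, each scaled by a control-display gain. The function name, anchor scheme, and gain values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def apply_gains(real_hand, hand_origin, real_head, head_origin,
                hand_gain=1.3, view_gain=0.8):
    """Scale tracked motion about each pose at lift onset.

    hand_gain > 1 makes the virtual hand travel farther than the real hand
    (the lifted object tends to feel lighter); view_gain < 1 attenuates
    visual self-motion. Gain values here are placeholders, not the study's.
    """
    virtual_hand = hand_origin + hand_gain * (real_hand - hand_origin)
    virtual_head = head_origin + view_gain * (real_head - head_origin)
    return virtual_hand, virtual_head

# Example: the real hand rises 10 cm; with hand_gain=1.3 the virtual hand
# rises 13 cm, while a 5 cm head rise is rendered as 4 cm of viewpoint motion.
hand0 = np.array([0.0, 1.0, 0.3])
head0 = np.array([0.0, 1.7, 0.0])
v_hand, v_head = apply_gains(hand0 + np.array([0.0, 0.10, 0.0]), hand0,
                             head0 + np.array([0.0, 0.05, 0.0]), head0)
```

Under this reading, the presence result suggests keeping the two gains from diverging too far in opposite directions (hand amplified while the viewpoint is damped), since that combination produced the strongest lightness illusion but also the presence cost.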