University of Oulu

M. Suomalainen et al., "Unwinding Rotations Improves User Comfort with Immersive Telepresence Robots," 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Sapporo, Japan, 2022, pp. 511-520, doi: 10.1109/HRI53351.2022.9889388

Unwinding rotations improves user comfort with immersive telepresence robots

Author: Suomalainen, Markku1; Sakcak, Basak1; Widagdo, Adhi1;
Organizations: 1Faculty of Information Technology and Electrical Engineering, Center of Ubiquitous Computing, University of Oulu, Oulu, Finland
Format: article
Version: accepted version
Access: open
Online Access: PDF Full Text (PDF, 4.4 MB)
Language: English
Published: Institute of Electrical and Electronics Engineers, 2022
Publish Date: 2023-03-23


We propose unwinding the rotations experienced by the user of an immersive telepresence robot to improve comfort and reduce the user's VR sickness. By immersive telepresence we refer to a situation in which a 360° camera on top of a mobile robot streams video and audio into a head-mounted display worn by a remote user, possibly far away. The setup thus enables the user to be present at the robot's location, look around by turning their head, and communicate with people near the robot. By unwinding the rotations of the camera frame, the user's viewpoint is not changed when the robot rotates; the user can change her viewpoint only by physically rotating in her local setting. As visual rotation without the corresponding vestibular stimulation is a major source of VR sickness, physical rotation by the user is expected to reduce VR sickness. We implemented unwinding of the rotations for a simulated robot traversing a virtual environment and ran a user study (N=34) comparing unwound rotations to a condition in which the user's viewpoint turns when the robot turns. Our results show that users found unwound rotations preferable and more comfortable, and that unwinding reduced their level of VR sickness. We also present further results on the users' path-integration capabilities, viewing directions, and subjective observations of the robot's speed and of distances to simulated people and objects.
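The core of the unwinding idea can be sketched in a few lines: the viewport into the 360° video is counter-rotated by the robot's yaw, so the robot's rotation cancels out and only the user's own head rotation moves the view. The function below is a minimal, hypothetical illustration of that cancellation (yaw-only, in Python), not the authors' implementation.

```python
import math


def wrap(angle: float) -> float:
    """Normalize an angle to (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))


def unwound_view_yaw(robot_yaw: float, hmd_yaw: float) -> float:
    """Yaw of the viewport into the 360-degree video sphere.

    Without unwinding, the rendered view would face robot_yaw + hmd_yaw,
    so every robot turn visually rotates the user. Unwinding applies the
    inverse of the robot's yaw to the camera frame, so the robot's
    rotation cancels and only the user's physical head rotation remains.
    """
    naive_view = robot_yaw + hmd_yaw   # view if rotations were passed through
    compensation = -robot_yaw          # inverse rotation applied to the frame
    return wrap(naive_view + compensation)
```

Under this sketch, `unwound_view_yaw(math.pi / 2, 0.3)` returns `0.3`: the robot's 90° turn is invisible to the user, whose 0.3 rad head turn alone sets the viewpoint.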


Series: ACM/IEEE International Conference on Human-Robot Interaction
ISSN: 2167-2121
ISSN-E: 2167-2148
ISSN-L: 2167-2148
ISBN: 978-1-6654-0731-1
ISBN Print: 978-1-6654-0732-8
Pages: 511 - 520
DOI: 10.1109/HRI53351.2022.9889388
Host publication: 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI)
Conference: ACM/IEEE International Conference on Human-Robot Interaction
Type of Publication: A4 Article in conference proceedings
Field of Science: 213 Electronic, automation and communications engineering, electronics
Funding: This work was supported by Business Finland (project HUMOR 3656/31/2019); Academy of Finland (projects PERCEPT 322637, CHiMP 342556); and the European Research Council (project ILLUSIVE 101020977)
EU Grant Number: (101020977) ILLUSIVE - Foundations of Perception Engineering
Academy of Finland Grant Number: 322637
Detailed Information: 322637 (Academy of Finland Funding decision)
342556 (Academy of Finland Funding decision)
Copyright information: © 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.