We present ChameleonControl, a real-human teleoperation system for scalable remote instruction in hands-on classrooms. In contrast to existing video- or AR/VR-based remote hands-on education, ChameleonControl uses a real human as a surrogate for the remote instructor. Building on existing human-based telepresence approaches, we contribute a novel method to teleoperate a human surrogate through synchronized mixed reality (MR) hand gestural navigation and verbal communication. By overlaying the remote instructor's virtual hands in the local user's MR view, the remote instructor can guide and control the local user as if they were physically present. This allows the local user/surrogate to synchronize their hand movements and gestures with the remote instructor, effectively teleoperating a real human. We deploy and evaluate our system in physiotherapy training classrooms, as well as in other application domains such as mechanical assembly, sign language, and cooking lessons. The study results confirm that our approach can increase engagement and the sense of co-presence, showing potential for the future of remote hands-on classrooms.
Mehrad Faridan, Bheesha Kumari, and Ryo Suzuki. 2023. ChameleonControl: Teleoperating Real Human Surrogates through Mixed Reality Gestural Guidance for Remote Hands-on Classrooms. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI '23). ACM, New York, NY, USA.
DOI: https://doi.org/10.1145/3544548.3581449