A Virtual Reality Training Environment for Myoelectric Prosthesis Grasp Control with Sensory Feedback

Abstract

Upper limb myoelectric prosthesis control is difficult to learn. Virtual reality has seen increased deployment in recent years for prosthesis training because it is repeatable, engaging, and deployable in the home. Most virtual reality prosthesis simulators, however, do not challenge grasp function. This paper presents the Virtual Prosthesis Emulator (ViPEr), a virtual reality environment for prosthesis grasp control with sensory feedback. For sensory feedback in ViPEr, we have derived data-driven transfer functions that best approximate the force applied by a physical prosthesis and integrated them into a sensory feedback system. This system allows us to relay the interaction force using mechanotactile tactors and to recreate realistic interactions, including objects’ specific lift, crush, and deformation characteristics. We will use ViPEr in an upcoming study to evaluate skill transfer to physical prosthesis performance and the effect of providing sensory feedback in virtual reality training.
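To make the idea of a data-driven transfer function concrete, the sketch below shows one simple way such a mapping could be built: a piecewise-linear lookup from a prosthesis closure command to an estimated grip force, which is then normalized to drive a tactor. This is an illustrative assumption only, not the method used in ViPEr; the calibration values, function names, and normalization scheme are all hypothetical.

```python
import bisect

# Hypothetical calibration table: prosthesis closure command (0-1) mapped to
# measured grip force in newtons. These numbers are illustrative only.
COMMANDS = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
FORCES_N = [0.0, 1.1, 3.0, 6.2, 10.5, 16.0]

def estimated_force(cmd: float) -> float:
    """Piecewise-linear transfer function: closure command -> force (N)."""
    cmd = min(max(cmd, COMMANDS[0]), COMMANDS[-1])
    i = bisect.bisect_right(COMMANDS, cmd)
    if i >= len(COMMANDS):
        return FORCES_N[-1]
    x0, x1 = COMMANDS[i - 1], COMMANDS[i]
    y0, y1 = FORCES_N[i - 1], FORCES_N[i]
    return y0 + (y1 - y0) * (cmd - x0) / (x1 - x0)

def tactor_amplitude(cmd: float, max_force: float = 16.0) -> float:
    """Normalize the estimated force to a [0, 1] tactor drive level."""
    return estimated_force(cmd) / max_force
```

In practice, a transfer function fit from real prosthesis force measurements could replace the lookup table while keeping the same command-to-tactor interface.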

Publication
Myoelectric Controls Symposium (MEC24), 12–15 August 2024, Fredericton, NB, Canada, pp. 1–4
Patrick M. Pilarski
Ph.D., ICD.D, Canada CIFAR AI Chair & Professor of Medicine

BLINC Lab, University of Alberta.