The Matrix: VR and Haptic Robotics
Shape sorting in the Matrix
In many instances humans need to interact efficiently with 3D+ data, i.e., data with three or more dimensions. Examples include the delineation of cancerous tumours in medical imaging data, 3D models in architecture and construction, data science, and remote telepresence. However, our ability to generate this type of data has largely outpaced our ability to interact naturally and efficiently with it.
This is abundantly clear within seconds of using any of today’s commercially available virtual reality devices. In these we inhabit a rich visual and auditory world, but the first thing that we want to do is to reach out and naturally interact with what we see. Currently, this interaction is limited and rudimentary at best, typically taking one of two forms:
(1) Holding a handheld controller, reaching towards an object, pressing a button on the controller to “hold” the object, moving the controller, then releasing the button to “release” the object in a different location.
(2) Using marker-less tracking to track and visualise the hands. This feels somewhat more natural, but the user is effectively interacting with thin air as objects have no weight or material properties.
This is where haptics comes in. Haptics refers to our sense of touch and the object properties we perceive through it. To investigate vision and haptics with 3D+ data, my lab has developed what we call the “Matrix”: a custom-built, one-of-a-kind virtual reality and haptic robotic system.
Snooker Player Ronnie O'Sullivan showing the importance of haptics
The Matrix comprises four custom-built 3-DOF haptic force-feedback robotic devices, six Vicon Vero 2.2 motion tracking cameras, and an Oculus Quest 2 virtual reality head-mounted display. The visual environment is simulated in Unreal Engine, and the haptic environment and physics are simulated through TOIA, a middleware plugin for Unreal Engine developed by Generic Robotics. TOIA allows high-fidelity haptic simulation, including soft-body mechanics.
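To give a flavour of what a force-feedback device does at its core, the sketch below shows a minimal penalty-based haptic rendering step: when the device tip penetrates a virtual surface, a spring force proportional to the penetration depth pushes back. This is a generic illustration, not the TOIA implementation; the function names and the stiffness value are hypothetical.

```python
# Minimal sketch of penalty-based haptic rendering (illustrative only;
# not the TOIA implementation). A virtual wall occupies z < 0, and the
# device pushes back on the tip with a spring force when it penetrates.

def wall_force(tip_z: float, stiffness: float = 800.0) -> float:
    """Return the 1-DOF feedback force (N) for a wall occupying z < 0.

    tip_z: device tip position along z, in metres.
    stiffness: virtual spring constant in N/m (hypothetical value).
    """
    penetration = -tip_z if tip_z < 0 else 0.0
    return stiffness * penetration  # Hooke's law: F = k * x


def force_pass(positions):
    """Map a series of sampled tip positions to feedback forces.

    Real haptic loops run at around 1 kHz to feel stiff and stable;
    here we simply evaluate the force at each sampled position.
    """
    return [wall_force(z) for z in positions]
```

For example, `force_pass([0.01, 0.0, -0.005])` returns `[0.0, 0.0, 4.0]`: the tip only feels a force (4 N) once it is 5 mm inside the virtual wall.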
The Matrix was built as part of our funded research with the RAIN Hub, where our focus was on simulating nuclear teleoperation procedures such as "pick up and post" and "sort and segregate". With the RAIN Hub extension, we diversified our use cases to include medical haptics, bringing together work already going on in the lab on how the provision of VR and/or haptics could help improve the delineation of tumours in medical imaging data, in conjunction with Dr Alan McWilliam's group at Christie Hospital Manchester.
A user in the Matrix
TOIA by Generic Robotics is a software tool that allows researchers and creators to build interactive experiences with haptic feedback with no prior knowledge of haptics, robotics, physics, or programming. TOIA is deeply integrated with Unreal Engine and allows users to build complex haptic content via visual scripting (Blueprints), without writing a single line of code. TOIA works with all major commercial haptic devices, allowing experiences to be easily redeployed to different haptic hardware. The lab has worked closely with Generic Robotics to integrate TOIA with the Matrix.
Soft-body mechanics in TOIA
The Matrix Team
Associate Prof. Peter Scarfe: Lead of the Vision and Haptics Laboratory, School of Psychology, University of Reading.
Dr. Alastair Barrow: CEO of Generic Robotics, honorary member of the Vision and Haptics Lab, and research collaborator.
Prof. William Harwin: Professor of Interactive and Human Systems, Biomedical Engineering, University of Reading.
Dr. Alan McWilliam: University of Manchester / Christie Hospital Manchester
Jake Tomaszewski: Research officer working on the Matrix over the summer of 2022, prior to starting an MSc at Imperial.