
Vision and Haptics Laboratory: Matrix Device

The Challenge

Teleoperation entails a robotic device being controlled at a distance by a human operator, for example because the environment is remote and/or hazardous. Most modern teleoperation setups, whilst "state-of-the-art" in terms of technology, remain difficult and unintuitive to use, requiring a large amount of operator training. Even with extensive training, some operators are simply unable to master their use. Additionally, even when "mastered" by an operator, teleoperation procedures remain slow and laborious. All of this is in stark contrast with the ease and dexterity of our everyday movements and interactions with the world. The result is a dramatic limitation of the promise of remote telepresence and, at the same time, a vast increase in the cost of using current systems.

The Research Team

The project PIs are Assoc. Prof. Peter Scarfe (School of Psychology, University of Reading) and Prof. William Harwin (School of Biological Sciences, University of Reading).

Dr Julie Skevik is a Postdoctoral Researcher on the project and previously did their PhD on medical haptics in the lab. Jake Tomaszewski is a Research Officer on the project and graduated with First Class Honours in Biomedical Engineering from the University of Reading.

The team based at Reading are part of the wider Robotics and AI in Nuclear (RAIN) Hub. For more information on our team members see our people page.

Our Research

With a human in the control loop, it becomes essential to understand how sensory input to the operator is used to produce goal-directed movements with the telepresence device. Critically, the design of current telepresence systems rarely takes human sensory perception into consideration in the design of hardware and/or software.

For example: (1) the kinematics of the robot being controlled differ substantially from those of the operator, resulting in transforms between the visual and haptic / proprioceptive workspaces, and (2) the operator receives multiple disparate video feeds of the robot's movement, with no stereoscopic depth information.

The aim of our research in the RAIN Hub is to examine the effects of transforms between the visual and haptic / proprioceptive workspaces on the performance of tasks in a simulated telepresence environment. 
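As a rough illustration of what such a transform can look like in practice, the sketch below rotates the operator's tracked hand position about the centre of the workspace before it is rendered, much as in classic visuomotor-rotation paradigms. The function names, angle and positions are purely hypothetical; this is not code from our experiments.

```python
# Illustrative sketch only: applying a rotation between the haptic /
# proprioceptive workspace (where the hand really is) and the visual
# workspace (where it is rendered). All names and values are hypothetical.
import numpy as np

def rotation_about_z(angle_deg):
    """3x3 rotation matrix about the vertical (z) axis."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def visual_from_haptic(hand_pos_m, workspace_centre_m, angle_deg=30.0):
    """Map a tracked hand position (metres) to its rendered position,
    rotated about the workspace centre by angle_deg."""
    R = rotation_about_z(angle_deg)
    return workspace_centre_m + R @ (hand_pos_m - workspace_centre_m)

# Example: a 30 degree offset between where the hand is and where it is drawn.
hand = np.array([0.10, 0.05, 0.00])    # tracked fingertip position, metres
centre = np.array([0.00, 0.00, 0.00])  # centre of the shared workspace
print(visual_from_haptic(hand, centre))
```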

The Matrix Device

To achieve these aims we have built what we call the Matrix Device. The Matrix Device is a one-of-a-kind immersive virtual reality simulation system, which can simulate vision, audition and touch. The system comprises six Vicon Vero 2.2 motion tracking cameras, four custom-built haptic devices (3-DOF force feedback and tracking) and an Oculus Rift S virtual reality head mounted display (VRHMD).

In order to study the effects of transforms between the visual and haptic / proprioceptive workspaces on task performance, it is essential that we can accurately align the two workspaces. To achieve this we use the Vicon motion tracking system, together with custom in-house code, to align the visual and haptic workspaces to millimetre precision.
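For readers curious about what this kind of calibration involves, the sketch below recovers the rigid rotation and translation between two tracked coordinate frames from corresponding marker positions using the standard Kabsch algorithm. It is a minimal illustration under assumed variable names, not the in-house calibration code itself.

```python
# Minimal sketch: rigid (rotation + translation) alignment between two
# coordinate frames from corresponding 3D points, via the Kabsch algorithm.
# Illustrative only; variable names are ours, not the lab's.
import numpy as np

def rigid_align(points_a, points_b):
    """Return R, t such that R @ a + t approximates b for corresponding rows
    of points_a (e.g. haptic-frame markers) and points_b (e.g. Vicon frame)."""
    a_centroid = points_a.mean(axis=0)
    b_centroid = points_b.mean(axis=0)
    A = points_a - a_centroid
    B = points_b - b_centroid
    U, _, Vt = np.linalg.svd(A.T @ B)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = b_centroid - R @ a_centroid
    return R, t

def alignment_error(points_a, points_b, R, t):
    """Per-point residuals (metres) after alignment."""
    return np.linalg.norm((points_a @ R.T + t) - points_b, axis=1)
```

The residuals returned by a check like alignment_error give a direct measure of whether the co-alignment of the two workspaces is at the millimetre level.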

Matrix Device Details

The Matrix Device is composed of four custom-made 3-DOF haptic robotic force feedback devices. These provide powerful, high-fidelity bimanual force feedback to the thumb and finger of each hand. Virtual reality visuals are provided by an Oculus Rift S head mounted display. The visual and haptic workspaces are spatially co-aligned to millimetre precision using six Vicon Vero 2.2 motion tracking cameras. Click here to download the flyer.

Matrix Device Teaser Video

Use Cases and Future

The Matrix Device was built as part of our funded research with the RAIN Hub. Our focus has been on simulating nuclear teleoperation procedures, such as "pick up and post" and "sort and segregate". With the RAIN Hub extension, we diversified our use cases to include medical haptics. This brings together work already going on in the lab on how the provision of VR and/or haptics could help improve the delineation of tumours in medical imaging data, in conjunction with Dr Alan McWilliams' group at Christie Hospital, Manchester.

We are currently in the process of integrating the Matrix Device with the TOIA software system from Generic Robotics. This will allow us to upgrade both our visual and haptic simulation fidelity, including the simulation of soft-body physics. It will also allow tasks to be coded more quickly and easily.

A PhD studentship will be starting in the autumn to extend the work we have been doing as part of RAIN. This is co-funded by SeNSS and EUROfusion.

This research is funded by
