Research Roundup (#14)
Welcome...
Welcome to the fourteenth-ever Research Roundup! A weekly catch-up on the latest developments in the field of XR research.
It’s been a busy week, so let's not hang around.
The Week in 3 (Sentences)
New research on social interaction in VR explored the use of live captioning in social VR with SpeechCaps, RemoteChess was found to increase social connectedness among older adults in China, and haptic gloves were used to create real-time touch between physically separated users.
Research on adoption and real-world usage looked at AR laptop interaction patterns highlighting task demands and hybrid workflows, whilst researchers examined barriers to XR healthcare implementation including technical, regulatory, and institutional hurdles.
And finally, user interaction systems were in the spotlight again, with new methods to improve redirected walking in shared virtual spaces, and voice-controlled AR puppeteering demonstrating the potential for safer, more intuitive robot operation through virtual manipulation, which is just very cool.
The Week in 300 (Words)
One of the great potentials of XR is to be able to connect people across remote locations. Whilst most social VR focuses on connection through chatting, researchers in Hong Kong have been looking at connecting older adults in China through playing RemoteChess. To support the users, they created a virtual world with two parts: a virtual chess area resembling a park, where users could play Chinese chess, and a sandbox area for them to explore together. Users felt a strong sense of social connectedness and appreciated having the choice between the two forms of interaction. Always nice to bring people together.

A big question for developers is: what exactly are the barriers to adoption? This week Morgan, Patel and Campbell highlighted a range of factors affecting adoption in healthcare, and the list isn't short. It includes factors that you'd expect elsewhere, such as user experience, cost-benefit analyses, maintenance costs, and the training of relevant staff. It also includes healthcare-specific factors such as the regulatory burden and ethical concerns. Perhaps the most fascinating is the challenge of creating products for a device that is also available to consumers. They suggest that consumers may simply opt for cheaper, lower-grade, unvalidated ‘equivalent’ experiences, undermining the efforts of any company that has jumped through the many regulatory hoops. And they may well have a point.
And finally, we love a good paper about redirected walking, and this week we were again treated to a good paper about redirected walking. This time the authors took on the challenges of redirected walking in multi-user environments. Whilst you can redirect multiple users to walk within confined spaces without bumping into each other, it is harder to control which direction they end up facing, which becomes particularly tricky when you expect those users to interact with each other. The solution? They showcased a novel approach that relies on artificial vector potentials and simulated walking pathways. As with all innovations this is still early doors (more research is needed, etc.), but it is intriguing to say the least.
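We haven't reproduced the paper's own formulation here, but for a rough feel of the general idea, here is a minimal Python sketch (all names, weights and thresholds are ours, purely illustrative): repulsive potentials around the physical walls and other users define a steering direction, and a small, capped rotation offset nudges each user toward it.

```python
import numpy as np

# Illustrative sketch only: a generic artificial-potential-field steering
# target for redirected walking, not the authors' actual method.

ROOM_SIZE = np.array([5.0, 5.0])   # physical tracked space in metres (hypothetical)
WALL_WEIGHT = 1.0                  # strength of repulsion from walls
USER_WEIGHT = 2.0                  # strength of repulsion from other users
MAX_ROTATION_OFFSET_DEG = 1.5      # cap on injected rotation per step (degrees)

def steering_force(position, other_positions):
    """Repulsive 'force' (negative potential gradient) at `position`."""
    force = np.zeros(2)
    # Push away from each wall, more strongly the closer the user gets.
    for axis in range(2):
        d_low = max(position[axis], 1e-3)
        d_high = max(ROOM_SIZE[axis] - position[axis], 1e-3)
        force[axis] += WALL_WEIGHT * (1.0 / d_low**2 - 1.0 / d_high**2)
    # Push away from every other tracked user.
    for other in other_positions:
        diff = position - other
        dist = max(np.linalg.norm(diff), 1e-3)
        force += USER_WEIGHT * diff / dist**3
    return force

def redirection_offset(heading_deg, position, other_positions):
    """Small rotation (degrees) steering the user toward the repulsive force direction."""
    force = steering_force(position, other_positions)
    target_deg = np.degrees(np.arctan2(force[1], force[0]))
    error = (target_deg - heading_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return float(np.clip(error, -MAX_ROTATION_OFFSET_DEG, MAX_ROTATION_OFFSET_DEG))

# Example: two users near the centre of the room.
print(redirection_offset(90.0, np.array([2.0, 2.5]), [np.array([3.0, 2.5])]))
```

The cap on the injected rotation is the key design choice in any sketch like this: it has to stay small enough that users don't notice they are being steered.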
Paper of the Week
This week's paper of the week is another preprint, this time from researchers in Sweden. Yes, it's not been through peer review, but that shouldn't stop us revelling in the idea of controlling robots through a headset and voice commands alone. Yes, you read that right.
Taking advantage of the natural language processing capabilities of modern LLMs, the researchers have created a means of controlling robots by issuing voice commands. The authors argue that this hands-free experience increases accessibility for those with physical disabilities, but the broader appeal of operating a robot through verbal instructions should not be underestimated.
Not only that, but the physical robot is mirrored by a virtual robot viewed through a Meta Quest 3, allowing the user to ‘puppeteer’ it: the physical robot mimics the virtual one once the user is ready to execute the commands.
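We haven't seen the authors' code, but the preview-then-execute loop is easy to picture. Below is a purely hypothetical sketch of that flow: a voice transcript is mapped to a structured action (in the paper this is where the LLM comes in), previewed on the virtual twin in the headset, and only sent to the physical arm once the user confirms. Every function and name here is a placeholder, not the authors' API.

```python
from dataclasses import dataclass

# Hypothetical sketch of a preview-then-execute loop. The functions stand in
# for the speech-to-text, LLM parsing, headset rendering, and robot driver
# layers; none of them are the authors' actual implementation.

@dataclass
class RobotAction:
    """A structured command, e.g. produced by an LLM from a voice transcript."""
    joint_targets: list[float]   # target joint angles in degrees
    description: str             # human-readable summary shown in the headset

def parse_voice_command(transcript: str) -> RobotAction:
    """Placeholder for the LLM step: map free-form speech to a structured action."""
    # A real system would prompt an LLM and validate its structured output.
    if "point" in transcript.lower():
        return RobotAction(joint_targets=[0.0, 45.0, 30.0], description="Point at the red patch")
    return RobotAction(joint_targets=[0.0, 0.0, 0.0], description="Return to home pose")

def preview_on_virtual_robot(action: RobotAction) -> None:
    """Placeholder: animate the mirrored virtual arm in the headset."""
    print(f"[virtual robot] previewing: {action.description} -> {action.joint_targets}")

def execute_on_physical_robot(action: RobotAction) -> None:
    """Placeholder: send the confirmed action to the physical arm's controller."""
    print(f"[physical robot] executing: {action.joint_targets}")

def puppeteer_loop(transcript: str, user_confirms: bool) -> None:
    """Voice command -> virtual preview -> execute only on explicit confirmation."""
    action = parse_voice_command(transcript)
    preview_on_virtual_robot(action)
    if user_confirms:
        execute_on_physical_robot(action)

puppeteer_loop("Point at the red patch, please", user_confirms=True)
```

The confirmation step is doing the safety work here: nothing reaches the physical arm until the user has watched the virtual one do it first.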
The authors highlight the many possible applications, from the classics of healthcare and education to scenarios where safety is paramount or the environment is hazardous.
Now, the robot under consideration isn't exactly the Terminator: it is an arm called ‘Blueberry’ that can point at colour patches. But you can't help but think (or fear) that, as robotics continues to progress at pace, interacting with a voice-commanded humanlike robot puppeteered through a VR headset can't be that far away. Seriously.