
The XR Research Digest: Research Roundup

May 7, 2025

Research Roundup (#24)


Welcome...

Welcome to the twenty-fourth-ever Research Roundup, a weekly catch-up on the latest developments in the field of XR research.

It’s been a busy week, so let’s not hang around.

The Week in 3 (Sentences)

New research on cognitive training for children tested adaptive and non-adaptive executive function training, finding potential improvements in one aspect of executive function but not all, whilst experts from five countries highlighted VR's potential for children with special needs.

More research this week on psychological applications saw researchers testing craving responses in high-dose benzodiazepine users, whilst a VR simulation of visual impairments increased self-reported empathy.

And finally, research on learning and training compared sequencing strategies with middle school students, finding class-first approaches outperforming VR-first, whilst a gamified VR system for prostate laser enucleation surgery indicated possible improvements in trainee motivation.

The Week in 300 (Words)

Researchers at the University of Chichester in the UK reported initial findings on both adaptive and non-adaptive VR training of executive function in primary school children (8-11 years old). Despite measuring several different aspects of executive function, only cognitive flexibility was found to improve with the adaptive version. The non-significant results were attributed to potential sample size issues, variability, and the design of the training. However, not finding a benefit of adaptive training is becoming a bit of a theme in this digest. One always has to be cautious with null results, but after a while it gets you thinking.

VR continues to be called the ‘Empathy Machine’ for good reason, as more empathy-driven ‘VR for Good’ research continues to emerge. This week it was all about making design students more aware of visual impairments. The students were required to complete object-related tasks whilst various filters, such as tunnel vision, macular degeneration, and cataracts, were applied. As hoped, participants reported greater empathy after the simulations. The authors, however, warned of the dangers of disability simulation: whilst it can increase empathy, such one-off, short-term experiences can also disguise the adaptations people develop and lead to underestimating their actual capabilities. A sort of ‘disability tourism’, if you like.

And finally, researchers in Belgium have been looking at where VR experiences might sit within a lesson plan. Middle school children (10-13 years old) learnt about riding a bike using VRkeer. The researchers varied how teacher-led class instruction and discussion were combined with the VR experience: some students had the class instruction first and then the VR experience, some the other way around, and some just had the VR experience twice. Students in the ‘class-first’ group performed significantly better than the other two in terms of procedural knowledge and intrinsic motivation. Whilst there will always be limits to the generalisability of such results, it nevertheless provides food for thought and potentially confirms some educators’ suspicions.

Paper of the Week

This week, Paper of the Week goes to a fascinating preprint that looked at inattentional blindness when driving with an AR head-up display (HUD).

For the uninitiated, inattentional blindness is a phenomenon where we are less likely to detect unexpected changes in a visual scene if our attentional resources are allocated elsewhere. If you’re not already aware of the Invisible Gorilla, check it out. When driving, this can lead to the phenomenon of ‘looking but not seeing’ - you swear you looked to the right but just didn’t see the cyclist.

Even without a HUD, this notion of ‘looking but not seeing’ is a cause of many road accidents. Researchers from Virginia Tech have therefore asked whether an AR HUD is likely to make this better or worse, as drivers’ visual attention gets split between augmented and physical realities occupying the same position.

Participants used a driving simulator with a virtual head-up display. Their main task was to follow another vehicle on the virtual road. Their second task, performed simultaneously, was a text message task on the HUD that varied in complexity. Their third task was to detect an ‘unexpected’ stimulus (a brake light, a child, or a wooden target).

Overall, it was found that having a HUD didn’t in itself lead to unsafe driving behaviours. However, the complexity of the AR task was related to more missed, or delayed, detections. Interestingly, this was most common in the centre of the display rather than the periphery.

There is obviously nuance here: the effects varied with task complexity, the events weren’t truly unexpected, and there was no comparison to more common head-down displays. However, the most important message is that inattentional blindness still occurs even if the driver is thought to be ‘looking’ at the road more. A head-up display can still lead to the unwanted phenomenon of ‘looking but not seeing’, i.e. inattentional blindness.

AWE USA 2025

Planning on going to AWE USA 2025? Not yet registered? As a reader of the Research Roundup you can get a 10% discount with the code "25XRDIGEST10D".

Find this digest useful?

Forward to a colleague

Contact:

contact@xrdigest.com
