Posts

Papers at XR conference

Two papers were accepted at the 2024 International Conference on eXtended Reality (XR Salento 2024).

Tang, Ming, Mikhail Nikolaenko, Evv Boerwinkle, Samuel Obafisoye, Aayush Kumar, Mohsen Rezayat, Sven Lehmann, and Tamara Lorenz. “Evaluation of the Effectiveness of Traditional Training vs. Immersive Training: A Case Study of Building Safety & Emergency Training.” Paper accepted at the International Conference on eXtended Reality (XR SALENTO 2024), Lecce, Italy, September 4-9, 2024.

Virtual Reality (VR) has revolutionized training across healthcare, manufacturing, and service sectors by offering realistic simulations that enhance engagement and knowledge retention. However, assessments of the effectiveness of VR training are still sparse. Therefore, we examine VR’s effectiveness in emergency preparedness and building safety, comparing it to traditional training methods. The goal is to evaluate the impact of the unique opportunities VR enables on skill and knowledge development, using digital replicas of building layouts for immersive training experiences. To that end, the research evaluates VR training’s advantages and develops performance metrics by comparing virtual performance with actions in physical reality, using wearable tech for performance data collection and surveys for insights. Participants, split into VR and online groups, underwent a virtual fire drill to test emergency response skills. Findings indicate that VR training boosts perceived urgency and realism, despite knowledge and skill acquisition similar to that of more traditional lecture-style training. VR participants reported higher stress and greater effectiveness, highlighting VR’s immersive benefits. The study supports previous notions of VR’s potential in training while also emphasizing the need for careful consideration of its cognitive load and technological demands.


Tang, Ming, Jai’La Nored, Matt Anthony, Judy Eschmann, Jennifer Williams, and Linda Dunseath. “VR-based Empathy Experience for Nonprofessional Caregiver Training.” Paper accepted at the International Conference on eXtended Reality (XR SALENTO 2024), Lecce, Italy, September 4-9, 2024.

This paper presents the development of a virtual reality (VR) system designed to simulate various caregiver training scenarios, with the aim of fostering empathy by providing visual and emotional representations of the caregiver’s experience. The COVID-19 pandemic has increased the need for family members to assume caregiving roles, particularly for older adults who are at high risk for severe complications and death. This has led to a significant reduction in the availability of qualified home health workers. More than six million people aged 65 and older require long-term care, and two-thirds of these individuals receive all their care exclusively from family caregivers. Many caregivers are unprepared for the physical and emotional demands of caregiving, often exhibiting clinical signs of depression and higher stress levels.

The VR system, EVRTalk, developed by a multi-institutional team, addresses this gap by providing immersive training experiences. It incorporates theories of empathy and enables caregivers to switch roles with care recipients, navigating common scenarios such as medication management, hallucinations, incontinence, end-of-life conversations, and caregiver burnout. Research demonstrates that VR can enhance empathy, understanding, and communication skills among caregivers. The development process included creating believable virtual characters and interactive scenarios to foster empathy and improve caregiving practices. Initial evaluations using surveys showed positive feedback, indicating that VR training can reduce stress and anxiety for caregivers and improve care quality.

Future steps involve using biofeedback to measure physiological responses and further investigating the ethical implications of VR in caregiving training. The ultimate goal is to deploy VR training in homes, providing family caregivers with the tools and knowledge to manage caregiving responsibilities more effectively, thereby enhancing the quality of life for both caregivers and care recipients.


Paper at AHFE conference

Nancy Daraiseh, Ming Tang, and Mikhail Nikolaenko. “Using Virtual Reality to Enhance Behavioral Staff Training for Interacting with Aggressive Psychiatric Patients.” The 15th International Conference on Applied Human Factors and Ergonomics (AHFE 2024), Nice, France, July 24-27, 2024.

Objective: To conduct a pilot study to enhance staff training and confidence when interacting with aggressive psychiatric patients using a virtual reality (VR) training module depicting an escalating patient scenario.

Significance: Dysregulated emotional outbursts, reactive aggression, and self-injurious behaviors are common in psychiatrically hospitalized patients. These behaviors result in aggressive patient interactions (APIs), which are associated with increased risk of harm to both patients and staff. Minimal research has examined training interventions that effectively reduce or prevent API events and subsequent harm. Despite intensive, standardized training in crisis de-escalation protocols, staff continue to experience high rates of API injuries. More realistic training is needed, along with a safe environment in which staff can practice implementing de-escalation strategies to avoid APIs and patient harm.

Methods: Using a pre-post, quasi-experimental design, 40 Behavioral Health Specialists and Registered Nurses at a pediatric psychiatric facility will participate in VR training depicting a commonly experienced scenario of interacting with an aggressive patient. Participants are stratified by job experience, sex, and VR experience. Study aims are to: i) assess the feasibility and usability of VR training in this population and ii) obtain measures of learner satisfaction and performance. Surveys measure usability, learner satisfaction, and coping with patient aggression. Pre- and post-training performance will be compared and assessed by percent of correct answers on the first attempt, time to correct answer, and the number of successful and unsuccessful attempts.

Preliminary Results (full analyses in progress): Preliminary survey results (N=14) show that 64% perceived the VR experience to be consistent with their real-world experiences; 87% agreed that the VR training would help with interactions with aggressive patients; 71% reported the training was effective in identifying de-escalation strategies; and 79% stated the training was effective in recognizing stages of patient crisis, included important skills used in their job, and would recommend the training. Finally, 100% would participate in future VR trainings.

Anticipated Conclusions: We plan to show that using VR to supplement in-place training programs for high-risk situations can improve users’ understanding of essential de-escalation and crisis techniques. We anticipate results will show enhanced ability and confidence when interacting with aggressive patients. Future studies will expand on these results and examine implications for staff and patient harm.

Check more information on the VR-based Employee Safety Training and Therapeutic Crisis Intervention Simulation projects.

Honors Seminar student projects

“Human-Computer Interaction in the Age of Extended Reality & Metaverse” student projects

Spring 2024, University of Cincinnati.

Under the guidance of Ming Tang, Director of the XR-Lab at Digital Futures and DAAP, UC, this honors seminar course has propelled students through an immersive journey into the realm of XR. The course encompasses Extended Reality, Metaverse, and Digital Twin technologies, providing a comprehensive platform for theoretical exploration and practical application in XR development.

The coursework showcases an array of student-led research projects that investigate the role of XR in various domains, including medical training, flight simulation, entertainment, tourism, cultural awareness, fitness, and music. Through these projects, students have had the opportunity to not only grasp the intricate theories underpinning future HCI developments but also to apply their skills in creating immersive experiences that hint at the future of human-technology interaction.


 “Human-Computer Interaction in the Age of Extended Reality & Metaverse” is a UC Honors course that delves into the burgeoning field of extended reality (XR) and its confluence with human-computer interaction (HCI), embodying a fusion of scholarly inquiry and innovative practice.

Ming Tang, Professor, Director of XR-Lab, DAAP, University of Cincinnati

Students: Nishanth Chidambaram, Bao Huynh, Caroline McCarthy, Cameron Moreland, Frank Mularcik, Cooper Pflaum, Triet Pham, Brooke Stephenson, Pranav Venkataraman

Thanks for the support from the UC Honors Program and UC Digital Futures.

Fluid Sim in VR

Fluid simulation in Virtual Reality, built in Unreal Engine, with hand-collision testing.

Paper SpaceXR at HCI 2024

“SpaceXR: Virtual Reality and Data Mining for Astronomical Visualization” was published in the proceedings of the 26th HCI International Conference, Washington, DC, USA, 29 June – 4 July 2024.
Authors: Mikhail Nikolaenko, Ming Tang


Abstract

This paper presents the “SpaceXR” project, which integrates data science, astronomy, and Virtual Reality (VR) technology to deliver an immersive and interactive educational tool. It is designed to cater to a diverse audience, including students, academics, space enthusiasts, and professionals, offering an easily accessible platform through VR headsets. This VR application offers a data-driven representation of celestial bodies, including the planets and the sun within our solar system, guided by data from the NASA and Gaia databases. The VR application empowers users with interactive capabilities encompassing scaling, time manipulation, and object highlighting. The potential applications span from elementary educational contexts, such as teaching the star system in astronomy courses, to advanced astronomical research scenarios, like analyzing spectral data of celestial objects identified by Gaia and NASA. By adhering to emerging software development practices and employing a variety of conceptual frameworks, this project yields a fully immersive, precise, and user-friendly 3D VR application that relies on a real, publicly available database to map celestial objects.
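The abstract does not describe the mapping itself, but the core step in any such catalog-driven visualization is converting a source's sky coordinates into 3D positions. As an illustration only, here is a minimal sketch, assuming Gaia's standard columns (right ascension and declination in degrees, parallax in milliarcseconds); the function name and the `scene_scale` parameter are hypothetical, not from the paper.

```python
import math

def gaia_to_scene(ra_deg, dec_deg, parallax_mas, scene_scale=1.0):
    """Map a catalog entry (RA/Dec in degrees, parallax in mas)
    to Cartesian scene coordinates.

    Distance in parsecs follows from the parallax: d = 1000 / p_mas.
    The result is scaled by scene_scale to fit the VR scene.
    """
    dist_pc = 1000.0 / parallax_mas      # parsecs
    ra = math.radians(ra_deg)
    dec = math.radians(dec_deg)
    # Standard spherical-to-Cartesian conversion
    x = dist_pc * math.cos(dec) * math.cos(ra)
    y = dist_pc * math.cos(dec) * math.sin(ra)
    z = dist_pc * math.sin(dec)
    return (x * scene_scale, y * scene_scale, z * scene_scale)
```

For example, a source at RA = 0°, Dec = 0° with a 1000 mas parallax sits 1 parsec away along the scene's x-axis. The simple inversion d = 1000/p is only reliable for nearby, high-parallax sources; a production pipeline would also filter on parallax quality.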

Check more project details on Solar Systems in VR. 


Mikhail Nikolaenko presented the paper at the 26th HCI International Conference, Washington, DC, USA, 29 June – 4 July 2024.