Posts

papers at XR conference

Two papers were presented and published at the 2024 International Conference on eXtended Reality (XR Salento 2024).

Tang, Ming, Mikhail Nikolaenko, Evv Boerwinkle, Samuel Obafisoye, Aayush Kumar, Mohsen Rezayat, Sven Lehmann, and Tamara Lorenz. “Evaluation of the Effectiveness of Traditional Training vs. Immersive Training: A Case Study of Building Safety & Emergency Training.” Paper presented at the International Conference on eXtended Reality (XR SALENTO 2024), Lecce, Italy, September 4-9, 2024. The paper is published in the Springer proceedings volume.

Virtual Reality (VR) has revolutionized training across healthcare, manufacturing, and service sectors by offering realistic simulations that enhance engagement and knowledge retention. However, assessments that allow for evaluation of the effectiveness of VR training are still sparse. Therefore, we examine VR’s effectiveness in emergency preparedness and building safety, comparing it to traditional training methods. The goal is to evaluate the impact of the unique opportunities VR enables on skill and knowledge development, using digital replicas of building layouts for immersive training experiences. To that end, the research evaluates VR training’s advantages and develops performance metrics by comparing virtual performance with actions in physical reality, using wearable technology for performance data collection and surveys for insights. Participants, split into VR and online groups, underwent a virtual fire drill to test emergency response skills. Findings indicate that VR training boosts perceptions of urgency and realism, even though knowledge and skill acquisition were similar to those achieved with more traditional lecture-style training. VR participants reported higher stress and greater perceived effectiveness, highlighting VR’s immersive benefits. The study supports previous notions of VR’s potential in training while also emphasizing the need for careful consideration of its cognitive load and technological demands.

 

Tang, M., Nored, J., Anthony, M., Eschmann, J., Williams, J., Dunseath, L. (2024). VR-Based Empathy Experience for Nonprofessional Caregiver Training. In: De Paolis, L.T., Arpaia, P., Sacco, M. (eds) Extended Reality. XR Salento 2024. Lecture Notes in Computer Science, vol 15028. Springer, Cham. https://doi.org/10.1007/978-3-031-71704-8_28 

This paper presents the development of a virtual reality (VR) system designed to simulate various caregiver training scenarios, with the aim of fostering empathy by providing visual and emotional representations of the caregiver’s experience. The COVID-19 pandemic has increased the need for family members to assume caregiving roles, particularly for older adults who are at high risk for severe complications and death. This has led to a significant reduction in the availability of qualified home health workers. More than six million people aged 65 and older require long-term care, and two-thirds of these individuals receive all their care exclusively from family caregivers. Many caregivers are unprepared for the physical and emotional demands of caregiving, often exhibiting clinical signs of depression and higher stress levels.

The VR system, EVRTalk, developed by a multi-institutional team, addresses this gap by providing immersive training experiences. It incorporates theories of empathy and enables caregivers to switch roles with care recipients, navigating common scenarios such as medication management, hallucinations, incontinence, end-of-life conversations, and caregiver burnout. Research demonstrates that VR can enhance empathy, understanding, and communication skills among caregivers. The development process included creating believable virtual characters and interactive scenarios to foster empathy and improve caregiving practices. Initial evaluations using surveys showed positive feedback, indicating that VR training can reduce stress and anxiety for caregivers and improve care quality.

Future steps involve using biofeedback to measure physiological responses and further investigating the ethical implications of VR in caregiving training. The ultimate goal is to deploy VR training in homes, providing family caregivers with the tools and knowledge to manage caregiving responsibilities more effectively, thereby enhancing the quality of life for both caregivers and care recipients.

 

CVG HOLO

CVG-HOLO – WAYFINDING HOLOGRAM PROJECT

XR-Lab is working with Cincinnati/Northern Kentucky International Airport (CVG), in collaboration with UC Center for Simulations & Virtual Environments Research, to

  1. Develop and demonstrate a wayfinding hologram.
  2. Evaluate the hologram signage’s performance to augment passengers’ wayfinding experience.
  3. Develop concepts for the Concourse-B store renovation, integrating emerging digital technologies related to Extended Reality.
  4. Develop a digital twin model of the CVG Concourse-B store area.

The project will apply various methods, including eye-tracking, motion capture, motion tracking, and computer vision.

Hologram. Reference image from SVG news, 10.2023.

Project Client: Josh Edwards, Sr. Manager, Innovation, Cincinnati/Northern Kentucky International Airport

UC Team:

  • eXtended Reality Lab: Ming Tang, Director, eXtended Reality Lab, Digital Futures. tangmg@ucmail.uc.edu
  • UCSIM Project Lead: Chris M. Collins, Director, Center for Simulations & Virtual Environments Research
  • ARCH 7014 students, Fall 2023

Concept of a hologram at CVG, by students in ARCH 7014, Fall 2023, UC.

Thanks to the UHP Discovery Summer program for its support.

Check out more on the student projects and eye-tracking analysis for the CVG renovation, or the wayfinding research projects and publications at XR-Lab.

Wayfinding through VR

A VR walkthrough is used for wayfinding research. Players’ routes and walking behavior, such as head movement, are captured and evaluated.

Credit: restaurant designed by Eian Bennett.
More info on wayfinding and egress at the simulated DAAP building can be found here.
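The capture step can be sketched as a simple metric computation over a logged walkthrough. This is a hypothetical illustration, not the lab’s actual pipeline: it assumes the VR engine exports time-ordered samples of the player’s position and head yaw, and derives two basic wayfinding measures from them, walked distance and cumulative head rotation.

```python
import math

def path_metrics(samples):
    """Total walked distance and cumulative head rotation from a
    time-ordered log of (x, y, yaw_degrees) samples."""
    distance = 0.0
    head_rotation = 0.0
    for (x0, y0, yaw0), (x1, y1, yaw1) in zip(samples, samples[1:]):
        distance += math.hypot(x1 - x0, y1 - y0)
        # shortest angular difference, so 359 deg -> 1 deg counts as 2, not 358
        delta = abs(yaw1 - yaw0) % 360.0
        head_rotation += min(delta, 360.0 - delta)
    return distance, head_rotation

# toy route: an L-shaped 3 m walk while the head turns 90 deg in 30 deg steps
log = [(0.0, 0.0, 0.0), (1.0, 0.0, 30.0), (2.0, 0.0, 60.0), (2.0, 1.0, 90.0)]
dist, rot = path_metrics(log)
```

In practice, head-mounted-display SDKs report full orientation quaternions per frame; the yaw-only log here just keeps the sketch minimal.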

 

Industry 4.0/5.0 grant

 

Immersive vs. Traditional Training – a comparison of training modalities

PIs: Tamara Lorenz, Ming Tang

  • Dr. Tamara Lorenz. Associate Professor. Embodied Interactive Systems Lab, Industry 4.0 & 5.0 Institute (I45I), Center for Cognition, Action, and Perception (CAP)
  • Ming Tang. Professor. Extended Reality Lab, Industry 4.0 & 5.0 Institute (I45I), Institute for Research in Sensing (IRiS)

Consortium research project: evaluating the effectiveness of an immersive training protocol against different traditional training modalities.

Grant: $40,000, from the UC Industry 4.0/5.0 Institute, 01.2023–01.2024.

Open Questions

  • Is immersive training as effective as, or more effective than, traditional training?
  • Is immersive training beneficial for specific types of training (skill, behavior), while other modalities are better suited to others (e.g., knowledge acquisition)?
  • Does the benefit of immersive VR training warrant the initial investment in equipment and the subsequent investment in building, running, and maintaining projects?

Proposal

  • Evaluation of the effectiveness of an immersive training protocol against different traditional training modalities. 
  • Evaluation of modality-dependent benefits for different learning goals. 
  • Derivation of assessment metrics for VR training against other training modalities. 

Training scenario: DAAP Fire Evacuation

Traditional training with slides and maps.

VR training with an immersive and interactive experience.

 

 

Thanks to the Institute’s Industrial Advisory Board (IAB) and industry patrons, including Siemens, Kinetic Vision, John Deere, Stress Engineering Services, Innovative Numerics, and Ethicon.

Next Phase experiments

Multi-player test



 

Links

2017 Virtual DAAP Fire Evacuation project.

 

At UC News

New UC institute looks ahead to ‘Industry 5.0’: UC will harness collective talent across campus to help companies solve new challenges. By Michael Miller, December 8, 2022.

 

 

paper in IJSW

Ming Tang and Adekunle Adebisi’s paper, “Using Eye-Tracking for Traffic Control Signage Design at Highway Work Zone,” has been published in the Interdisciplinary Journal of Signage and Wayfinding.

Tang, M., and Adebisi, A. “Using Eye-Tracking for Traffic Control Signage Design at Highway Work Zone.” Interdisciplinary Journal of Signage and Wayfinding, Vol. 6, No. 2 (2022).

This paper discusses the application of eye-tracking (ET) technologies for researchers to understand a driver’s perception of signage at highway work zones. Combining ET with screen-based motion pictures and a driving simulator, the team developed an analytical method that allows designers to evaluate signage design. Two experiments were set up to investigate how signage design might affect a driver’s visual attention and interaction under various environmental complexities and glare conditions. The study explores visual perception related to several spatial features, including signage modality, scene complexity, and color schemes. The ET method uses total fixation time and time-to-first-fixation data to evaluate the effectiveness of signage presented through screen-based video and a driving simulator.

Keywords: Eye-tracking, Signage design, Work zone safety
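As an illustration of how the two metrics named above can be computed, here is a minimal sketch. The event format and the `aoi_metrics` helper are assumptions for illustration, not the authors’ actual analysis code: given fixation events exported from an eye tracker and a bounding box for a sign (its area of interest, AOI), it derives total fixation time and time-to-first fixation.

```python
def aoi_metrics(fixations, aoi):
    """Total fixation time and time-to-first-fixation for one AOI.

    fixations: list of (t_start, t_end, x, y) fixation events in seconds,
    ordered by onset. aoi: (x_min, y_min, x_max, y_max) box around the sign.
    """
    x_min, y_min, x_max, y_max = aoi
    total_fixation_time = 0.0
    time_to_first = None  # stays None if the sign is never fixated
    for t_start, t_end, x, y in fixations:
        if x_min <= x <= x_max and y_min <= y <= y_max:
            total_fixation_time += t_end - t_start
            if time_to_first is None:
                time_to_first = t_start  # onset of the first fixation on the sign
    return total_fixation_time, time_to_first

# two of the three fixations land on the sign's AOI
fixations = [(0.0, 0.2, 50, 50), (0.3, 0.8, 120, 40), (1.0, 1.5, 130, 45)]
sign_aoi = (100, 30, 150, 60)
total, ttff = aoi_metrics(fixations, sign_aoi)
```

A shorter time-to-first fixation indicates the sign attracts attention earlier; a longer total fixation time can indicate either engagement or difficulty of comprehension, which is why the paper pairs the metrics rather than reading either alone.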

about the IJSW journal

Signage and wayfinding are critical components of the urban landscape. In spite of their importance, there has been no journal or comprehensive scholarly platform dedicated to this topic. As such, scholars from a variety of academic disciplines (law, planning, engineering, business, art, economics, architecture, landscape architecture, industrial design, and graphic design) publish work in journals within their home disciplines and rarely have a chance to communicate their cross-disciplinary findings. The Interdisciplinary Journal of Signage and Wayfinding seeks to bring them together.

Sponsored by the Academic Advisory Council for Signage Research and Education (AACSRE), this online, open access journal seeks to be the home for scholarship in the field of signage and wayfinding, and to make such scholarship accessible to academics and practitioners alike.