SMAT: Scalable Multi-Agent AI for DT

SMAT: Scalable Multi-Agent Machine Learning and Collaborative AI for Digital Twin Platform of Infrastructure and Facility Operations.

Principal Investigators:

  • Prof. Sam Anand, Department of Mechanical Engineering, CEAS
  • Prof. Ming Tang, Extended Reality Lab, Digital Futures, DAAP

Students: Anuj Gautam, Manish Aryal, Aayush Kumar, Ahmad Alrefai, Rohit Ramesh, Mikhail Nikolaenko, Bozhi Peng

Grant: $40,000. UC Industry 4.0/5.0 Institute Consortium Research Project, 03/2025–01/2026.

AI and Emerging Technology Symposium

Ming Tang and Mikhail Nikolaenko presented “AI-Powered Digital Humans for Enhanced Interaction in Extended Reality” at the AI and Emerging Technology Symposium, University of Cincinnati.

The day-long event explored topics around AI and robotic process automation; smart campus innovation; and extended reality, virtual reality, and augmented reality. More on UC News.

AI-Powered Talking Avatars for Enhanced Interaction in Extended Reality

Presenters: Ming Tang, Mikhail Nikolaenko. February 20, 2025, Tangeman University Center.

VR Training on Issues of Youth Firearm Possession

Virtual Reality Training on Issues of Youth Firearm Possession.

PI: Ming Tang. 8/5/2024–8/4/2025.

 

Funded by the God.Restoring.Order (GRO) Community, this research project will develop two VR scenarios that simulate environments designed to educate youth on applying critical skills in risky situations.

Team: Ming Tang (XR-Lab) and Aaron Mallory (GRO).

XR-Lab students: Aayush Kumar, Mario Bermejo, Jonny Peng, Ahmad Alrefai, Rohit Ramesh, Charlotte Bodie 

The XR-Lab collaborated with the GRO community to apply extended reality (XR) technologies to a virtual reality (VR) training application that strengthens the GRO curriculum by reinforcing key competencies through immersive learning activities. Together, we evaluated the feasibility of integrating VR into the GRO training program, giving participants an engaging narrative framework while equipping them with practical knowledge they can apply in real-world contexts. The immersive VR scenarios address high-risk situations, including firearm possession and substance use, creating a controlled environment for experiential learning and skill development.

The XR-Lab has harnessed advanced motion capture technology in this project to translate the movements of real people into lifelike digital characters. Every gesture, shift in posture, and subtle facial expression is carefully recorded and mapped to ensure authenticity and emotional depth in the virtual environment.

Our development team has worked closely and continuously with the GRO community, engaging in multiple motion studies, rehearsals, and testing sessions. This collaboration allows us to capture not just movement but the nuance behind each action: the personality conveyed through body language and the emotional context embedded in facial expression.

Through this process, the digital characters become more than avatars; they become authentic extensions of human experience, reflecting the stories of the people they represent. The result is an immersive, emotionally resonant experience where technology and humanity move together.

 

Papers at XR Salento 2024

Two papers were presented and published at the 2024 International Conference on eXtended Reality (XR Salento 2024).

Tang, Ming, Mikhail Nikolaenko, Evv Boerwinkle, Samuel Obafisoye, Aayush Kumar, Mohsen Rezayat, Sven Lehmann, and Tamara Lorenz. “Evaluation of the Effectiveness of Traditional Training vs. Immersive Training: A Case Study of Building Safety & Emergency Training.” Paper presented at the International Conference on eXtended Reality (XR Salento 2024), Lecce, Italy, September 4–9, 2024. The paper is published in the Springer proceedings.

Virtual Reality (VR) has revolutionized training across healthcare, manufacturing, and service sectors by offering realistic simulations that enhance engagement and knowledge retention. However, assessments that allow for evaluation of the effectiveness of VR training are still sparse. Therefore, we examine VR’s effectiveness in emergency preparedness and building safety, comparing it to traditional training methods. The goal is to evaluate the impact of the unique opportunities VR enables on skill and knowledge development, using digital replicas of building layouts for immersive training experiences. To that end, the research evaluates VR training’s advantages and develops performance metrics by comparing virtual performance with actions in physical reality, using wearable tech for performance data collection and surveys for insights. Participants, split into VR and online groups, underwent a virtual fire drill to test emergency response skills. Findings indicate that VR training boosts urgency and realism perception despite similar knowledge and skill acquisition after more traditional lecture-style training. VR participants reported higher stress and greater effectiveness, highlighting VR’s immersive benefits. The study supports previous notions of VR’s potential in training while also emphasizing the need for careful consideration of its cognitive load and technological demands.

 

Tang, M., Nored, J., Anthony, M., Eschmann, J., Williams, J., Dunseath, L. (2024). VR-Based Empathy Experience for Nonprofessional Caregiver Training. In: De Paolis, L.T., Arpaia, P., Sacco, M. (eds) Extended Reality. XR Salento 2024. Lecture Notes in Computer Science, vol 15028. Springer, Cham. https://doi.org/10.1007/978-3-031-71704-8_28 

This paper presents the development of a virtual reality (VR) system designed to simulate various caregiver training scenarios, with the aim of fostering empathy by providing visual and emotional representations of the caregiver’s experience. The COVID-19 pandemic has increased the need for family members to assume caregiving roles, particularly for older adults who are at high risk for severe complications and death. This has led to a significant reduction in the availability of qualified home health workers. More than six million people aged 65 and older require long-term care, and two-thirds of these individuals receive all their care exclusively from family caregivers. Many caregivers are unprepared for the physical and emotional demands of caregiving, often exhibiting clinical signs of depression and higher stress levels.

The VR system, EVRTalk, developed by a multi-institutional team, addresses this gap by providing immersive training experiences. It incorporates theories of empathy and enables caregivers to switch roles with care recipients, navigating common scenarios such as medication management, hallucinations, incontinence, end-of-life conversations, and caregiver burnout. Research demonstrates that VR can enhance empathy, understanding, and communication skills among caregivers. The development process included creating believable virtual characters and interactive scenarios to foster empathy and improve caregiving practices. Initial evaluations using surveys showed positive feedback, indicating that VR training can reduce stress and anxiety for caregivers and improve care quality.

Future steps involve using biofeedback to measure physiological responses and further investigating the ethical implications of VR in caregiving training. The ultimate goal is to deploy VR training in homes, providing family caregivers with the tools and knowledge to manage caregiving responsibilities more effectively, thereby enhancing the quality of life for both caregivers and care recipients.