Therapeutic Crisis Intervention Simulation, Phase 3

We are excited to announce the launch of Phase 3 of the VR-Based Employee Safety Training: Therapeutic Crisis Intervention Simulation project, building on the success of the previous two phases. This interdisciplinary collaboration brings together the Immersive Learning Lab and the Employee Safety Learning Lab at Cincinnati Children’s Hospital Medical Center (CCHMC), in partnership with the Extended Reality Lab (XR-Lab) at the University of Cincinnati.

Digital Twin Concept: Digital Patient + Digital Hospital.

This phase will focus on developing an advanced virtual hospital environment populated with digital patients to simulate a variety of real-world Therapeutic Crisis Intervention (TCI) scenarios. The digital twins encompass both the hospital setting and patient avatars. The project aims to design immersive training modules, capture user performance data, and conduct a rigorous evaluation of the effectiveness of VR-based training in enhancing employee safety and crisis response capabilities.

Principal Investigator: Ming Tang. Funding Amount: $38,422. Project Period: April 1, 2025 – December 1, 2026.

CCHMC Collaborators: Dr. Nancy Daraiseh, Dr. Maurizio Macaluso, Dr. Aaron Vaughn.

Research Domains: Virtual Reality, Safety Training, Therapeutic Crisis Intervention, Mental Health, Digital Twins, Digital Humans, Human Behavior Simulation.

We look forward to continuing this impactful work and advancing the role of immersive technologies in healthcare education and safety training.

CIC-VISTA

Following six successful phases of the Building Safety Analysis with AI / Geospatial Imagery Analytics Research project (2020–2025), funded by the Cincinnati Insurance Companies (CIC), we are pleased to announce the launch of a new research initiative: VISTA – Virtual Immersive Systems for Training AI.

VISTA – Phase 1 ($81,413) continues the XR-Lab’s collaborative research with CIC at UC. This new track will explore advanced AI-related topics, including computer vision, synthetic imaging, procedural modeling, machine learning, and reinforcement learning.

Project Title: Virtual Immersive Systems for Training AI, Phase 1. PI: Ming Tang. Award Amount: $81,413. Project Period: 07/01/2025 – 11/01/2026.

Bearcat AI Award

UC Bearcat AI Award Supports XR-Lab Student Innovation in 2025–2026

We are excited to share that two XR-Lab student fellows have been selected for the 2025–2026 Bearcat AI Award to support their cutting-edge research projects:

  • Mikhail Nikolaenko ($4,400) — Integrating Digital Twin and GPT for Sustainable Building Analytics and Green Design Education in DAAP and CEAS

  • Aayush Kumar ($5,000) — INARA (Intelligent Navigation and Autonomous Response Agent): An Adaptive Indoor Navigation Assistant for UC Spaces

These projects reflect the XR-Lab’s ongoing commitment to advancing AI-driven solutions in design, education, and campus experience. Congratulations to both recipients!

Other team members: Ming Tang, Semere Abraha, Sid Thatham.

 

 


2025 Faculty Excellence Award

I’m honored to share that I’ve received the 2025 Faculty Excellence Award as part of UC’s Research + Innovation Week, co-sponsored by the Office of Research and the Office of the Provost.

This recognition is truly meaningful to me, as it highlights the value of collaboration, mentorship, and innovation across our academic community. I’m deeply grateful to the colleagues and leadership who nominated me, and to all the students, collaborators, and partners who make this work possible.

Thank you for the continued support—it’s a privilege to be part of such a vibrant and inspiring research environment at UC.

Photo source: UC Digital Futures. Ming Tang with Dr. Keisha Love, Vice Provost for Academic Affairs, and Dr. Patrick Limbach, Vice President for Research.

Paper on AI, XR, Metaverse, and Digital Twins

Metaverse and Digital Twins in the Age of AI and Extended Reality

Tang, Ming, Mikhail Nikolaenko, Ahmad Alrefai, and Aayush Kumar. 2025. “Metaverse and Digital Twins in the Age of AI and Extended Reality.” Architecture 5, no. 2: 36. https://doi.org/10.3390/architecture5020036

 

This paper explores the evolving relationship between Digital Twins (DTs) and the Metaverse, two foundational yet often conflated paradigms in digital architecture. While DTs function as mirrored models of real-world systems, integrating IoT, BIM, and real-time analytics to support decision-making, Metaverses are typically fictional, immersive, multi-user environments shaped by social, cultural, and speculative narratives. The study examines how emerging technologies such as Artificial Intelligence (AI), Extended Reality (XR), and Large Language Models (LLMs) are blurring these traditional boundaries. By analyzing the divergent purposes, data structures, and interactivity modes of DTs and Metaverses, as well as hybrid applications such as data-integrated virtual environments and AI-driven collaboration, it seeks to define the opportunities and challenges this convergence poses for architectural design, decision-making, and immersive user experiences.

The research spans multiple projects that use XR and AI to develop DTs and Metaverse environments. The team assesses the capabilities of AI in DT environments, such as reality capture and smart building management, and evaluates Metaverse platforms for online collaboration and architectural education, focusing on features that facilitate multi-user engagement. The paper compares several virtual-environment development pipelines, weighing traditional BIM+IoT workflows against newer approaches such as Gaussian Splatting and generative AI for content creation. It further explores the integration of LLMs in both domains, for example as virtual agents or LLM-powered Non-Player Characters (NPCs) that interact autonomously and enhance user engagement within spatial environments.

Finally, the paper argues that the once-distinct boundaries between DTs and the Metaverse are becoming increasingly porous. Hybrid digital spaces, such as virtual buildings with data-integrated twins and immersive, social metaverses, demonstrate this convergence. As digital environments mature, architects are uniquely positioned to shape these dual-purpose ecosystems, leveraging AI, XR, and spatial computing to fuse data-driven models with immersive, user-centered experiences.
 
Keywords: metaverse; digital twin; extended reality; AI
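
The abstract’s mention of LLM-powered NPCs can be made concrete with a short sketch. The Python fragment below is purely illustrative and is not taken from the paper; every name in it (TwinState, npc_reply, the echo_llm stub, the sample sensor values) is hypothetical. It shows the general pattern: ground the agent’s prompt in live digital-twin telemetry, then delegate the language generation to a pluggable LLM backend.

```python
"""Minimal sketch of an LLM-powered NPC grounded in digital-twin state.

All names here are hypothetical illustrations of the pattern described
in the abstract, not the paper's actual implementation.
"""
from dataclasses import dataclass
from typing import Callable


@dataclass
class TwinState:
    """A tiny stand-in for live digital-twin telemetry (e.g., IoT/BIM feeds)."""
    room: str
    temperature_c: float
    occupancy: int


def npc_reply(question: str, state: TwinState,
              llm: Callable[[str], str]) -> str:
    """Compose a prompt that grounds the NPC in the current twin snapshot,
    then hand the actual text generation to whatever LLM backend is supplied."""
    prompt = (
        "You are a virtual guide inside a digital twin of a building.\n"
        f"Current sensor snapshot: room={state.room}, "
        f"temperature={state.temperature_c:.1f} C, occupancy={state.occupancy}.\n"
        f"Visitor question: {question}\n"
        "Answer briefly, using only the snapshot above."
    )
    return llm(prompt)


if __name__ == "__main__":
    # Stub backend so the sketch runs without an API key; in practice this
    # callable would invoke a hosted or locally served language model.
    def echo_llm(prompt: str) -> str:
        return "It is 21.5 C in Studio 4100, with 12 people present."

    state = TwinState(room="Studio 4100", temperature_c=21.5, occupancy=12)
    print(npc_reply("How warm is it in here right now?", state, echo_llm))
```

Keeping the backend a plain callable keeps the sketch vendor-neutral: the same prompt-grounding pattern applies whether the NPC queries a hosted model or one running locally alongside the twin.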