Posts

AI & Emerging Technology Symposium, UC Bearcat AI Award

We are excited to share that two XR-Lab student fellows have been selected for the 2025–2026 Bearcat AI Award to support their cutting-edge research projects:

  • Mikhail Nikolaenko ($4,400) — Integrating Digital Twin and GPT for Sustainable Building Analytics and Green Design Education in DAAP and CEAS

  • Aayush Kumar ($5,000) — INARA (Intelligent Navigation and Autonomous Response Agent): An Adaptive Indoor Navigation Assistant for UC Spaces

These projects reflect the XR-Lab’s ongoing commitment to advancing AI-driven solutions in design, education, and campus experience. Congratulations to both recipients!

Other team members: Ming Tang, Semere Abraha, Sid Thatham.

The projects will be presented at the UC 2026 AI & Emerging Technology Symposium.

02/18/2026

AI-Based Spatial Computing with BIM: Performance, Sustainability, and Wayfinding on the UC Campus
This project introduces an AI-enhanced spatial computing framework that integrates building-scale digital twins with intelligent autonomous navigation. Using BIM-derived geometry and utility metadata, the system combines LLM-assisted building-performance analytics with predictive modeling to support sustainable operations across an interactive, campus-scale digital twin environment. In parallel, we present INARA, a ROS 2–based indoor navigation platform that merges BIM-accurate simulation environments with a hybrid deep-reinforcement-learning and classical-control architecture, enabling safe, adaptive mobile-robot navigation within UC facilities.

Together, these systems advance AI-driven spatial computing by unifying building analytics, embodied intelligence, and digital–physical interoperability—laying the foundation for next-generation smart-building management and autonomous robotic applications.
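As a rough illustration of the hybrid deep-reinforcement-learning and classical-control idea, here is a minimal Python sketch (hypothetical names and thresholds, not the INARA codebase): a learned policy proposes the velocity command, and a simple reactive controller takes over whenever the lidar scan reports an obstacle inside a clearance threshold.

# Minimal sketch of a hybrid DRL + classical-control gate (hypothetical names,
# not the INARA implementation). The learned policy proposes a velocity command;
# a reactive rule overrides it whenever an obstacle is too close.

import numpy as np

CLEARANCE = 0.5  # assumed minimum safe distance to an obstacle, in meters

def rl_policy(goal_vec, scan):
    """Stand-in for a trained policy: drive toward the goal at a fixed speed."""
    heading = np.arctan2(goal_vec[1], goal_vec[0])
    return 0.5, float(np.clip(heading, -1.0, 1.0))   # (linear m/s, angular rad/s)

def classical_fallback(scan):
    """Reactive rule: stop forward motion and rotate away from the nearest hit."""
    nearest = int(np.argmin(scan))
    turn = 1.0 if nearest < len(scan) // 2 else -1.0
    return 0.0, turn

def hybrid_step(goal_vec, scan):
    """Use the learned policy unless the lidar scan violates the clearance."""
    if np.min(scan) < CLEARANCE:
        return classical_fallback(scan)
    return rl_policy(goal_vec, scan)

# Example: goal 3 m ahead and 1 m to the left, with a clear 360-beam scan.
cmd = hybrid_step(np.array([3.0, 1.0]), np.full(360, 4.0))
print(cmd)

In a full ROS 2 deployment, a node would publish the resulting command as a geometry_msgs/Twist message; the fixed-speed policy and stop-and-turn rule here are only placeholders for the learned and classical components described above.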



Therapeutic Crisis Intervention Simulation, Phase 3

We are excited to announce the launch of Phase 3 of the VR-Based Employee Safety Training: Therapeutic Crisis Intervention Simulation project, building on the success of the previous two phases. This interdisciplinary collaboration brings together the Immersive Learning Lab and the Employee Safety Learning Lab at Cincinnati Children’s Hospital Medical Center (CCHMC), in partnership with the Extended Reality Lab (XR-Lab) at the University of Cincinnati. 

This phase will focus on developing an advanced virtual hospital environment populated with digital patients to simulate a variety of real-world Therapeutic Crisis Intervention (TCI) scenarios. The digital twins encompass both the hospital setting and patient avatars. The project aims to design immersive training modules, capture user performance data, and rigorously evaluate the effectiveness of VR-based training in enhancing employee safety and crisis response capabilities.

Principal Investigator: Ming Tang. Funding Amount: $38,422. Project Period: April 1, 2025 – December 1, 2026

CCHMC Collaborators: Dr. Nancy Daraiseh, Dr. Maurizio Macaluso, Dr. Aaron Vaughn.

Research Domains: Virtual Reality, Safety Training, Therapeutic Crisis Intervention, Mental Health, Digital Twins, Digital Humans, Human Behavior Simulation.

We look forward to continuing this impactful work and advancing the role of immersive technologies in healthcare education and safety training.

Concept of Digital Twin: Digital Patient + Digital Hospital.

CIC-VISTA

Following six successful phases of the Building Safety Analysis with AI / Geospatial Imagery Analytics Research project (2020–2025), funded by the Cincinnati Insurance Companies (CIC), we are pleased to announce the launch of a new research initiative: VISTA – Virtual Immersive Systems for Training AI.

VISTA – Phase 1 ($81,413) marks the continuation of XR-Lab’s collaborative research efforts at UC with CIC. This new track will explore advanced AI-related topics, including computer vision, synthetic imaging, procedural modeling, machine learning, and reinforcement learning.

Project Title: Virtual Immersive Systems for Training AI, Phase 1. PI: Tang. Award Amount: $81,413. Project Period: 07/01/2025 – 11/01/2026


SMAT: Scalable Multi-Agent AI for DT

SMAT: Scalable Multi-Agent Machine Learning and Collaborative AI for Digital Twin Platform of Infrastructure and Facility Operations.

Principal Investigators:

  • Prof. Sam Anand, Department of Mechanical Engineering, CEAS
  • Prof. Ming Tang, Extended Reality Lab, Digital Futures, DAAP

Students: Anuj Gautam, Manish Aryal, Aayush Kumar, Ahmad Alrefai, Rohit Ramesh, Mikhail Nikolaenko, Bozhi Peng

Grant: $40,000. UC Industry 4.0/5.0 Institute Consortium Research Project: 03/2025 – 01/2026


P&G VISM

Project VISM (Virtual Interactive Simulation for Material Customization)
Interactive Visualization with User-Controlled, Procedural, and Physically Based Material Customization.

PI: Ming Tang.

P&G Team: Kim Jackson, Andrew Fite, Fei Wang, Allison Roman

UC Team: Ming Tang, Aayush Kumar, Yuki Hirota

Sponsor: P&G. 12/01/2024 – 5/31/2025

Amount: $28,350

