Posts

Gensler’s AI Award

SENSE-AI: Developing Spatial Experiences for Narrative and Sensory Emotions with AI. Gensler AI Excellence in the Design Process Award.
PI: Tang. Amount: $100,000. Period: 06/2026 – 06/2027.

I’m very grateful to share that UC’s proposal “SENSE-AI: Developing Spatial Experiences for Narrative and Sensory Emotions with AI” has received Gensler’s AI Excellence in the Design Process Award. It is truly an honor, and we deeply appreciate Gensler’s support. 

We will investigate how emerging technologies can be used in the academic setting to support human-centered spatial prototypes through iterative cycles of ideation, AI-assisted synthesis, immersive testing, and narrative refinement. We will also explore how AI can function as a design collaborator throughout the full design process—from site analysis and early concepts to 3D modeling, animation, performance evaluation, and immersive representation—using AI and XR to experiment with new forms of architectural experience.

We are excited about the opportunity to share our research and teaching of AI with Gensler architects and to learn together about how emerging AI tools are shaping the future of professional design practice.

Reference

Gensler is a global architecture, design, and planning firm. Founded in 1965, Gensler has built a team of 6,000 professionals who partner with clients in over 100 countries each year on projects that act as catalysts for growth.

Jordan Goldstein. "Design as Storytelling: How AI Is Transforming the Way We Imagine, Create, and Connect." Gensler blog.

 

Historic Avondale in VR

Virtual Reality Development for Historic Avondale and Digital Heritage 

Principal Investigator (PI): Anne Delano Steinert. Co-PI: Ming Tang.
Society and Culture Grant, UC Research Office, University of Cincinnati, 2026.

Timeline: 1/13/2026 – 7/1/2026


Project Overview

The Historic Avondale Digital Heritage Project aims to create an immersive Virtual Reality (VR) reconstruction of Cincinnati’s historic Avondale neighborhood, focusing on cultural heritage preservation and community engagement. Through a collaborative effort between A&S and DAAP’s XR-Lab, the project will visualize approximately six urban blocks of Avondale as they appeared during a historically significant period in the 1950s, enabling users to explore the environment through VR headsets and interact with its digital elements.

This initiative supports experiential learning, public history dissemination, and digital preservation. The final VR experience will be designed for accessibility on standalone VR headsets (Meta Quest) and optimized for community outreach, classroom education, and museum-style exhibition. 

 

 Scope of Work

The Extended Reality Lab (XR-Lab) will lead the technical development, 3D modeling, and immersive environment creation. 
Specific responsibilities include: 

  • Environment Modeling: Develop a high-fidelity 3D digital model of approximately six blocks of historic Avondale, including building facades, streetscapes, and key landscape features based on historical references. 
  • Visual Assets: Integrate architectural elements, materials, vegetation, street furniture, and atmospheric lighting to accurately represent the period context. 
  • Immersive Interactivity: Configure the scene for full VR immersion, allowing users to navigate and interact using Meta Quest headsets. 
  • Narrative Integration: Collaborate with the A&S team to embed audio narratives, historical commentary, and interpretive storytelling within the VR experience. 
  • Optimization & Testing: Optimize performance for standalone VR headsets, conduct iterative testing, and ensure accessibility and stability for public demonstration. 
  • Deployment: Package and deliver a functional VR build ready for viewing via Meta Quest Browser or as an installable application. 

 Deliverables

  • VR Model of Historic Avondale – A completed, interactive VR environment covering approximately six city blocks. 
  • High-Fidelity Visual Assets – Accurately modeled architecture, landscape, and environmental elements. 
  • Integrated Audio Narratives – Recorded and embedded voiceovers provided by A&S collaborators. 
  • VR-Ready Application – Optimized and deployable version compatible with Meta Quest 3 headsets. 
  • Documentation Package – Summary of technical workflow, file structure, and recommendations for future updates or expansion. 

 

Figure 1. Colorization of a historic photo with AI, by student Mario Bermejo.

Figure 2. Video clips generated from a historic photo using AI.

Figure 3. Jewish Temple in Avondale, 1950s.

ASTRO Award

Congratulations to XR-Lab Fellow Aayush Kumar on being awarded the 2026 Armstrong ASTRO Fellowship for the Discovery Program by the University of Cincinnati Office of Research.

 

This fellowship provides $6,000 in support for Aayush’s project, INARA (Intelligent Navigation and Autonomous Response Agent). The project explores innovative approaches to autonomous navigation through a hybrid framework that integrates reinforcement learning with rule-based control.

AI symposium, Bearcat AI Award

AI & Emerging Technology Symposium, UC Bearcat AI Award

Presentation at the UC 2026 AI & Emerging Technology Symposium on 02/18/2026 at the Tangeman University Center (TUC).

AI-Based Spatial Computing with BIM: Performance, Sustainability, and Wayfinding on the UC Campus

This project introduces an AI-enhanced spatial computing framework that integrates building-scale digital twins with intelligent autonomous navigation. Using BIM-derived geometry and utility metadata, the system combines LLM-assisted building-performance analytics with predictive modeling to support sustainable operations across an interactive, campus-scale digital twin environment. In parallel, we present INARA, a ROS 2–based indoor navigation platform that merges BIM-accurate simulation environments with a hybrid deep-reinforcement-learning and classical-control architecture, enabling safe, adaptive mobile-robot navigation within UC facilities.

Together, these systems advance AI-driven spatial computing by unifying building analytics, embodied intelligence, and digital–physical interoperability—laying the foundation for next-generation smart-building management and autonomous robotic applications.
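INARA's code is not public; as a rough illustration of the hybrid pattern described above (a learned policy for goal-directed navigation, with a classical rule-based layer that overrides it near obstacles), here is a minimal sketch. All function names, the 1-D range-scan input, and the 0.5 m safety margin are hypothetical placeholders, not the actual INARA implementation.

```python
import numpy as np

SAFETY_DISTANCE = 0.5  # meters; hypothetical safety margin

def rl_policy(scan, goal_vec):
    """Stand-in for a learned policy: steer toward the goal.
    In a real system this would be a trained DRL network."""
    heading = np.arctan2(goal_vec[1], goal_vec[0])
    return 0.5, float(np.clip(heading, -1.0, 1.0))  # (linear, angular) velocity

def safety_controller(scan):
    """Classical rule: stop and turn away from the nearest obstacle."""
    nearest = int(np.argmin(scan))
    turn = 1.0 if nearest < len(scan) // 2 else -1.0
    return 0.0, turn

def hybrid_step(scan, goal_vec):
    """Hybrid arbitration: the rule-based layer overrides the
    learned policy whenever any range reading is inside the margin."""
    if np.min(scan) < SAFETY_DISTANCE:
        return safety_controller(scan)
    return rl_policy(scan, goal_vec)

# Clear path: the learned policy drives toward the goal.
print(hybrid_step(np.full(8, 2.0), np.array([1.0, 0.0])))   # (0.5, 0.0)
# Obstacle inside the margin: the safety rule takes over.
print(hybrid_step(np.array([0.2, 2, 2, 2, 2, 2, 2, 2]),
                  np.array([1.0, 0.0])))                     # (0.0, 1.0)
```

In a ROS 2 deployment, a function like `hybrid_step` would sit inside a node's control loop, consuming a laser-scan topic and publishing velocity commands; the arbitration rule is what provides the safety guarantee that a purely learned policy lacks.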


Therapeutic Crisis Intervention Simulation, Phase 3

We are excited to announce the launch of Phase 3 of the VR-Based Employee Safety Training: Therapeutic Crisis Intervention Simulation project, building on the success of the previous two phases. This interdisciplinary collaboration brings together the Immersive Learning Lab and the Employee Safety Learning Lab at Cincinnati Children’s Hospital Medical Center (CCHMC), in partnership with the Extended Reality Lab (XR-Lab) at the University of Cincinnati.

This phase will focus on developing an advanced virtual hospital environment populated with digital patients to simulate a variety of real-world Therapeutic Crisis Intervention (TCI) scenarios. The digital twins encompass both the hospital setting and patient avatars. The project aims to design immersive training modules, capture user performance data, and conduct a rigorous evaluation of the effectiveness of VR-based training in enhancing employee safety and crisis response capabilities.

Principal Investigator: Ming Tang. Funding Amount: $38,422. Project Period: April 1, 2025 – December 1, 2026.

CCHMC Collaborators: Dr. Nancy Daraiseh, Dr. Maurizio Macaluso, Dr. Aaron Vaughn.

Research Domains: Virtual Reality, Safety Training, Therapeutic Crisis Intervention, Mental Health, Digital Twins, Digital Humans, Human Behavior Simulation.

We look forward to continuing this impactful work and advancing the role of immersive technologies in healthcare education and safety training.

Concept of Digital Twin: Digital Patient + Digital Hospital.