ASTRO Award

Congratulations to XR-Lab Fellow Aayush Kumar on being awarded the 2026 Armstrong ASTRO Fellowship for the Discovery Program by the University of Cincinnati Office of Research.


This prestigious fellowship provides $6,000 in support for Aayush’s student-initiated project, INARA (Intelligent Navigation and Autonomous Response Agent). The project explores innovative approaches to autonomous navigation through a hybrid framework that integrates reinforcement learning with rule-based control.
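To make the idea concrete, here is a minimal, purely illustrative Python sketch of such a hybrid loop, in which a learned policy proposes velocity commands and hand-written rules can override or attenuate them near obstacles. The class, function names, and thresholds are assumptions for illustration, not INARA's actual code.

```python
# Illustrative hybrid controller: a learned policy proposes velocity
# commands; a rule-based safety layer can override or scale them.
# All names and thresholds are hypothetical, not INARA's actual code.
from dataclasses import dataclass

@dataclass
class VelocityCommand:
    linear: float   # forward speed (m/s)
    angular: float  # turn rate (rad/s)

def rl_policy(goal_bearing: float) -> VelocityCommand:
    """Stand-in for a trained RL policy that steers toward the goal."""
    return VelocityCommand(linear=0.5, angular=0.8 * goal_bearing)

def hybrid_step(min_obstacle_dist: float, goal_bearing: float,
                stop_dist: float = 0.3, slow_dist: float = 1.0) -> VelocityCommand:
    """Rule-based arbitration over the learned policy's proposal."""
    proposal = rl_policy(goal_bearing)
    if min_obstacle_dist < stop_dist:
        # Hard safety rule: stop forward motion and rotate in place.
        return VelocityCommand(linear=0.0, angular=0.5)
    if min_obstacle_dist < slow_dist:
        # Soft rule: attenuate the learned command near obstacles.
        scale = min_obstacle_dist / slow_dist
        return VelocityCommand(proposal.linear * scale, proposal.angular)
    return proposal  # Clear space: trust the learned policy.

print(hybrid_step(min_obstacle_dist=0.6, goal_bearing=0.2))
```

The appeal of this arbitration pattern is that the learned policy stays in charge in open space, while a simple, verifiable rule governs behavior in the safety-critical regime.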

This recognition highlights Aayush’s initiative, technical depth, and commitment to advancing intelligent systems research. We are proud to see his work supported through this competitive fellowship and look forward to the project’s continued development and impact.

AI & Emerging Technology Symposium, Bearcat AI Award

We are excited to share that two XR-Lab student fellows have been selected for the 2025–2026 Bearcat AI Award to support their cutting-edge research projects:

  • Mikhail Nikolaenko ($4,400) — Integrating Digital Twin and GPT for Sustainable Building Analytics and Green Design Education in DAAP and CEAS

  • Aayush Kumar ($5,000) — INARA (Intelligent Navigation and Autonomous Response Agent): An Adaptive Indoor Navigation Assistant for UC Spaces

These projects reflect the XR-Lab’s ongoing commitment to advancing AI-driven solutions in design, education, and campus experience. Congratulations to both recipients!

Other team members: Ming Tang, Semere Abraha, Sid Thatham.

The projects will be presented at the UC 2026 AI & Emerging Technology Symposium.

02/18/2026

AI-Based Spatial Computing with BIM: Performance, Sustainability, and Wayfinding on the UC Campus
This project introduces an AI-enhanced spatial computing framework that integrates building-scale digital twins with intelligent autonomous navigation. Using BIM-derived geometry and utility metadata, the system combines LLM-assisted building-performance analytics with predictive modeling to support sustainable operations across an interactive, campus-scale digital twin environment. In parallel, we present INARA, a ROS 2–based indoor navigation platform that merges BIM-accurate simulation environments with a hybrid deep-reinforcement-learning and classical-control architecture, enabling safe, adaptive mobile-robot navigation within UC facilities.

Together, these systems advance AI-driven spatial computing by unifying building analytics, embodied intelligence, and digital–physical interoperability—laying the foundation for next-generation smart-building management and autonomous robotic applications.
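As a hedged illustration of the LLM-assisted analytics idea, the sketch below serializes hypothetical BIM-derived room metadata into a prompt for a language model. The field names, the build_prompt helper, and the sample values are illustrative assumptions, not the project's actual schema or API.

```python
# Hypothetical sketch: turning BIM-derived utility metadata into a prompt
# for LLM-assisted building-performance queries. Field names and values
# are illustrative assumptions, not the project's actual data model.
import json

def build_prompt(room_records: list[dict], question: str) -> str:
    """Serialize per-room BIM metadata and append a natural-language question."""
    context = json.dumps(room_records, indent=2)
    return (
        "You are a building-performance assistant. Using the BIM-derived "
        "room data below, answer the question.\n\n"
        f"{context}\n\nQ: {question}"
    )

rooms = [
    {"room": "4425-E", "area_m2": 82, "kwh_last_week": 310, "occupancy_pct": 64},
    {"room": "5401", "area_m2": 55, "kwh_last_week": 420, "occupancy_pct": 22},
]
print(build_prompt(rooms, "Which room uses the most energy per square meter?"))
# The returned string would be sent to whatever LLM endpoint the system uses.
```

Keeping the metadata as structured JSON inside the prompt keeps queries reproducible and lets the same building context be reused across many different questions.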



THRED

Technology for Health, Resilience, Equity and Decision-Making 

Team: Kelly Merrill, Lauren Forbes, Briana Simms, Paris Wheeler, Diego Cuadros (A&S); Ming Tang (DAAP)

Funding: CCTST (TBD)

Project Aims: 

Aim 1: Co-design a set of digital data governance policies that reflect Black community preferences, concerns and expectations around the use of their digital health data.  

Aim 2: Assess the utility of digital twin technology (3D city modeling with VR) to serve community advocacy and population health related objectives. 

Aim 3: Co-design and develop a community-driven, population health intervention and participatory planning tool. 


Example of the Avondale digital twin: Avondale_Birdseye – PLAYCANVAS

Playable version in Viverse: Avondale in Metaverse


Metaverse: Gen-AI + WebXR

Two UC teams won second and third place, with cash awards, in the social experience category of the Viverse Spark Global University Challenge!

Metaverse through WebXR

Social Experience Category Awards:

2nd Place, $2,500 – Restaurant world, University of Cincinnati. Camryn Lansdell, Liv Adkins, Bekah Selent, Morgan Pascarella.

3rd Place, $1,500 – OTR 12th and Vine, University of Cincinnati. Eian Bennett, David David, Corban McIntosh, Dharma Patel, Erik Mathys.

Read more at HTC VIVERSE Hits 1 Million Users, Announces Winners of First Global “Spark” Hackathon: Games, Immersive Storytelling, Social Experiences


ARCH Seminar. Memory of the World

Digital Heritage through VR and Generative AI

ARCH 7036-04 / ARCH 5051-04, Elective Theory Seminar, SAID, DAAP, Spring 2025

Class time: Monday, 10:00 am–12:50 pm. Classroom: 4425-E, CGC Lab, DAAP building.

Faculty: Ming Tang, Professor in Architecture and Interior Design, DAAP; Director of the Extended Reality Lab (XR-Lab)

Seminar Description

This seminar invites students to explore the intersection of architecture, history, artificial intelligence, and immersive technologies by reimagining the past through digital heritage. Drawing on historic documents and archival materials from Cincinnati, students will work closely with historians to reconstruct a Cincinnati neighborhood as it existed in the 1950s, allowing users to step back in time and experience its spaces and stories through immersive visualization and virtual reality (VR).

Throughout the semester, students will investigate how generative AI, VR, and interactive visualization can preserve and reinterpret cultural memory. Using Unreal Engine as the primary platform, participants will design interactive VR environments for mobile headsets, creating spaces where history, atmosphere, and narrative merge into immersive experiences. The seminar aligns with UNESCO’s Memory of the World initiative, emphasizing the preservation of documentary and architectural heritage for future generations.

In addition to a large collaborative group project, each student will conduct a small individual research-based design investigation focused on a “lost” historic artifact—such as a forgotten artwork or street furniture. Through AI-assisted modeling, reality capture, and digital prototyping, students will gain hands-on experience reconstructing the intangible layers of history while developing advanced technical and conceptual skills in digital heritage creation.


Skills covered: Unreal Engine 5, generative AI tools, immersive VR development for Meta Quest

Learning Outcomes
By the end of this course, students will be able to:

  • Explain how digital media shapes collective memory, fosters cultural understanding, and expands access to heritage across time and space.
  • Employ generative AI tools to create and refine architectural designs informed by reference data and contextual analysis.
  • Design and integrate high-fidelity virtual reality experiences and audio narratives to enhance spatial storytelling, deployed on the Meta Quest VR headset.
  • Critically analyze and articulate the workflows, challenges, and methodologies involved in digitally reconstructing historic architecture.
  • Develop a generative AI-assisted design project that augments and streamlines the 3D modeling process.

By Caroline McCarthy, DAAP, UC. 2024. Unreal Engine. 

Week 1 – Introduction: Digital Heritage + Course Overview

  • Lecture: Reimagining History through XR and AI
  • Discussion: Historic Cincinnati streets
  • Workshop: Overview of XR technologies, digital twins, and VR hardware (Quest 3 setup)
  • Assignment: Group organization and individual tasks

Week 2 – Unreal Engine Foundations

  • Demo: Unreal Engine 5 interface, navigation, and project setup
  • Workshop: Basic environment creation, importing geometry, lighting & material setup
  • Lab: Create a small “street scene” with period references

Week 3 – 3D Assets & Historical References

  • Lecture: Historic Cincinnati
  • Workshop: Importing and optimizing 3D models (AI-generated meshes); a mesh-optimization sketch follows this list
  • Lab: Build the base environment of a Cincinnati street block
  • Assignment: Collect historical images and create a 3D reference board
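Below is one plausible pre-import optimization pass for AI-generated meshes, sketched with Open3D; the library choice, file names, and triangle budget are assumptions rather than the course's required workflow.

```python
# One possible optimization pass for AI-generated meshes before Unreal
# import, using Open3D (an assumption; the course may use other tools).
import open3d as o3d

mesh = o3d.io.read_triangle_mesh("generated_facade.obj")  # hypothetical file
mesh.remove_duplicated_vertices()
mesh.remove_degenerate_triangles()

# Quadric decimation: reduce triangle count for real-time VR frame rates.
target = 20_000  # illustrative budget for a mobile-headset asset
simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=target)
simplified.compute_vertex_normals()  # Unreal expects normals on import

o3d.io.write_triangle_mesh("generated_facade_lod.obj", simplified)
print(f"{len(mesh.triangles)} -> {len(simplified.triangles)} triangles")
```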

Week 4 – Generative AI for Architecture

  • Lecture: AI as a Co-Designer in Heritage Reconstruction
  • Demo: Nano Banana + image/3D AI workflows for architectural texture and concept generation
  • Workshop: AI-to-Unreal pipeline — converting generated images into materials and assets (see the texture-prep sketch after this list)
  • Lab: Generate building facades and signage using AI tools
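As a sketch of one small step in such a pipeline, the snippet below resizes generated images to power-of-two dimensions, which Unreal prefers for mipmap generation and texture streaming. Pillow and the file names are assumptions for illustration.

```python
# Hypothetical helper for an AI-to-Unreal texture pipeline: resize
# generated images to power-of-two dimensions so Unreal can generate
# mipmaps and stream them. Pillow and all file names are assumptions.
from PIL import Image

def next_pow2(n: int) -> int:
    """Smallest power of two >= n."""
    p = 1
    while p < n:
        p *= 2
    return p

def prep_texture(src: str, dst: str) -> None:
    img = Image.open(src).convert("RGB")
    size = tuple(next_pow2(d) for d in img.size)
    # Note: a plain resize can distort aspect ratio; a production pipeline
    # might pad to square instead, or author UVs that expect the stretch.
    img.resize(size, Image.LANCZOS).save(dst)

prep_texture("ai_facade_brick.png", "T_Facade_Brick.png")  # hypothetical files
```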

Week 5 – Advanced Unreal Workflows

  • Lecture: Lighting, Atmosphere, and Period Reconstruction
  • Workshop: Mastering lighting, decals, and Quixel Megascans
  • Lab: Create environmental lighting that matches historical mood (day/night, fog, etc.)

Week 6 – Interactivity and Blueprints

  • Lecture: From Scene to Experience – Building Interaction in Unreal
  • Workshop: Unreal Blueprints for triggers, movement, and simple interactions
  • Lab: Create an interactive object or trigger zone (e.g., play sound)

Week 7 – Midterm Critique

  • In-class: Presentation of progress on group and individual projects
  • Peer + Instructor Feedback on historical accuracy, visual quality, and interactivity
  • Workshop: Optimize scenes for Quest VR performance

Week 8 – Audio, Narrative & Spatial Storytelling

  • Lecture: Sound as Memory – Integrating Audio Narratives
  • Workshop: Implement 3D spatial audio and voice narration in Unreal
  • Lab: Add sound cues tied to specific historical events or locations

Week 9 – VR Deployment for Quest

  • Workshop: Unreal project packaging and optimization for Meta Quest 3
  • Lab: Build and test interactive VR scenes on headsets
  • Debug session: Frame rate, lighting, and control issues
  • Milestone: Playable VR prototype

Weeks 10–14 – Working Time

  • Workshop: Optimize lighting and build levels for immersive storytelling
  • Check-in: Instructor reviews individual and group progress
  • Lab: Full-scale integration – environment, audio, and narrative flow

Week 15 – Final Exhibition & Reflection

  • Public or in-class VR showcase: Historic Cincinnati in Virtual Reality
  • Final critique and documentation

Reference

Previous courses taught using Unreal Engine for VR

Publications