Featured Projects

UC UPRISE

XR-Lab’s project was selected to join the Undergraduates Pursuing Research in Science and Engineering (UPRISE) Program
UNDERGRADUATE SUMMER RESEARCH PROGRAM IN SCIENCE AND ENGINEERING
May 6 – July 26, 2024

 

Project title: Reinforcement Learning (RL) System in a Game Engine

Student: Mikhail Nikolaenko (UC). PI: Ming Tang.

This project proposes the development of a sophisticated reinforcement learning (RL) system utilizing the robust and versatile environment of Unreal Engine 5 (UE5). The primary objective is to create a flexible and highly realistic simulation platform that can model a multitude of real-life scenarios, ranging from object recognition and urban navigation to emergency response strategies. This platform aims to significantly advance the capabilities of RL algorithms by exposing them to complex, diverse, and dynamically changing environments.

Leveraging the advanced graphical and physical simulation capabilities of UE5, the project will focus on creating detailed and varied scenarios in which RL algorithms can be trained and tested. These scenarios will include, but not be limited to, urban traffic systems, natural disaster simulations, and public safety response models. The realism and intricacy of UE5’s environment will provide a challenging and rich training ground for RL models, allowing them to learn and adapt to unpredictable variables akin to those in the real world.
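The project's actual UE5 integration is not reproduced here, but the core loop is standard RL: an agent observes the environment, acts, and updates a policy from rewards. A minimal sketch, assuming a toy grid navigation task as a stand-in for a UE5 urban-navigation scenario (the environment class and all parameters are illustrative, not the project's):

```python
import random

class GridNavEnv:
    """Toy stand-in for a UE5 navigation scenario: the agent moves on a
    5x5 grid toward a goal cell. In the real project, UE5 would supply
    observations, physics, and rewards instead of this stub."""
    SIZE = 5
    MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # right, left, up, down

    def reset(self):
        self.pos = (0, 0)
        return self.pos

    def step(self, action):
        dx, dy = self.MOVES[action]
        x = min(max(self.pos[0] + dx, 0), self.SIZE - 1)
        y = min(max(self.pos[1] + dy, 0), self.SIZE - 1)
        self.pos = (x, y)
        done = self.pos == (self.SIZE - 1, self.SIZE - 1)
        return self.pos, (1.0 if done else -0.01), done  # step cost, goal reward

def train_q(episodes=2000, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning: the simplest RL algorithm that fits this stub."""
    rng = random.Random(seed)
    env, q = GridNavEnv(), {}
    for _ in range(episodes):
        s, done, steps = env.reset(), False, 0
        while not done and steps < 100:
            if rng.random() < eps:
                a = rng.randrange(4)  # explore
            else:
                a = max(range(4), key=lambda b: q.get((s, b), 0.0))  # exploit
            s2, r, done = env.step(a)
            best_next = max(q.get((s2, b), 0.0) for b in range(4))
            q[(s, a)] = q.get((s, a), 0.0) + alpha * (r + gamma * best_next - q.get((s, a), 0.0))
            s, steps = s2, steps + 1
    return q

def greedy_rollout(q, max_steps=50):
    """Follow the learned greedy policy; True if the goal is reached."""
    env = GridNavEnv()
    s = env.reset()
    for _ in range(max_steps):
        a = max(range(4), key=lambda b: q.get((s, b), 0.0))
        s, _, done = env.step(a)
        if done:
            return True
    return False
```

Swapping the stub for a UE5-backed environment with the same `reset`/`step` interface is what makes the engine a drop-in training ground for richer scenarios.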

Synthetic material for AI training

Using a game engine to generate synthetic training data offers significant advantages for AI training in image classification, object detection, and animation classification. Synthetic data, created in controlled virtual environments, allows for generating large, diverse, and perfectly labeled datasets. This contrasts with human-labeled material, which is often expensive, time-consuming, and prone to errors. Synthetic data can be tailored to specific needs, ensuring comprehensive coverage of various scenarios and edge cases that might be underrepresented in real-world data. Additionally, synthetic environments facilitate the manipulation of variables, such as lighting and angles, providing a robust training ground for AI models. Overall, the use of synthetic material enhances the efficiency and accuracy of AI training compared to traditional human-labeled datasets.
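The "perfectly labeled" point can be made concrete with a tiny sketch: when the generator places the object, it already knows the bounding box, so the label is a by-product rather than a human annotation. This stand-in uses nested Python lists in place of rendered frames, and the `brightness` parameter is an illustrative proxy for the lighting variation an engine would provide:

```python
import random

def make_sample(size=16, brightness=1.0, rng=random):
    """One synthetic grayscale 'image' (nested lists) containing a bright
    square on a dark background, plus its exact bounding-box label.
    A game engine would render this scene; plain Python stands in here."""
    img = [[0.1 * brightness] * size for _ in range(size)]
    w = rng.randint(3, 6)                 # object size varies per sample
    x0 = rng.randint(0, size - w)
    y0 = rng.randint(0, size - w)
    for y in range(y0, y0 + w):
        for x in range(x0, x0 + w):
            img[y][x] = 0.9 * brightness
    return img, {"class": "square", "bbox": (x0, y0, w, w)}

def make_dataset(n=100, seed=42):
    """Labels come for free, and 'lighting' (brightness) is varied per
    sample, as an engine would vary scene light and camera angle."""
    rng = random.Random(seed)
    return [make_sample(brightness=rng.uniform(0.5, 1.5), rng=rng)
            for _ in range(n)]
```

Because the generator is seeded, the same dataset can be regenerated exactly, and edge cases (small objects, dim lighting) can be oversampled simply by changing the parameter ranges.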

Example

The Roof AI Training project focuses on training an AI system using Unreal Engine 5 (UE5) to generate synthetic data for machine learning applications, specifically for detecting damaged asphalt shingles on roofs. It highlights the need for massive labeled datasets to train AI effectively and discusses the role of Convolutional Neural Networks (CNNs) in image recognition tasks. UE5 is utilized to create procedurally generated roofing textures and environments, providing a cost-effective and scalable solution for data generation. The research project involves generating high-quality synthetic images of roofs, including damaged areas, to train and evaluate a CNN model. The AI aspect includes using Simplex Noise and 3-tab pattern algorithms for realistic texture generation, followed by training a CNN using PyTorch on the generated data. Preliminary results indicate that the model accurately detects most missing shingles, though further improvements are needed in bounding box precision and detection accuracy. Future goals include enhancing the procedural generation algorithm, optimizing the CNN architecture for better performance, and iteratively refining the AI through continuous training and evaluation using both synthetic and real-world data.
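The project's actual pipeline (Simplex-noise texturing and the PyTorch CNN) is not reproduced here, but the labeling idea can be sketched: lay out 3-tab shingles procedurally and record exactly which tabs were removed, so the damage labels are generated alongside the image. All dimensions and the damage rate below are illustrative assumptions:

```python
import random

def shingle_layout(rows=6, cols=8, tab_w=10, tab_h=6,
                   missing_rate=0.15, seed=0):
    """Procedural 3-tab-style layout: each course is staggered by half a
    tab (running bond), and some tabs are removed to simulate damage.
    The exact boxes of missing tabs become the training labels -- no
    human annotation needed. (The real project layers Simplex-noise
    texturing on top; that step is omitted here for brevity.)"""
    rng = random.Random(seed)
    intact, missing = [], []
    for r in range(rows):
        x_off = tab_w // 2 if r % 2 else 0   # half-tab stagger per course
        for c in range(cols):
            box = (c * tab_w + x_off, r * tab_h, tab_w, tab_h)
            (missing if rng.random() < missing_rate else intact).append(box)
    return intact, missing
```

A detector trained on such output is evaluated against the `missing` boxes directly, which is what lets bounding-box precision be measured automatically across thousands of generated roofs.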

 

Related projects: Building Safety Analysis with Machine Learning

Digital Twin, LLM & IIOT

IIoT for legacy and intelligent factory machines with XR and LLM feedback, with a Digital Twin demonstration of real-time IoT for architecture/building applications using Omniverse.

  • PIs: Sam Anand, Ming Tang

$40,000. UC Industry 4.0/5.0 Institute Consortium Research Project: 01.2024-01.2025

Environment Sensors for Digital Twin model. XR-Lab and SM-Lab at Digital Futures Building.

Integration of reality capture, IoT, and LLMs into a digital twin model.

 

Digital Twin of Digital Futures Building. 

Primary objective: to develop a conversational large language model system that acquires data from legacy machines, digital machines, environmental sensors, real-time streams, and historical records within an IIoT environment, creating a digital twin that assists with real-time maintenance and support (application use case: the Digital Futures Building).
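One way to realize this is to merge live IIoT readings with historical statistics into a single context payload that is prepended to the user's question before it reaches the conversational model. A minimal sketch, assuming a hypothetical JSON schema and a simple drift threshold (neither is the project's actual design):

```python
import json
import statistics

def build_llm_context(machine_id, live, history, drift_factor=1.2):
    """Merge live readings with historical means into one JSON context
    for a conversational LLM, and flag values that drift past a simple
    threshold so the model can prioritize maintenance advice.
    (Schema and threshold are illustrative assumptions.)"""
    means = {k: statistics.mean(v) for k, v in history.items()}
    alerts = [k for k, v in live.items()
              if k in means and v > means[k] * drift_factor]
    return json.dumps({
        "machine": machine_id,
        "live": live,
        "historical_mean": {k: round(v, 2) for k, v in means.items()},
        "alerts": alerts,
    })
```

In a deployment, the returned string would form the system context of the chat request, so the model can answer "why is this machine hot?" against both live and historical evidence.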


1. Autodesk DT

Live demo with Autodesk DT. Please contact Ming Tang to request a password.

  


2. Unreal DT

Live demo with pixel streaming. It is designed for mobile devices with touch screens: users view and navigate the space with two on-screen rings.


Demonstration videos

 

The original BIM model was provided by GBBN and Messer Construction. Copyright 2024.

Students: Sourabh Deshpande, Anuj Gautam, Manish Raj Aryal, Mikhail Nikolaenko, Aayush Kumar, Eian Bennett, Ahmad Alrefai

Cloud-based Digital Twin

Clients are one click away from interacting with a Digital Twin model on their personal devices. No installation is required.

The XR-Lab’s project showcases a cloud-based Digital Twin (DT) model, designed for accessibility and interaction via mobile devices. This advanced DT model allows multiple users to engage with its complex features directly through touch screens, eliminating the need for app installations. Clients can effortlessly access the content using a simple URL in a web browser on their personal iOS or Android mobile devices and tablets. The project is distinguished by its photorealistic renderings, which are streamed to clients at high frame rates, ensuring a visually rich and seamless experience. Furthermore, our DT model is an integration of various cutting-edge technologies, including Building Information Modeling (BIM), Metadata, IoT sensor data, 360-degree images/videos, and web3D content, creating a comprehensive and interactive digital environment.

 

More information on Future of Work: Understanding the interrelationships between humans and technology to improve the quality of work-life in smart buildings.

o4a AAA Partnership Award

2022 Outstanding AAA Partnership Award of the Year

 

On behalf of COA and Live Well, Ken Wilson (COA) and Ming Tang (UC) received the AAA Award at the o4a conference on 10.20.2022. It is my great honor to represent Live Well as co-recipient, with the Council on Aging, of the 2022 Ohio Association of Area Agencies on Aging Annual Partnership Award. Thanks to Suzanne Burke, Ken Wilson, Jai’La Nored, Anna Goubeaux, and many others from COA. Thanks also to the Live Well EVRTalk development team (faculty: Ming Tang, Matt Anthony; advisors: Craig Vogel, Linda Dunseath; students and Live Well fellows: Tosha Bapat, Karly Camerer, Jay Heyne, Harper Lamb, Jordan Owens, Ruby Qji, Alejandro Robledo, Matthew Spoleti, Lauren Southwood, Ryan Tinney, Keeton Yost, Dongrui Zhu).

Link: LWC Twitter

Therapeutic Crisis Intervention Simulation

VR-based Employee Safety Training: Therapeutic Crisis Intervention Simulation

Grant:

  1. Virtual Reality for Employee Safety Training, Phase I. Sponsored research by the Cincinnati Children’s Hospital Medical Center. PI: Ming Tang. $16,631. Period: 06.2022-09.2022.
  2. Virtual Reality for Employee Safety Training: Therapeutic Crisis Intervention Simulation, Phase II. Sponsored research by the Cincinnati Children’s Hospital Medical Center. PI: Ming Tang. $22,365. Period: 02.2023-12.2023.

Under the leadership of Ming Tang, the XR-Lab is collaborating with the Cincinnati Children’s Hospital Medical Center (CCHMC) to develop a VR-based simulation to enhance employee safety training. This initiative involves creating a virtual hospital environment with AI-controlled characters to facilitate research on diverse scenarios encountered during therapeutic crisis interventions. A vital feature of this simulation is the VR dialogue between a staff member and a teenage patient exhibiting aggressive behavior and mental illness. The primary objective is to equip staff members with the necessary skills to de-escalate tense situations effectively and adhere to appropriate protocols, thereby ensuring a safer and more controlled environment for staff and patients.

Team:

  • Ming Tang, Nancy Daraiseh, Maurizio Macaluso, Krista Keehn, Harley Davis, Aaron Vaughn, Katheryn Haller, Joseph Staneck, Emily Oehler
  • Employee Safety Learning Lab, CCHMC
  • Extended Reality (XR) Lab, UC

Field of research: Virtual Reality, Safety Training, Therapeutic Crisis Intervention, Mental Health, Human Behavior Simulation

Screenshots from the mobile VR Quest 2 headset.