GRO

Virtual Reality Training on Issues of Youth Firearm Possession.

PI: Ming Tang. $20,000. 8/5/2024-8/4/2025.

Funded by the God.Restoring.Order (GRO) Community, this research project will develop two VR scenarios that simulate risky situations and educate youth on how to apply critical skills within them.

Team: Ming Tang (XR-Lab) and Aaron Mallory (GRO).

The XR-Lab is excited to collaborate with the GRO community to leverage cutting-edge XR technologies to develop a virtual reality (VR) training app that enhances the curriculum by reinforcing key skills through immersive VR activities. Together, we will assess the feasibility of integrating VR technology into the GRO’s training program, engaging users with a compelling narrative while equipping them with practical knowledge for real-world application.

GenAI+AR Siemens

Automatic Scene Creation for Augmented Reality Work Instructions Using Generative AI. Siemens. PI: Ming Tang. Co-PI: Tianyu Jiang. $25,000. UC. 4/1/2024-12/31/2024.

Sponsor: Siemens through UC MME Industry 4.0/5.0 Institute

The project investigates the integration of LLM-based generative AI with HoloLens-based training.

 

UC UPRISE

The XR-Lab’s project was selected to join the Undergraduates Pursuing Research in Science and Engineering (UPRISE) Program, UC’s undergraduate summer research program in science and engineering, running May 6 – July 26, 2024.

 

Project title: Reinforcement Learning (RL) system in Game Engine

Student: Mikhail Nikolaenko. UC. PI: Ming Tang. 

This project proposes the development of a reinforcement learning (RL) system built on the robust and versatile environment of Unreal Engine 5 (UE5). The primary objective is to create a flexible, highly realistic simulation platform that can model a wide range of real-life scenarios, from object recognition and urban navigation to emergency-response strategies. The platform aims to significantly advance the capabilities of RL algorithms by exposing them to complex, diverse, and dynamically changing environments.

Leveraging UE5’s advanced graphical and physical simulation capabilities, the project will focus on creating detailed and varied scenarios in which RL algorithms can be trained and tested, including, but not limited to, urban traffic systems, natural disaster simulations, and public safety response models. The realism and intricacy of UE5’s environments will provide a challenging, rich training ground for RL models, allowing them to learn and adapt to unpredictable variables akin to those in the real world.
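
A minimal sketch of how such a training loop could be wired together, assuming the UE5 scenario is exposed to Python through a gymnasium-style interface. The `UE5NavigationEnv` stub below simulates a toy navigation task internally so the script runs on its own; in the actual platform, observations and rewards would stream from an Unreal Engine 5 scene, and the use of PPO via stable-baselines3 is only one possible trainer, not the project's confirmed setup.

```python
# Toy stand-in for a UE5-backed RL environment. In the real platform, observations
# and rewards would come from an Unreal Engine 5 scenario (e.g., urban navigation
# or emergency response) over a bridge, rather than being computed locally.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import PPO  # assumed trainer; any gymnasium-compatible RL library works


class UE5NavigationEnv(gym.Env):
    """Simplified 2-D navigation task standing in for a UE5 urban-navigation scene."""

    def __init__(self, max_steps: int = 200):
        super().__init__()
        self.action_space = spaces.Box(low=-1.0, high=1.0, shape=(2,), dtype=np.float32)
        self.observation_space = spaces.Box(low=-np.inf, high=np.inf, shape=(2,), dtype=np.float32)
        self.max_steps = max_steps
        self.agent = np.zeros(2, dtype=np.float32)
        self.goal = np.zeros(2, dtype=np.float32)
        self.steps = 0

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.agent = self.np_random.uniform(-20, 20, size=2).astype(np.float32)
        self.goal = self.np_random.uniform(-20, 20, size=2).astype(np.float32)
        self.steps = 0
        return self._obs(), {}

    def step(self, action):
        self.agent += np.clip(action, -1.0, 1.0)      # apply the movement command
        self.steps += 1
        dist = float(np.linalg.norm(self.goal - self.agent))
        reward = -dist                                # denser reward as the agent closes in
        terminated = dist < 1.0                       # goal reached
        truncated = self.steps >= self.max_steps      # episode time limit
        return self._obs(), reward, terminated, truncated, {}

    def _obs(self):
        return (self.goal - self.agent).astype(np.float32)


if __name__ == "__main__":
    env = UE5NavigationEnv()
    model = PPO("MlpPolicy", env, verbose=0)
    model.learn(total_timesteps=10_000)               # short demonstration run
```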

Synthetic material for AI training

Using a game engine to generate synthetic training data offers significant advantages for AI training in image classification, object detection, and animation classification. Synthetic data, created in controlled virtual environments, allows for generating large, diverse, and perfectly labeled datasets. This contrasts with human-labeled material, which is often expensive, time-consuming, and prone to errors. Synthetic data can be tailored to specific needs, ensuring comprehensive coverage of various scenarios and edge cases that might be underrepresented in real-world data. Additionally, synthetic environments facilitate the manipulation of variables, such as lighting and angles, providing a robust training ground for AI models. Overall, the use of synthetic material enhances the efficiency and accuracy of AI training compared to traditional human-labeled datasets.
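
As a simplified, engine-free illustration of the idea, the sketch below procedurally renders images with randomized object placement, rotation, and lighting, so every sample comes with an exact, machine-generated label. The shapes, parameter ranges, and output layout are illustrative assumptions, not the lab's actual pipeline.

```python
# Minimal synthetic-data generator: every image ships with an exact label because
# the scene parameters (object placement, rotation, lighting) are chosen by the
# script itself - no human annotation pass is needed.
import json
import math
import random
from pathlib import Path

from PIL import Image, ImageDraw, ImageEnhance

OUT = Path("synthetic_dataset")
OUT.mkdir(exist_ok=True)


def render_sample(idx: int, size: int = 256) -> dict:
    img = Image.new("RGB", (size, size), (90, 90, 90))          # plain background
    draw = ImageDraw.Draw(img)

    # Randomize the "object": a rotated rectangle with known center, extent, and angle.
    cx, cy = random.uniform(60, size - 60), random.uniform(60, size - 60)
    w, h = random.randint(30, 80), random.randint(30, 80)
    angle = math.radians(random.uniform(0, 180))
    corners = [
        (cx + dx * math.cos(angle) - dy * math.sin(angle),
         cy + dx * math.sin(angle) + dy * math.cos(angle))
        for dx, dy in [(-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2), (-w / 2, h / 2)]
    ]
    draw.polygon(corners, fill=(200, 60, 60))

    # Randomize lighting - trivial to sweep in a virtual scene, hard to control in photos.
    img = ImageEnhance.Brightness(img).enhance(random.uniform(0.5, 1.5))
    img.save(OUT / f"img_{idx:04d}.png")

    # The bounding box is computed from the same parameters used to draw the object,
    # so the label is correct by construction.
    xs, ys = [p[0] for p in corners], [p[1] for p in corners]
    return {"image": f"img_{idx:04d}.png",
            "bbox": [min(xs), min(ys), max(xs), max(ys)],
            "angle_deg": math.degrees(angle)}


if __name__ == "__main__":
    labels = [render_sample(i) for i in range(100)]
    (OUT / "labels.json").write_text(json.dumps(labels, indent=2))
```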

Example

The Roof AI Training project focuses on training an AI system using Unreal Engine 5 (UE5) to generate synthetic data for machine learning applications, specifically for detecting damaged asphalt shingles on roofs. It highlights the need for large labeled datasets to train AI effectively and discusses the role of Convolutional Neural Networks (CNNs) in image recognition tasks. UE5 is used to create procedurally generated roofing textures and environments, providing a cost-effective and scalable solution for data generation.

The project generates high-quality synthetic images of roofs, including damaged areas, to train and evaluate a CNN model. On the AI side, Simplex Noise and 3-tab pattern algorithms produce realistic shingle textures, and a CNN is then trained on the generated data using PyTorch. Preliminary results indicate that the model accurately detects most missing shingles, though bounding-box precision and detection accuracy still need improvement. Future goals include enhancing the procedural generation algorithm, optimizing the CNN architecture for better performance, and iteratively refining the AI through continuous training and evaluation on both synthetic and real-world data.
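
A heavily simplified sketch of the training step described above, assuming the synthetic roof renders have already been exported as image tensors with per-image damaged/intact labels. The tiny CNN, tensor sizes, and random placeholder data are illustrative only; the actual project trains a detection model with bounding boxes on UE5-generated roofing textures.

```python
# Simplified PyTorch training loop for the synthetic-roof idea: a small CNN learns
# to flag images containing missing/damaged shingles. Random tensors stand in for
# the UE5-rendered images; the real pipeline would load the procedurally generated
# roof textures and their labels instead.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset


class ShingleClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 16 * 16, 2))  # 2 classes: intact / damaged

    def forward(self, x):
        return self.head(self.features(x))


# Placeholder "synthetic dataset": 512 random 64x64 RGB images with binary labels.
images = torch.rand(512, 3, 64, 64)
labels = torch.randint(0, 2, (512,))
loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

model = ShingleClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(3):                                # short demonstration run
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```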

 

Related projects: Building Safety Analysis with Machine Learning

Honors Seminar student projects

“Human-Computer Interaction in the Age of Extended Reality & Metaverse” student projects

Spring 2024. UC.

Under the guidance of Ming Tang, Director of the XR-Lab at Digital Futures and DAAP, UC, this honors seminar course has propelled students through an immersive journey into the realm of XR. The course encompasses Extended Reality, Metaverse, and Digital Twin technologies, providing a comprehensive platform for theoretical exploration and practical application in XR development.

The coursework showcases an array of student-led research projects that investigate the role of XR in various domains, including medical training, flight simulation, entertainment, tourism, cultural awareness, fitness, and music. Through these projects, students have had the opportunity to not only grasp the intricate theories underpinning future HCI developments but also to apply their skills in creating immersive experiences that hint at the future of human-technology interaction.

 

 “Human-Computer Interaction in the Age of Extended Reality & Metaverse” is a UC Honors course that delves into the burgeoning field of extended reality (XR) and its confluence with human-computer interaction (HCI), embodying a fusion of scholarly inquiry and innovative practice.

Ming Tang, Professor, Director of XR-Lab, DAAP, University of Cincinnati

Students: Nishanth Chidambaram, Bao Huynh, Caroline McCarthy, Cameron Moreland, Frank Mularcik, Cooper Pflaum, Triet Pham, Brooke Stephenson, Pranav Venkataraman

Thanks to the UC Honors Program and UC Digital Futures for their support.

Digital Twin of Cincinnati

A real-time flythrough demo of the Digital Twin of the City of Cincinnati.

Digital Futures Building at the University of Cincinnati

A demo in which players destroy alien buildings near the UC campus, developed by students Cooper Pflaum and Nishanth Chidambaram.