Posts

GRO

Virtual Reality Training on Issues of Youth Firearm Possession.

PI: Ming Tang. $20,000. 8/5/2024-8/4/2025.

Funded by the God.Restoring.Order (GRO) Community, this research project will develop two VR scenarios that simulate environments designed to educate youth on applying critical skills in risky situations.

Team: Ming Tang (XR-Lab); Aaron Mallory (GRO).

The XR-Lab is excited to collaborate with the GRO community to leverage cutting-edge XR technologies to develop a virtual reality (VR) training app that enhances the curriculum by reinforcing key skills through immersive VR activities. Together, we will assess the feasibility of integrating VR technology into GRO's training program, engaging users with a compelling narrative while equipping them with practical knowledge for real-world application.

GenAI+AR Siemens

Automatic Scene Creation for Augmented Reality Work Instructions Using Generative AI. Siemens. PI: Ming Tang. Co-PI: Tianyu Jiang. $25,000. UC. 4/1/2024-12/31/2024.

Sponsor: Siemens through UC MME Industry 4.0/5.0 Institute

The project investigates the integration of LLM-based generative AI with HoloLens-based training, as sketched below.
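As a rough illustration only (not the project's actual pipeline), the minimal Python sketch below uses a placeholder in place of the generative model call and shows how an LLM's output might be structured as step-by-step work instructions that a HoloLens client could render as AR annotations; all function names and fields are hypothetical.

```python
import json
from dataclasses import dataclass, asdict
from typing import List


@dataclass
class ARStep:
    """One work-instruction step, anchored to a named part in the AR scene."""
    step_number: int
    instruction: str
    anchor_part: str  # part ID the AR client would highlight


def generate_instructions(task_description: str) -> List[ARStep]:
    """Placeholder for a generative-AI call.

    In a real pipeline this function would prompt an LLM to emit JSON steps;
    here it returns a canned example so the sketch stays runnable.
    """
    return [
        ARStep(1, "Locate the mounting bracket on the fixture plate.", "bracket_01"),
        ARStep(2, "Insert and hand-tighten the four M6 bolts.", "bolt_m6"),
        ARStep(3, "Torque each bolt to specification in a cross pattern.", "bolt_m6"),
    ]


if __name__ == "__main__":
    steps = generate_instructions("Mount the bracket assembly on the fixture plate.")
    # An AR client (e.g., a HoloLens app) could consume this JSON and spawn one label per step.
    print(json.dumps([asdict(s) for s in steps], indent=2))
```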

 

P&G Metaverse

Title: Leveraging Metaverse Platforms for Enhanced Global Professional Networking – Phase 1

Ming Tang, Principal Investigator

Amount: $32,416

Funding: P&G Digital Accelerator.

Goal: Apply metaverse technologies for global engagement, with a focus on human-computer interaction, digital humans, and immersive visualization.

  • P&G Team: Elsie Urdaneta, Sam Azeba, Paula Saldarriaga, JinnyLe HongOanh
  • UC Team: Ming Tang. Students: Nathaniel Brunner, Ahmad Alrefai, Sid Urankar


Digital Twin, LLM & IIoT

IIoT for legacy and intelligent factory machines with XR and LLM feedback, including a Digital Twin demonstration of real-time IoT for architecture/building applications using Omniverse.

  • PIs: Sam Anand, Ming Tang

$40,000. UC Industry 4.0/5.0 Institute Consortium Research Project: 01.2024-01.2025

The project centers on the development of a Digital Twin (DT) and a multi-agent Large Language Model (LLM) framework designed to access and interpret real-time and historical data through an Industrial Internet of Things (IIoT) platform. Real-time data is sourced from legacy machines and smart machines, integrating Building Information Modeling (BIM) with environmental sensors. The multi-agent LLM framework comprises specialized agents and supports diverse user interfaces, including screen-based systems, Virtual Reality (VR) environments, and mobile devices, enabling versatile interaction, data visualization, and analysis.
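For illustration, here is a minimal, self-contained Python sketch of the multi-agent idea described above; the agent names, keyword routing, and responses are assumptions for readability, not the project's implementation, and a production system would let an LLM perform the routing and compose the final answer.

```python
from dataclasses import dataclass


@dataclass
class Query:
    text: str


class BIMAgent:
    """Answers questions about static building/model data (hypothetical)."""
    keywords = ("room", "wall", "bim", "floor")

    def handle(self, q: Query) -> str:
        return f"[BIM] looked up model elements for: {q.text}"


class SensorAgent:
    """Answers questions from real-time IIoT sensor streams (hypothetical)."""
    keywords = ("temperature", "humidity", "now", "current")

    def handle(self, q: Query) -> str:
        return f"[Sensors] fetched latest readings for: {q.text}"


class HistoryAgent:
    """Answers questions from archived/historical data (hypothetical)."""
    keywords = ("last week", "trend", "history", "average")

    def handle(self, q: Query) -> str:
        return f"[History] queried archived data for: {q.text}"


def route(q: Query, agents) -> str:
    """Pick the first agent whose keywords match; an LLM could do this step instead."""
    for agent in agents:
        if any(k in q.text.lower() for k in agent.keywords):
            return agent.handle(q)
    return "No agent matched; fall back to a general LLM response."


if __name__ == "__main__":
    agents = [BIMAgent(), SensorAgent(), HistoryAgent()]
    print(route(Query("Which rooms are on the third floor of the BIM model?"), agents))
    print(route(Query("What is the current temperature in the lab?"), agents))
    print(route(Query("What was the average occupancy last week?"), agents))
```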

Environmental sensors for the Digital Twin model. XR-Lab and SM-Lab at the Digital Futures Building.

Integration of reality capture, IoT, and LLM into a digital twin model.

 

Digital Twin of the Digital Futures Building.

Primary Objective: To develop a conversational large language model system that acquires legacy-machine, digital-machine, environmental, real-time, and historical data within an IIoT environment to create a digital twin that assists with real-time maintenance and support (application use case: the Digital Futures Building).
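To make the data-acquisition side concrete, the short sketch below (with simulated readings and illustrative channel names, not the deployed system) shows one way to buffer real-time sensor values alongside a bounded history so a conversational layer could answer both "current value" and "recent trend" questions.

```python
import random
import time
from collections import defaultdict, deque


class TwinDataStore:
    """Keeps the latest value and a bounded history per sensor channel."""

    def __init__(self, history_size: int = 100):
        self.history = defaultdict(lambda: deque(maxlen=history_size))

    def ingest(self, channel: str, value: float) -> None:
        """Record one timestamped reading for a channel."""
        self.history[channel].append((time.time(), value))

    def latest(self, channel: str):
        """Return the most recent (timestamp, value) pair, or None if empty."""
        return self.history[channel][-1] if self.history[channel] else None

    def recent(self, channel: str, n: int = 5):
        """Return up to the last n readings for trend-style questions."""
        return list(self.history[channel])[-n:]


if __name__ == "__main__":
    store = TwinDataStore()
    # Simulated IIoT feed; a real deployment would subscribe to machine/sensor topics.
    for _ in range(10):
        store.ingest("lab_temperature_c", 21.0 + random.uniform(-0.5, 0.5))
    print("latest:", store.latest("lab_temperature_c"))
    print("recent:", store.recent("lab_temperature_c", 3))
```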


1. Autodesk DT

Live Demo with Autodesk DT.  Please get in touch with Ming Tang to request a password. 

  


2. Unreal + Omniverse DT


Demonstration videos

 

The original BIM model was provided by GBBN and Messer Construction. Copyright 2024.

Students: Sourabh Deshpande, Anuj Gautam, Manish Raj Aryal, Mikhail Nikolaenko, Aayush Kumar, Eian Bennett, Ahmad Alrefai.

CVG HOLO

CVG-HOLO – Wayfinding Hologram Project

XR-Lab is working with Cincinnati/Northern Kentucky International Airport (CVG), in collaboration with the UC Center for Simulations & Virtual Environments Research, to:

  1. Develop and demonstrate a wayfinding hologram.
  2. Evaluate the hologram signage’s performance to augment passengers’ wayfinding experience.
  3. Develop concepts for the Concourse-B store renovation, integrating emerging digital technologies related to Extended Reality.
  4. Develop a digital twin model of the CVG Concourse-B store area.

The project will apply various methods, including eye-tracking, motion capture, motion tracking, and computer vision.

Hologram. Reference image from SVG news, 10.2023.

Project Client: Josh Edwards, Sr. Manager, Innovation, Cincinnati/Northern Kentucky International Airport

UC Team:

  • eXtended Reality Lab: Ming Tang, Director, eXtended Reality Lab, Digital Futures. tangmg@ucmail.uc.edu
  • UCSIM Project Lead: Chris M. Collins, Director, Center for Simulations & Virtual Environments Research
  • ARCH 7014 students, Fall 2023

Concept of a hologram at CVG, by students in ARCH 7014, Fall 2023, UC.

Thanks for the support from the UHP Discovery Summer program. 

Check out more on the student projects and eye-tracking analysis for the CVG renovation, or the wayfinding research projects and publications at XR-Lab.