Posts

GRO

Virtual Reality Training on Issues of Youth Firearm Possession

PI: Ming Tang. 8/5/2024-8/4/2025.

 

Funded by the God.Restoring.Order (GRO) Community, this research project will develop two VR scenarios that simulate environments designed to educate youth on applying critical skills in risky situations.

Team: Ming Tang (XR-Lab), Aaron Mallory (GRO).

XR-Lab students: Aayush Kumar, Mario Bermejo, Jonny Peng, Ahmad Alrefai, Rohit Ramesh, Charlotte Bodie 

The XR-Lab collaborated with the GRO community to leverage advanced extended reality (XR) technologies in the development of a virtual reality (VR) training application designed to strengthen the curriculum by reinforcing key competencies through immersive learning activities. In partnership, we evaluated the feasibility of integrating VR technology into the GRO training program, providing participants with an engaging narrative framework while equipping them with practical knowledge applicable to real-world contexts. The immersive VR scenarios addressed high-risk situations, including firearm possession and substance use, thereby creating a controlled environment for experiential learning and skill development.

GenAI+AR Siemens

Automatic Scene Creation for Augmented Reality Work Instructions Using Generative AI. Siemens. PI: Ming Tang. Co-PI: Tianyu Jiang. $25,000. UC. 4/1/2024-12/31/2024.

Students: Aayush Kumar, Mikhail Nikolaenko, Dylan Hutson.

Sponsor: Siemens through UC MME Industry 4.0/5.0 Institute

The project investigates the integration of LLM-based generative AI with HoloLens-based training.
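As a rough illustration of the workflow (not the project's implementation), the sketch below shows how an LLM could be prompted to return a structured scene description that a HoloLens client might consume as work instructions; the `call_llm` helper and the JSON schema are hypothetical placeholders.

```python
# Minimal sketch (illustrative only, not the project's implementation) of prompting
# an LLM for a structured AR work-instruction scene that a HoloLens app could load.
# `call_llm` is a hypothetical stand-in for whatever Gen-AI service is used.
import json

SCENE_PROMPT = """Generate an AR work-instruction scene as JSON with fields:
steps: list of {id, text, anchor}  # anchor names a tracked part on the machine
Return only JSON."""

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g., a chat-completion request)."""
    raise NotImplementedError("wire this to the Gen-AI service of choice")

def build_scene(task: str) -> dict:
    """Ask the LLM for a scene description and parse it for the AR client."""
    raw = call_llm(f"{SCENE_PROMPT}\nTask: {task}")
    scene = json.loads(raw)          # fails loudly if the model strays from JSON
    assert "steps" in scene, "LLM response missing 'steps'"
    return scene
```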

P&G Metaverse

Title: Leveraging Metaverse Platforms for Enhanced Global Professional Networking – Phase 1

Ming Tang, Principal Investigator

Amount: $32,416

Funding: P&G Digital Accelerator.

Goal: Metaverse technologies for global engagement, human-computer interaction, digital humans, and immersive visualization.

  • P&G Team: Elsie Urdaneta, Sam Azeba, Paula Saldarriaga, JinnyLe HongOanh
  • UC Team: Ming Tang. Students: Nathaniel Brunner, Ahmad Alrefai, Sid Urankar

The metaverse has opened unprecedented opportunities for communication, collaboration, and social engagement, offering innovative solutions for professional networking and conferencing. In Phase 1 of the “Leveraging Metaverse Platforms for Enhanced Global Professional Networking” project, conducted in partnership with P&G, the XR-Lab carried out a pilot study, “Surveying the Current Metaverse Landscape,” identifying and analyzing the top five metaverse platforms best suited for professional networking and conferencing. The UC team evaluated each platform’s potential for replicating global in-person professional networking experiences and supporting multi-user online conferences.




Digital Twin, LLM & IIOT

IIoT for legacy and intelligent factory machines with XR and LLM feedback, plus a Digital Twin demonstration of real-time IoT for architecture/building applications using Omniverse.

  • PIs: Sam Anand, Ming Tang.
  • Students: Anuj Gautama, Mikhail Nikolaenko, Ahmad Alrefai, Aayush Kumar, Manish Raj Aryal, Eian Bennett, Sourabh Deshpande

$40,000. UC Industry 4.0/5.0 Institute Consortium Research Project: 01.2024-01.2025

The project centers on the development of a Digital Twin (DT) and a multi-agent Large Language Model (LLM) framework designed to access and interpret real-time and historical data through an Industrial Internet of Things (IIoT) platform. Real-time data is sourced from legacy machines and smart machines, integrating Building Information Modeling (BIM) with environmental sensors. The multi-agent LLM framework comprises specialized agents and supports diverse user interfaces, including screen-based systems, Virtual Reality (VR) environments, and mobile devices, enabling versatile interaction, data visualization, and analysis.
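As a minimal sketch of the routing idea (hypothetical names, not the project's code), the example below shows how specialized agents could split responsibility between live IIoT readings and historical queries, with a simple router choosing between them; an LLM-based router would replace the keyword check in practice.

```python
# Minimal sketch (hypothetical, not the project's code) of a multi-agent routing
# layer: a "real-time" agent reads the latest IIoT values, a "historian" agent
# queries stored readings, and a router picks the agent based on the question.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Agent:
    name: str
    handler: Callable[[str], str]   # takes a user question, returns an answer

def realtime_agent(question: str) -> str:
    # Placeholder: would read the latest sensor values from the IIoT platform.
    return "latest spindle temperature: 62.5 C (placeholder value)"

def historian_agent(question: str) -> str:
    # Placeholder: would query stored time-series data for trends and history.
    return "average temperature over last 24 h: 58.1 C (placeholder value)"

AGENTS: Dict[str, Agent] = {
    "realtime": Agent("realtime", realtime_agent),
    "historian": Agent("historian", historian_agent),
}

def route(question: str) -> str:
    """Naive keyword routing; an LLM-based router could replace this."""
    words = ("history", "trend", "average")
    key = "historian" if any(w in question.lower() for w in words) else "realtime"
    return AGENTS[key].handler(question)

print(route("What is the current spindle temperature?"))
```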

The research evaluates leading DT platforms—Autodesk Tandem, NVIDIA Omniverse, and Unreal Engine—focusing on their capabilities to integrate IoT and BIM data while supporting legacy machine systems.  Autodesk Tandem excelled in seamlessly combining BIM metadata with real-time IoT streams for building operations and system scalability.  NVIDIA Omniverse demonstrated unmatched rendering fidelity and collaborative features through its Universal Scene Description (USD) framework. Unreal Engine, notable for its immersive visualization, proved superior for LLM integration, leveraging 3D avatars and conversational AI to enhance user interaction.
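For the Omniverse/USD side, a minimal sketch (assumed workflow, not the project's code) of how an IIoT bridge might write a sensor reading into a USD stage is shown below; it uses the open-source `pxr` Python bindings, and the prim and attribute names are hypothetical.

```python
# Minimal sketch (assumed workflow, not the project's code) of writing an IoT
# reading into a USD stage, the scene format Omniverse collaborates on.
from pxr import Usd, UsdGeom, Sdf

stage = Usd.Stage.CreateNew("factory_dt.usda")            # new layer for the digital twin
machine = UsdGeom.Xform.Define(stage, "/Factory/CNC_01")  # placeholder machine prim

# Custom attribute that an IIoT bridge could update with the latest reading.
temp = machine.GetPrim().CreateAttribute("iiot:spindleTemp", Sdf.ValueTypeNames.Float)
temp.Set(62.5)                                            # a live feed would set this value

stage.GetRootLayer().Save()
```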


CVG HOLO

CVG-HOLO – WAYFINDING HOLOGRAM PROJECT

The XR-Lab is working with Cincinnati/Northern Kentucky International Airport (CVG), in collaboration with the UC Center for Simulations & Virtual Environments Research, to:

  1. Develop and demonstrate a wayfinding hologram.
  2. Evaluate the hologram signage’s performance in augmenting passengers’ wayfinding experience.
  3. Develop concepts for the Concourse-B store renovation, integrating emerging digital technologies related to Extended Reality.
  4. Develop a digital twin model of the CVG Concourse-B store area.

The project will apply various methods, including eye-tracking, motion capture, motion tracking, and computer vision.
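As a simplified illustration of one such eye-tracking measure (hypothetical data format and areas of interest, not the project's pipeline), the sketch below accumulates dwell time per signage region from raw gaze samples.

```python
# Simplified sketch (hypothetical data format) of one common eye-tracking metric:
# total dwell time per area of interest (AOI), e.g. a hologram sign vs. static signage.
from typing import Dict, List, Tuple

# Each gaze sample: (timestamp in seconds, x, y) in screen or scene coordinates.
GazeSample = Tuple[float, float, float]
# Each AOI: (name, x_min, y_min, x_max, y_max) bounding box; coordinates are assumed.
AOIS = [("hologram_sign", 100, 200, 300, 400), ("static_sign", 500, 200, 700, 400)]

def dwell_time(samples: List[GazeSample]) -> Dict[str, float]:
    """Accumulate time spent inside each AOI, using inter-sample time deltas."""
    totals = {name: 0.0 for name, *_ in AOIS}
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        for name, x0, y0, x1, y1 in AOIS:
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += t1 - t0
                break
    return totals
```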

Hologram. Reference Image from SVG news. 10.2023

Project Client: Josh Edwards, Sr. Manager, Innovation, Cincinnati/Northern Kentucky International Airport

UC Team:

  • eXtended Reality Lab: Ming Tang, Director, eXtended Reality Lab, Digital Futures. tangmg@ucmail.uc.edu
  • UCSIM Project Lead: Chris M. Collins, Director, Center for Simulations & Virtual Environments Research
  • ARCH 7014 students, Fall 2023

Concept of hologram in CVG, by students in ARCH 7014, Fall 2023, UC.

Thanks for the support from the UHP Discovery Summer program. 

Check out more on the student projects and eye-tracking analysis of the CVG renovation, or the wayfinding research projects and publications at XR-Lab.