
SMART-DT

SMART-DT: Scalable Multi-Agent Reinforcement Learning and Collaborative AI for Digital Twin Platform of Infrastructure and Facility Operations.

Principal Investigators:

  • Prof. Sam Anand, Department of Mechanical Engineering, CEAS
  • Prof. Ming Tang, Extended Reality Lab, Digital Futures, DAAP

Grant: $40,000. UC Industry 4.0/5.0 Institute Consortium Research Project, 03/2025 – 01/2026

In this phase, we will develop a scalable digital twin that integrates machine-, factory-, and city-level data with AI-driven, real-time decision-making. The key questions we aim to answer are:

  • Can a high-fidelity Digital Twin (DT) be efficiently built using only image and video data?
  • How can multiple specialized Large Language Model (LLM) agents—at machine, factory, and city levels—collaborate to generate relevant insights?
  • How effective is synthetic data from a Digital Twin for object detection and process recognition?
  • Does combining traditional Machine Learning (ML) with Large Language Models (LLMs) improve decision-making in complex manufacturing operations?

The project’s primary goal is to create a scalable, cloud-based digital twin that enhances operational efficiency through AI-driven insights. Additional technical objectives include:

  • Using advanced reality capture techniques (e.g., Gaussian Splatting) to build a Digital Twin from images and videos and simulate fault scenarios at factory and data center levels.
  • Integrating an IIoT data framework to track material flow, process handling, operational metrics, and equipment status for seamless cloud-based analysis.
  • Developing a synthetic data capture system using a simulated drone within the Digital Twin to train reinforcement learning models for fault prediction.
  • Designing a multi-agent AI system combining LLMs, machine learning, and reinforcement learning to enable dynamic communication, prediction, and diagnostics in the factory; a minimal conceptual sketch of this agent hierarchy follows below.
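
As an illustration of the agent hierarchy described in the last objective, the following is a minimal, hypothetical Python sketch of machine-, factory-, and city-level agents passing messages. The class names, the vibration threshold, and the message format are assumptions for illustration only, not the SMART-DT implementation; in the actual system each agent could wrap an LLM, an ML model, or an RL policy.

```python
# Hypothetical sketch of hierarchical multi-agent collaboration.
# Class names, the vibration threshold, and the message format are
# illustrative assumptions, not the SMART-DT implementation.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Message:
    sender: str    # e.g. "machine:cnc-01"
    topic: str     # e.g. "vibration_anomaly"
    payload: Dict  # observation or recommendation details


class MachineAgent:
    """Watches one machine's IIoT stream and reports anomalies upward."""

    def __init__(self, name: str):
        self.name = name

    def observe(self, vibration_rms: float) -> List[Message]:
        # Placeholder rule; a real agent could wrap an LLM or ML model here.
        if vibration_rms > 0.8:
            return [Message(f"machine:{self.name}", "vibration_anomaly",
                            {"rms": vibration_rms})]
        return []


class FactoryAgent:
    """Aggregates machine reports and decides what to escalate."""

    def triage(self, reports: List[Message]) -> List[Message]:
        return [Message("factory:plant-A", "maintenance_request",
                        {"source": m.sender, **m.payload})
                for m in reports if m.topic == "vibration_anomaly"]


class CityAgent:
    """Sees factory-level requests alongside city-scale context (e.g., grid load)."""

    def advise(self, requests: List[Message]) -> List[str]:
        return [f"Schedule maintenance for {r.payload['source']}" for r in requests]


if __name__ == "__main__":
    machines = [MachineAgent("cnc-01"), MachineAgent("press-02")]
    factory, city = FactoryAgent(), CityAgent()

    # Two sensor readings: the first exceeds the toy anomaly threshold.
    reports = [msg for agent, rms in zip(machines, [0.9, 0.3])
               for msg in agent.observe(rms)]
    print(city.advise(factory.triage(reports)))
    # -> ['Schedule maintenance for machine:cnc-01']
```

In this toy example the machine agents flag anomalies from a single sensor reading, the factory agent escalates them as maintenance requests, and the city agent returns scheduling advice; the same message-passing pattern would scale to richer observations and model-backed agents.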


Last year’s project: IIoT for legacy and intelligent factory machines with XR and LLM feedback, and a Digital Twin demonstration of real-time IoT for architecture/building applications using Omniverse.

P&G VISM

Project VISM (Virtual Interactive Simulation for Material Customization)
Interactive Visualization with User-Controlled, Procedural, and Physically Based Material Customization.

PI: Ming Tang.

Sponsor: P&G. 12/01/2024 – 5/31/2025

GRO

Virtual Reality Training on Issues of Youth Firearm Possession.

PI: Ming Tang. 8/5/2024 – 8/4/2025.

Funded by the God.Restoring.Order (GRO) Community, this research project will develop two VR scenarios that simulate environments designed to educate youth on applying critical skills in risky situations.

Team: Ming Tang (XR-Lab) and Aaron Mallory (GRO).

XR-Lab students: Aayush Kumar, Mario Bermejo, Jonny Peng, Amad Alrefai

The XR-Lab is excited to collaborate with the GRO community, leveraging cutting-edge XR technologies to develop a virtual reality (VR) training app that enhances the curriculum by reinforcing key skills through immersive VR activities. Together, we will assess the feasibility of integrating VR technology into GRO’s training program, engaging users with a compelling narrative while equipping them with practical knowledge for real-world application.

GenAI+AR Siemens

Automatic Scene Creation for Augmented Reality Work Instructions Using Generative AI. Siemens. PI: Ming Tang. Co-PI: Tianyu Jiang. $25,000. UC. 4/1/2024 – 12/31/2024

Students: Aayush Kumar, Mikhail Nikolaenko, Dylan Hutson.

Sponsor: Siemens through UC MME Industry 4.0/5.0 Institute

Investigates the integration of LLM-based generative AI with HoloLens-based training.

P&G Metaverse

Title: Leveraging Metaverse Platforms for Enhanced Global Professional Networking – Phase 1

Ming Tang, Principal Investigator

Amount: $32,416

Funding: P&G Digital Accelerator.

Goal: Metaverse technologies for global engagement, with a focus on human-computer interaction, digital humans, and immersive visualization.

  • P&G Team: Elsie Urdaneta, Sam Azeba, Paula Saldarriaga, JinnyLe HongOanh
  • UC Team: Ming Tang; Students: Nathaniel Brunner, Ahmad Alrefai, Sid Urankar

In Phase 1 of the “Leveraging Metaverse Platforms for Enhanced Global Professional Networking” project, conducted in partnership with P&G, the XR-Lab proposes a pilot study aimed at identifying and analyzing the top five metaverse platforms best suited for professional networking and conferencing. 

Example of XR-Lab in Hyperspace