SMAT: Scalable Multi-Agent AI for DT

SMAT: Scalable Multi-Agent Machine Learning and Collaborative AI for Digital Twin Platform of Infrastructure and Facility Operations.

Principal Investigators:

  • Prof. Sam Anand, Department of Mechanical Engineering, CEAS
  • Prof. Ming Tang, Extended Reality Lab, Digital Futures, DAAP

Students: Anuj Gautam, Manish Aryal, Aayush Kumar, Ahmad Alrefai, Rohit Ramesh, Mikhail Nikolaenko, Bozhi Peng

Grant: $40,000. UC Industry 4.0/5.0 Institute Consortium Research Project, 03/2025–01/2026.

P&G VISM

Project VISM (Virtual Interactive Simulation for Material Customization):
Interactive Visualization with User-Controlled, Procedural, and Physically Based Material Customization.

PI: Ming Tang.

P&G Team: Kim Jackson, Andrew Fite, Fei Wang, Allison Roman

UC Team: Ming Tang, Aayush Kumar, Yuki Hirota

Sponsor: P&G. 12/01/2024 – 5/31/2025

Amount: $28,350

VR Training on Issues of Youth Firearm Possession

Virtual Reality Training on Issues of Youth Firearm Possession.

PI: Ming Tang. 8/5/2024–8/4/2025.

Funded by the God.Restoring.Order (GRO) Community, this research project will develop several VR scenarios that simulate environments designed to educate youth on applying critical skills in risky situations.

Team: Ming Tang, XR-Lab, Aaron Mallory, GRO.

XR-Lab students: Aayush Kumar, Mario Bermejo, Jonny Peng, Ahmad Alrefai, Rohit Ramesh, Charlotte Bodie

The XR-Lab collaborated with the GRO community to leverage advanced extended reality (XR) technologies in the development of a virtual reality (VR) training application designed to strengthen the curriculum by reinforcing key competencies through immersive learning activities. In partnership, we evaluated the feasibility of integrating VR technology into the GRO training program, providing participants with an engaging narrative framework while equipping them with practical knowledge applicable to real-world contexts. The immersive VR scenarios addressed high-risk situations, including firearm possession and substance use, thereby creating a controlled environment for experiential learning and skill development.

The XR-Lab has harnessed advanced motion capture technology in this project to translate the movements of real people into lifelike digital characters. Every gesture, shift in posture, and subtle facial expression is carefully recorded and mapped to ensure authenticity and emotional depth in the virtual environment.

Our development team has worked closely and continuously with the GRO community, engaging in multiple motion studies, rehearsals, and testing sessions. This collaboration allows us to capture not just movement, but the nuance behind each action — the personality conveyed through body language, and the emotional context embedded in facial expression.

Through this process, the digital characters become more than avatars; they become authentic extensions of human experience, reflecting the stories behind them. The result is an immersive, emotionally resonant experience where technology and humanity move together.

 

GenAI+AR Siemens

Automatic Scene Creation for Augmented Reality Work Instructions Using Generative AI. PI: Ming Tang. Co-PI: Tianyu Jiang. $25,000. UC. 4/1/2024–12/31/2024.

Students: Aayush Kumar, Mikhail Nikolaenko, Dylan Hutson.

Sponsor: Siemens through UC MME Industry 4.0/5.0 Institute

The project investigates the integration of LLM-based generative AI with HoloLens-based training.

P&G Metaverse

Title: Leveraging Metaverse Platforms for Enhanced Global Professional Networking – Phase 1

Ming Tang, Principal Investigator

Amount: $32,416

Funding: P&G Digital Accelerator.

Goal: Explore metaverse technologies for global engagement, spanning human-computer interaction, digital humans, and immersive visualization.

  • P&G Team: Elsie Urdaneta, Sam Azeba, Paula Saldarriaga, JinnyLe HongOanh
  • UC Team: Ming Tang; students: Nathaniel Brunner, Ahmad Alrefai, Sid Urankar

The metaverse has opened unprecedented opportunities for communication, collaboration, and social engagement, offering innovative solutions for professional networking and conferencing. In Phase 1 of the “Leveraging Metaverse Platforms for Enhanced Global Professional Networking” project, conducted in partnership with P&G and titled “Surveying the Current Metaverse Landscape,” the XR-Lab carried out a pilot study in which the UC team examined the five metaverse platforms best suited for professional networking and conferencing, evaluating their potential for replicating global in-person networking experiences and supporting multi-user online conferences.

This phase provided a systematic assessment of the leading platforms suitable for professional networking and conferencing. The team analyzed how these platforms leverage emerging technologies to create immersive experiences, such as hosting online conferences and facilitating discussions with global experts. The Phase 1 final report delivered a comprehensive evaluation of the strengths and limitations of each platform, along with recommendations for platform selection based on specific criteria such as audience size, interactivity, and technical requirements.

Additionally, the UC team demonstrated how platforms such as Mesh and Hyperspace can be customized for specific events, including tailored branding and interactive features designed to enhance professional networking and conferencing. These demonstrations provide valuable insights for P&G, highlighting how metaverse technologies can be strategically developed and adopted to expand global engagement and enable innovative event experiences.

Customized Metaverse Demo developed by the UC Team.

Please reach out to the P&G Team for the final report.
