Posts

Digital Twin of Cincinnati

A real-time flythrough demo of the Digital Twin of the City of Cincinnati

Digital Futures Building at the University of Cincinnati

A demo in which users destroy alien buildings near the UC campus. Project developed by students Cooper Pflaum and Nishanth Chidambaram.

Digital Twin, LLM & IIOT

IIoT for legacy and intelligent factory machines with AR and LLM feedback, together with a Digital Twin demonstration of real-time IoT for architecture/building applications using Omniverse.

  • PI: Prof. Sam Anand (Director of Smart-Manufacturing Lab, Dept. of Mechanical Engineering, CEAS)
  • Co-PI: Prof. Ming Tang (Director of XR-Lab, School of Architecture & Interior Design, DAAP)

$40,000. UC Industry 4.0/5.0 Institute Consortium Research Project: 01.2024-01.2025

Primary Objective: To develop a conversational large language model (LLM) system that acquires data from legacy and digital machines, along with environmental, real-time, and historical data, within an IIoT environment to create a Digital Twin that assists with real-time maintenance and support (application use case: the Digital Futures Building).
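To make the pipeline concrete, the Python sketch below is a minimal, hypothetical illustration (not the project's implementation): it buffers simulated real-time machine readings, combines them with a historical baseline, and assembles a conversational maintenance prompt. The sensor names, thresholds, and the ask_llm stub are placeholders.

```python
from collections import deque
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    machine_id: str
    sensor: str
    value: float
    timestamp: float

# Rolling buffer of recent IIoT readings (in practice these would arrive
# from legacy-machine gateways, OPC UA servers, or MQTT brokers).
recent: deque = deque(maxlen=500)

def ask_llm(prompt: str) -> str:
    """Placeholder for a call to a conversational LLM endpoint."""
    return f"[LLM response to: {prompt[:60]}...]"

def maintenance_query(machine_id: str, question: str, historical_mean: float) -> str:
    # Fuse real-time and historical context into a single conversational prompt.
    values = [r.value for r in recent if r.machine_id == machine_id]
    current = mean(values) if values else historical_mean
    prompt = (
        f"Machine {machine_id}: current spindle temperature {current:.1f} C, "
        f"historical mean {historical_mean:.1f} C.\n"
        f"Operator question: {question}"
    )
    return ask_llm(prompt)

# Example: a few simulated readings followed by an operator query.
for i, v in enumerate([61.2, 63.8, 67.5]):
    recent.append(Reading("CNC-01", "spindle_temp", v, float(i)))
print(maintenance_query("CNC-01", "Is the spindle overheating?", historical_mean=58.0))
```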

Students: Sourabh Deshpande, Anuj Gautam, Manish Raj Aryal, Mikhail Nikolaenko, Aayush Kumar, Eian Bennett

 

Cloud-based Digital Twin

Clients are one click away from interacting with a Digital Twin model on their personal devices. No installation is required.

The XR-Lab’s project showcases a cloud-based Digital Twin (DT) model, designed for accessibility and interaction via mobile devices. This advanced DT model allows multiple users to engage with its complex features directly through touch screens, eliminating the need for app installations. Clients can effortlessly access the content using a simple URL in a web browser on their personal iOS or Android mobile devices and tablets. The project is distinguished by its photorealistic renderings, which are streamed to clients at high frame rates, ensuring a visually rich and seamless experience. Furthermore, our DT model is an integration of various cutting-edge technologies, including Building Information Modeling (BIM), Metadata, IoT sensor data, 360-degree images/videos, and web3D content, creating a comprehensive and interactive digital environment.
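As a simplified illustration of the URL-based access pattern (hypothetical, not the project's actual streaming stack), the Python sketch below exposes merged BIM metadata and the latest IoT reading for a building element through a small Flask endpoint that any mobile browser could fetch.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical BIM metadata and latest sensor values, keyed by element ID.
BIM_METADATA = {
    "wall-204": {"category": "Wall", "level": "Level 2", "material": "Concrete"},
}
LATEST_IOT = {
    "wall-204": {"temperature_c": 22.4, "humidity_pct": 41.0},
}

@app.route("/twin/<element_id>")
def twin_element(element_id):
    """Return merged BIM metadata and live sensor data for one element."""
    if element_id not in BIM_METADATA:
        return jsonify({"error": "unknown element"}), 404
    return jsonify({
        "element": element_id,
        "bim": BIM_METADATA[element_id],
        "iot": LATEST_IOT.get(element_id, {}),
    })

if __name__ == "__main__":
    # A device on the network could then open http://<host>:5000/twin/wall-204
    app.run(host="0.0.0.0", port=5000)
```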

 

More information on Future of Work: Understanding the interrelationships between humans and technology to improve the quality of work-life in smart buildings.

Industry 4.0/5.0 grant

 

Immersive vs. Traditional Training – a comparison of training modalities

PIs: Tamara Lorenz, Ming Tang

  • Dr. Tamara Lorenz. Associate Professor. Embodied Interactive Systems Lab, Industry 4.0 & 5.0 Institute (I45I), Center for Cognition, Action, and Perception (CAP)
  • Ming Tang. Professor. Extended Reality Lab, Industry 4.0 & 5.0 Institute (I45I), Institute for Research in Sensing (IRiS)

Consortium Research Project: evaluating the effectiveness of an immersive training protocol against different traditional training modalities.

Grant: $40,000. UC Industry 4.0/5.0 Institute, 01.2023-01.2024

Open Questions

  • Is immersive training as effective as, or more effective than, traditional training?
  • Is immersive training beneficial for specific types of training (e.g., skills and behavior), while other modalities are better suited to other types (e.g., knowledge acquisition)?
  • Does the benefit of immersive VR training warrant the initial investment in equipment and the subsequent investment in building, running, and maintaining training projects?

Proposal

  • Evaluation of the effectiveness of an immersive training protocol against different traditional training modalities. 
  • Evaluation of modality-dependent benefits for different learning goals. 
  • Derivation of assessment metrics for VR training against other training modalities. 
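As one possible, purely illustrative starting point for such metrics, the Python sketch below compares hypothetical post-training assessment scores between a VR group and a traditional-training group using a Welch t-test; the scores are invented for illustration.

```python
from statistics import mean, stdev
from scipy.stats import ttest_ind

# Hypothetical post-training assessment scores (0-100) for two modalities.
vr_scores = [82, 88, 79, 91, 85, 77, 90, 84]
traditional_scores = [75, 80, 72, 78, 83, 70, 76, 79]

# Welch t-test (unequal variances) comparing the two training groups.
t_stat, p_value = ttest_ind(vr_scores, traditional_scores, equal_var=False)

print(f"VR: mean={mean(vr_scores):.1f}, sd={stdev(vr_scores):.1f}")
print(f"Traditional: mean={mean(traditional_scores):.1f}, sd={stdev(traditional_scores):.1f}")
print(f"Welch t-test: t={t_stat:.2f}, p={p_value:.3f}")
```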

Training scenario: DAAP Fire Evacuation

Traditional training with slides and maps.

VR training with an immersive and interactive experience.

 

 

Thanks to the Institute’s Industrial Advisory Board (IAB) and industry patrons, including Siemens, Kinetic Vision, John Deere, Stress Engineering Services, Innovative Numerics, and Ethicon.

Next Phase experiments

Multi-player test



 

Links

 2017 Virtual DAAP Fire Evacuation project.

 

At UC News

New UC institute looks ahead to ‘Industry 5.0’: UC will harness collective talent across campus to help companies solve new challenges. By Michael Miller, December 8, 2022.

 

 

NSF: Future of Work

Ming Tang served as a co-investigator on this NSF-funded project.

Future of Work: Understanding the interrelationships between humans and technology to improve the quality of work-life in smart buildings.

Grant #SES-2026594. PI: David W. Wendell. Co-PIs: Anton Harfmann, Michael Fry, Claudia Rebola. Co-Is: Pravin Bhiwapurkar, Ann Black, Annulla Linders, Tamara Lorenz, Nabil Nassif, John Seibert, Ming Tang, Nicholas Williams, and Danny T.Y. Wu. 01-01-2021 to 12-31-2021. National Science Foundation, $149,720. Award level: Federal.

 

The primary goal of this proposed planning project is to assemble a diverse, multidisciplinary team of experts dedicated to devising a robust methodology for the collection, analysis, and correlation of existing discipline-specific studies and data. This endeavor focuses on buildings and their occupants, aiming to unearth previously undiscovered interactions. Our research will specifically delve into the intricate interrelationships between four key areas: 1) the overall performance of buildings, 2) the indoor and outdoor environmental conditions, 3) the physical health of the occupants, and 4) their satisfaction with the work environment. This comprehensive approach is designed to provide a holistic understanding of the dynamic between buildings and the well-being of the individuals within them.
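As a toy illustration of the correlation step, the Python sketch below computes pairwise correlations across synthetic stand-ins for the four areas; the variable names and values are invented for illustration only.

```python
import pandas as pd

# Synthetic monthly observations for one building (illustrative only).
df = pd.DataFrame({
    "energy_use_kwh":     [4200, 3900, 4100, 4500, 4700, 4300],  # building performance
    "indoor_co2_ppm":     [650, 620, 700, 780, 820, 740],        # environmental conditions
    "occupant_sick_days": [3, 2, 4, 6, 7, 5],                    # occupant health
    "satisfaction_score": [4.1, 4.3, 3.9, 3.5, 3.3, 3.7],        # work-environment satisfaction
})

# Pairwise Pearson correlations across the four areas.
print(df.corr().round(2))
```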

 

Prof. Anton Harfmann developed the sensor towers.

 

Ming Tang spearheaded the development of a Digital Twin model, an innovative project integrating multiple historical sensor data sets into a comprehensive, interactive 3D model. This model encompasses several vital features: the capture, analysis, and visualization of historical data; cloud-based data distribution; seamless integration with Building Information Models (BIM); and an intuitive web user experience (UX). Building elements are extracted as metadata from the BIM model and then overlaid in screen-based and Virtual Reality (VR) interfaces, offering a multi-dimensional data view. See the Cloud-based Digital Twin project for a more in-depth exploration of this work.
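A minimal, hypothetical sketch of this kind of integration is shown below: BIM element metadata is joined with a historical sensor time series and aggregated per element, producing the kind of record a screen-based or VR overlay could display. The element IDs, file contents, and fields are placeholders, not the project's actual data.

```python
import pandas as pd

# Hypothetical BIM element metadata exported from the building model.
bim = pd.DataFrame({
    "element_id": ["room-101", "room-102"],
    "category":   ["Room", "Room"],
    "level":      ["Level 1", "Level 1"],
})

# Hypothetical historical sensor readings tagged with the element they belong to.
readings = pd.DataFrame({
    "element_id":    ["room-101", "room-101", "room-102", "room-102"],
    "timestamp":     pd.to_datetime(["2021-03-01 09:00", "2021-03-01 10:00",
                                     "2021-03-01 09:00", "2021-03-01 10:00"]),
    "temperature_c": [21.5, 22.1, 23.0, 23.4],
})

# Aggregate history per element and attach it to the BIM metadata, producing
# the kind of record a screen-based or VR overlay could display.
summary = readings.groupby("element_id")["temperature_c"].agg(["mean", "max"]).reset_index()
overlay = bim.merge(summary, on="element_id", how="left")
print(overlay)
```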

 

See more details on the Digital Twin workflow.