Cloud-based Digital Twin

Clients are one click away from interacting with a Digital Twin model on their personal devices. No installation is required.

The XR-Lab’s project showcases a cloud-based Digital Twin (DT) model designed for accessibility and interaction via mobile devices. The DT model allows multiple users to engage with its complex features directly through touch screens, with no app installation required: clients simply open a URL in a web browser on their personal iOS or Android phones and tablets. The project is distinguished by its photorealistic renderings, which are streamed to clients at high frame rates, ensuring a visually rich and seamless experience. Furthermore, our DT model integrates several technologies, including Building Information Modeling (BIM), metadata, IoT sensor data, 360-degree images/videos, and web3D content, creating a comprehensive and interactive digital environment.
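As a rough illustration of how live building data can reach the browser-based twin, the sketch below serves IoT sensor readings as a small JSON endpoint that a web viewer could poll and overlay on the streamed model. It is a minimal sketch only: the room IDs, sensor fields, and Flask service are illustrative assumptions, and the actual pipeline (BIM conversion, pixel streaming, web3D hosting) is not shown.

```python
# Minimal sketch (assumption: Flask is available; room IDs and sensor
# fields are placeholders, not the project's actual data model).
from datetime import datetime, timezone

from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for readings that would normally come from building IoT sensors.
FAKE_SENSORS = {
    "room-200": {"temperature_c": 22.5, "co2_ppm": 640, "occupancy": 3},
    "room-207": {"temperature_c": 21.8, "co2_ppm": 550, "occupancy": 1},
}

@app.route("/sensors/<room_id>")
def sensor_snapshot(room_id):
    """Return the latest readings for one room as JSON.

    A browser-based digital-twin viewer could poll this endpoint and
    overlay the values on the corresponding BIM geometry.
    """
    readings = FAKE_SENSORS.get(room_id)
    if readings is None:
        return jsonify({"error": "unknown room"}), 404
    return jsonify({
        "room": room_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "readings": readings,
    })

if __name__ == "__main__":
    app.run(port=8000)
```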

 

More information on Future of Work: Understanding the interrelationships between humans and technology to improve the quality of work-life in smart buildings.

Industry 4.0/5.0 grant

 

Immersive vs. Traditional Training – a comparison of training modalities

PIs: Tamara Lorenz, Ming Tang

  • Dr. Tamara Lorenz. Associate Professor. Embodied Interactive Systems Lab, Industry 4.0 & 5.0 Institute (I45I), Center for Cognition, Action, and Perception (CAP)
  • Ming Tang. Professor. Extended Reality Lab, Industry 4.0 & 5.0 Institute (I45I), Institute for Research in Sensing (IRiS)

Consortium research project: evaluating the effectiveness of an immersive training protocol against different traditional training modalities.

Grant: $40,000, funded by the UC Industry 4.0/5.0 Institute, 01.2023–01.2024.

Open Questions

  • Is immersive training as effective as, or better than, traditional training?
  • Is immersive training beneficial for specific types of training (e.g., skill or behavior), while other modalities are better suited to other types (e.g., knowledge acquisition)?
  • Does the benefit of immersive VR training warrant the initial investment in equipment and the subsequent investment in building, running, and maintaining projects?

Proposal

  • Evaluation of the effectiveness of an immersive training protocol against different traditional training modalities. 
  • Evaluation of modality-dependent benefits for different learning goals. 
  • Derivation of assessment metrics for VR training against other training modalities, as sketched below.
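As one way to make that last point concrete, the minimal sketch below compares a post-training assessment score between a VR group and a slides-based group using Welch's t-test. The scores are fabricated for illustration; the actual measures, sample sizes, and analysis plan are defined by the study, not by this snippet.

```python
# Illustrative only: fabricated scores, not study data.
from scipy import stats

# Post-training assessment scores (e.g., evacuation-route recall, 0-100).
vr_group = [82, 75, 90, 68, 88, 79, 85, 91, 73, 84]
slides_group = [70, 65, 78, 72, 69, 74, 81, 66, 77, 71]

# Welch's t-test (does not assume equal variances between the groups).
t_stat, p_value = stats.ttest_ind(vr_group, slides_group, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```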

Training scenario: DAAP Fire Evacuation

Traditional training with slides and maps.

VR training with an immersive and interactive experience.

 

 

Thanks to the Institute’s Industrial Advisory Board (IAB) and industry patrons, including Siemens, Kinetic Vision, John Deere, Stress Engineering Services, Innovative Numerics, and Ethicon.

Next-phase experiments

Multi-player test



 

Links

 2017 Virtual DAAP Fire Evacuation project.

 

In UC News

New UC institute looks ahead to ‘Industry 5.0’: UC will harness collective talent across campus to help companies solve new challenges. By Michael Miller, December 8, 2022.

 

 

Therapeutic Crisis Intervention Simulation

VR-based Employee Safety Training: Therapeutic Crisis Intervention Simulation

Grant:

  1. Virtual Reality for Employee Safety Training, Phase I. Sponsored research by the Cincinnati Children’s Hospital Medical Center. PI: Ming Tang. $16,631. Period: 06.2022–09.2022.
  2. Virtual Reality for Employee Safety Training: Therapeutic Crisis Intervention Simulation, Phase II. Sponsored research by the Cincinnati Children’s Hospital Medical Center. PI: Ming Tang. $22,365. Period: 02.2023–12.2023.

Under the leadership of Ming Tang, the XR-Lab is collaborating with the Cincinnati Children’s Hospital Medical Center (CCHMC) to develop a VR-based simulation to enhance employee safety training. This initiative involves creating a virtual hospital environment with AI-controlled characters to facilitate research on diverse scenarios encountered during therapeutic crisis interventions. A central feature of this simulation is the VR dialogue between a staff member and a teenage patient exhibiting aggressive behavior associated with mental illness. The primary objective is to equip staff members with the necessary skills to de-escalate tense situations effectively and adhere to appropriate protocols, thereby ensuring a safer and more controlled environment for staff and patients.
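To give a sense of how such a scenario can be driven, the hypothetical sketch below models the simulated patient's agitation as a small state machine that responds to the trainee's dialogue choices. The states, response options, and their effects are invented for illustration; the actual simulation runs in a game engine, and its dialogue content and scoring are defined with CCHMC's clinical team.

```python
# Hypothetical sketch of de-escalation dialogue logic; not the project's
# actual state machine, dialogue content, or scoring rubric.
AGITATION_LEVELS = ["calm", "agitated", "escalated"]

# How a (hypothetical) response style shifts the patient's agitation:
# negative values de-escalate, positive values escalate.
RESPONSE_EFFECTS = {
    "acknowledge_feelings": -1,
    "offer_choice": -1,
    "give_ultimatum": +1,
    "raise_voice": +1,
}

def next_state(current: str, response: str) -> str:
    """Move the simulated patient up or down the agitation scale."""
    idx = AGITATION_LEVELS.index(current) + RESPONSE_EFFECTS.get(response, 0)
    idx = max(0, min(idx, len(AGITATION_LEVELS) - 1))
    return AGITATION_LEVELS[idx]

# Example: the patient starts agitated; acknowledging the patient's feelings
# and offering a choice move the state toward "calm".
state = "agitated"
for choice in ["acknowledge_feelings", "offer_choice"]:
    state = next_state(state, choice)
    print(choice, "->", state)
```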

Team:

  • Ming Tang, Nancy Daraiseh, Maurizio Macaluso, Krista Keehn, Harley Davis, Aaron Vaughn, Katheryn Haller, Joseph Staneck, Emily Oehler
  • Employee Safety Learning Lab, CCHMC
  • Extended Reality (XR) Lab, UC

Fields of research: Virtual Reality, Safety Training, Therapeutic Crisis Intervention, Mental Health, Human Behavior Simulation

Screenshots from the mobile VR Quest 2 headset.

 

Visual Impairment Sim

Our research team at the Live Well Collaborative created a Visual Impairment Simulation VR prototype in 2021 to simulate glaucoma vision and peripheral vision loss. Glaucoma comprises a group of disorders characterized by glaucomatous optic nerve damage and visual field loss. It is a significant cause of blindness in the United States and is the most common cause of blindness among Black Americans. An estimated 1 million Americans over 65 years of age have experienced the loss of vision associated with glaucoma, and approximately 75 percent of persons who are legally blind because of glaucoma are over the age of 65.[1]

A prototype of the glaucoma VR simulation was developed by our team in 2021. A virtual kitchen scenario was created to allow users to experience the challenges faced by a person with visual impairment in an immersive environment. Hand-tracking technology on the Oculus Quest 2 was used to create interactions with virtual objects.
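Peripheral field loss of this kind can be approximated with a radial mask that keeps a central region visible and darkens everything outside it. The sketch below applies such a mask to an image array with NumPy; the radius and falloff values are arbitrary assumptions, and the actual prototype implements the effect inside the VR rendering pipeline rather than as post-processing in Python.

```python
# Rough approximation of glaucoma-style tunnel vision; radius and falloff
# values are arbitrary, and the real effect runs in the VR renderer.
import numpy as np

def apply_tunnel_vision(frame: np.ndarray, radius_frac: float = 0.35,
                        softness: float = 0.15) -> np.ndarray:
    """Darken the periphery of an H x W x 3 image, keeping the center visible."""
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Distance of each pixel from the image center, normalized by image size.
    dist = np.sqrt(((xs - w / 2) / w) ** 2 + ((ys - h / 2) / h) ** 2)
    # 1.0 inside the clear central radius, fading to 0.0 over the soft edge.
    mask = np.clip(1.0 - (dist - radius_frac) / softness, 0.0, 1.0)
    return (frame * mask[..., None]).astype(frame.dtype)

# Example on a synthetic gray frame (a stand-in for a rendered kitchen view).
frame = np.full((480, 640, 3), 200, dtype=np.uint8)
masked = apply_tunnel_vision(frame)
```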

Team: Ming Tang, Ryan Tinney, Alejandro Robledo, Tosha Bapat, Linda Dunseath, Matt Anthony @ Live Well Collaborative

Screen recording of VR prototype glaucoma scenarios in a virtual kitchen to study cooking activities.

  

Hand Tracking in VR.

[1] Pizzarello LD. The dimensions of the problem of eye disease among the elderly. Ophthalmology. 1987;94:1191–5.

XR-Lab Moved into Digital Futures

As the director of the Extended Reality lab (XR-Lab), I am thrilled to report that our XR-Lab has moved into the new Digital Futures Building at UC.

Our lab will continue to broaden the scope of its collaborations, drawing on our academic and professional expertise in Virtual Reality, Augmented Reality, and Mixed Reality. We look forward to continuing long-standing collaborative relationships with faculty at UC Digital Futures, the Criminology and Justice program at CECH, the Civil Engineering and transportation programs at CEAS, the Live Well Collaborative, the Council on Ageing, Cincinnati Insurance Company, and Cincinnati Children’s Hospital Medical Center.

Please visit our lab after August 2022 to check out the exciting lab space and facilities at the new UC Digital Futures Building.

Location:

Rooms: 200 (Smart Connected Wing) and 207 (VR/AR Center), Digital Futures Building

3044 Reading Road, Cincinnati, OH 45206

Request a tour

XR-Lab projects

Contact:

Ming Tang, tangmg@ucmail.uc.edu

Director of Extended Reality Lab, University of Cincinnati.