Clients are one click away from interacting with a Digital Twin model on their personal devices. No installation is required.
The XR-Lab’s project showcases a cloud-based Digital Twin (DT) model designed for access and interaction from mobile devices. The DT model allows multiple users to engage with its features directly through touch screens, with no app installation required: clients simply open a URL in a web browser on their personal iOS or Android phones and tablets. Photorealistic renderings are streamed to clients at high frame rates, providing a visually rich and responsive experience. The DT model integrates Building Information Modeling (BIM), metadata, IoT sensor data, 360-degree images/videos, and web3D content into a comprehensive, interactive digital environment.
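To illustrate the browser-only access described above, the following is a minimal sketch of how a web client might attach a cloud-rendered video stream using standard WebRTC browser APIs. The signaling endpoint URL and message format are hypothetical placeholders for illustration; they are not the project’s actual streaming stack.

```typescript
// Minimal sketch: receive a server-rendered video stream in the browser over WebRTC.
// The signaling URL and message shapes below are illustrative assumptions.

const video = document.createElement("video");
video.autoplay = true;
video.playsInline = true; // needed for inline playback on iOS Safari
document.body.appendChild(video);

const pc = new RTCPeerConnection();

// When the remote (cloud-rendered) video track arrives, show it in the page.
pc.ontrack = (event) => {
  video.srcObject = event.streams[0];
};

// Hypothetical signaling channel: exchange SDP offers/answers and ICE candidates.
const signaling = new WebSocket("wss://example-dt-server/signaling"); // placeholder URL
signaling.onmessage = async (msg) => {
  const data = JSON.parse(msg.data);
  if (data.type === "offer") {
    await pc.setRemoteDescription(data);
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    signaling.send(JSON.stringify(pc.localDescription));
  } else if (data.type === "candidate") {
    await pc.addIceCandidate(data.candidate);
  }
};
```

In a setup like this, touch input from the mobile browser would typically be sent back over a data channel so that user interactions drive the cloud-side DT renderer.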
Immersive vs. Traditional Training – a comparison of training modalities
PIs: Tamara Lorenz, Ming Tang
Dr. Tamara Lorenz. Associate Professor. Embodied Interactive Systems Lab, Industry 4.0 & 5.0 Institute (I45I), Center for Cognition, Action, and Perception (CAP)
Ming Tang. Professor. Extended Reality Lab, Industry 4.0 & 5.0 Institute (I45I), Institute for Research in Sensing (IRiS)
Consortium Research Project: evaluate the effectiveness of an immersive training protocol against different traditional training modalities.
Is immersive training equally as effective or better than traditional training?
Is immersive training beneficial for specific types of training (e.g., skill or behavior), while other modalities are better suited to others (e.g., knowledge acquisition)?
Does the benefit of immersive VR training warrant the initial investment in equipment and the subsequent investment in building, running, and maintaining projects?
Proposal
Evaluation of the effectiveness of an immersive training protocol against different traditional training modalities.
Evaluation of modality-dependent benefits for different learning goals.
Derivation of assessment metrics for VR training against other training modalities.
Training scenario: DAAP Fire Evacuation
Traditional training with slides and maps.
VR training with an immersive and interactive experience.
Thanks to the Institute’s Industrial Advisory Board (IAB) and industry patrons, including Siemens, Kinetic Vision, John Deere, Stress Engineering Services, Innovative Numerics, and Ethicon.
In 2021, our research team at the Live Well Collaborative created a Visual Impairment Simulation VR prototype that simulates glaucoma-related vision and peripheral vision loss. Glaucoma comprises a group of disorders characterized by optic nerve damage and visual field loss. It is a significant cause of blindness in the United States and the most common cause of blindness among Black Americans. An estimated 1 million Americans over 65 years of age have experienced vision loss associated with glaucoma, and approximately 75 percent of persons who are legally blind because of glaucoma are over the age of 65.[1]
A prototype of the glaucoma VR simulation was developed by our team in 2021. A virtual kitchen scenario allows users to experience the challenges of a visually impaired person in an immersive environment. Hand tracking on the Oculus Quest 2 was used to create interactions with virtual objects.
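To make the visual effect concrete, the sketch below (written in TypeScript purely for explanation; it is not the prototype’s actual implementation) shows the core idea behind a peripheral-vision-loss overlay: pixels are darkened as they move away from the gaze center, preserving a clear central field and occluding the periphery. The function name and radius values are illustrative assumptions.

```typescript
// Illustrative sketch of a tunnel-vision (peripheral vision loss) overlay.
// Not the prototype's actual implementation; names and values are assumptions.

interface GazePoint {
  x: number; // normalized screen coordinates in [0, 1]
  y: number;
}

// Opacity (0 = clear, 1 = fully occluded) of a black overlay at pixel (px, py),
// given the gaze center, the radius of preserved central vision, and the width
// of the falloff band between clear and occluded regions.
function peripheralLossOpacity(
  px: number,
  py: number,
  gaze: GazePoint,
  clearRadius = 0.15,
  falloff = 0.1
): number {
  const dx = px - gaze.x;
  const dy = py - gaze.y;
  const dist = Math.sqrt(dx * dx + dy * dy);
  if (dist <= clearRadius) return 0;            // central vision preserved
  if (dist >= clearRadius + falloff) return 1;  // periphery fully occluded
  return (dist - clearRadius) / falloff;        // smooth transition band
}

// Example: a point far from the gaze center is fully darkened.
console.log(peripheralLossOpacity(0.9, 0.9, { x: 0.5, y: 0.5 })); // 1
```

In a headset, an effect like this would typically be implemented as a post-process shader driven by head or eye tracking rather than per-pixel script code.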
Team: Ming Tang, Ryan Tinney, Alejandro Robledo, Tosha Bapat, Linda Dunseath, Matt Anthony @ Live Well Collaborative.
Screen recording of the VR prototype’s glaucoma scenarios in a virtual kitchen, used to study cooking activities.
Hand Tracking in VR.
[1] Pizzarello LD. The dimensions of the problem of eye disease among the elderly. Ophthalmology. 1987;94:1191–5.