Real-time Visualization, Virtual Reality & Augmented Reality
Explores interactive virtual reality (VR) and augmented reality (AR) systems and real-time rendering for architectural visualization, human-computer interaction, spatial behavior, and wayfinding studies.
The eXtended Reality Lab (XR-Lab) is a research lab affiliated with the College of DAAP and Digital Futures at the University of Cincinnati. XR-Lab explores how people engage with simulated scenarios through virtual reality, augmented reality, and mixed reality, and promotes communication and collaboration through XR for education, health, safety, and professional training in both academic and industry settings.
Technologies have radically altered and reconstructed the relationships between humans, computation, and the environment. As the diagram shows, we believe AI, digital twins, the metaverse, and XR are essential for fusing design and computation to connect humans. These technologies provide promising opportunities to empower us to rebuild a better world, both physically and digitally.
We look forward to sharing our expertise and lab resources, incubating new ideas, and cultivating creativity and the digital culture of UC.
Design for Future Service, Metaverse, and Robotics
Thanks to all students from SOD and SAID who participated in the projects. We also thank Kroger and UC Digital Futures for their support.
Redesign retail and other service experiences (retail space, in-person service, digital service, technology) and rethink customer and business needs. Envision the future service experience of retail, hospitality, and delivery as 1) physical, 2) digital, and 3) a combination of both. Research questions include: What does the “future retail store” look like? How do online shopping and local-store pickup change the architecture of a store? How can the metaverse and immersive experiences be integrated with retail? Can public and recreational programs be introduced into stores to enhance the customer experience? How can robots assist employees and businesses in providing better service to customers? When robots coexist with people, how can we make robots more understandable and usable?
Collaborative design work from the College of DAAP.
This four-credit workshop course includes both a seminar and project development format, and develops techniques for digital modeling as they influence the process of viewing, visualizing, and forming spatial and formal relationships. The course encourages operational connections among different techniques for inquiry and visualization as a critical methodology in the design process.
Students: Ryan Adams, Varun Bhimanpally, Ryan Carlson, Brianna Castner, Matthew Davis, John Duke, Prajakta Sanjay Gangapurkar, Jordan Gantz, Alissa Gonda, Justin Hamilton, Emma Hill, Anneke Hoskins, Philip Hummel, Jahnavi Joshi, Patrick Leesman, Tommy Lindenschmidt, Peter Loayza, Jacob Mackin, Jordan Major, Julio Martinez, Jacob McGowan, Simon Needham, Hilda Rivera, Juvita Sajan, Gavin Sharp, Hannah Webster, Megan Welch, Meghana Yelagandula, Isaiah Zuercher.
The Mobile Robotics Studio envisions how robotic technology can better assist people in service environments, benefiting both customers and employees. Based on human-centered design and systems-thinking approaches, cross-disciplinary teams of industrial, communication, and fashion design students created design solutions for product-service systems of mobile robots in retail, air travel, sports, delivery, and emergency services.
Students: Noon Akathapon, Kai Bettermann, Cy Burkhart, Jian Cui, Joe Curtsinger, Emmy Gentile, Bradley Hickman, Quinn Matava, Colin McGrail, James McKenzie, Alex Mueller, John Pappalardo, Kurtis Rogers, Connor Rusnak, Jimmy Tran, Franklin Vallant, Leo von Boetticher
Immersive vs. Traditional Training – a comparison of training modalities
PIs: Tamara Lorenz, Ming Tang
Dr. Tamara Lorenz. Associate Professor. Embodied Interactive Systems Lab, Industry 4.0 & 5.0 Institute (I45I), Center for Cognition, Action, and Perception (CAP)
Ming Tang. Professor. Extended Reality Lab, Industry 4.0 & 5.0 Institute (I45I), Institute for Research in Sensing (IRiS)
Consortium Research Project: evaluate the effectiveness of an immersive training protocol against different traditional training modalities.
Is immersive training equally as effective or better than traditional training?
Is immersive training beneficial for specific types of training (skill, behavior), while other modalities are better for other types (e.g. knowledge acquisition)?
Does the benefit of immersive VR training warrant the initial investment in equipment and the subsequent investment in building, running, and maintaining projects?
Proposal
Evaluation of the effectiveness of an immersive training protocol against different traditional training modalities.
Evaluation of modality-dependent benefits for different learning goals.
Derivation of assessment metrics for VR training against other training modalities.
Training scenario: DAAP Fire Evacuation
Traditional training with slides and maps.
VR training with an immersive and interactive experience.
Thanks to the Institute’s Industrial Advisory Board (IAB) and industry patrons, including Siemens, Kinetic Vision, John Deere, Stress Engineering Services, Innovative Numerics, and Ethicon.
Virtual Reality for Employee Safety Training. Phase I. Sponsored research by the Cincinnati Children’s Hospital Medical Center. PI: Ming Tang. $16,631. Period: 6.2022-09.2022.
Virtual Reality for Employee Safety Training. Therapeutic Crisis Intervention Simulation, Phase II. Sponsored research by the Cincinnati Children’s Hospital Medical Center. PI: Ming Tang. $22,365. Period: 2.2023-12.2023.
Under the leadership of Ming Tang, the XR-Lab is collaborating with the Cincinnati Children’s Hospital Medical Center (CCHMC) to develop a VR-based simulation to enhance employee safety training. This initiative involves creating a virtual hospital environment with AI-controlled characters to facilitate research on diverse scenarios encountered during therapeutic crisis interventions. A vital feature of this simulation is the VR dialogue between a staff member and a teenage patient exhibiting aggressive behavior and mental illness. The primary objective is to equip staff members with the necessary skills to de-escalate tense situations effectively and adhere to appropriate protocols, thereby ensuring a safer and more controlled environment for staff and patients.
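The branching, de-escalation-focused dialogue described above can be illustrated outside any game engine with a minimal state-machine sketch in Python. The node names, patient lines, response options, and agitation scoring below are invented for illustration only; they are not CCHMC's actual protocol content or the lab's implementation:

```python
# Illustrative sketch: a branching dialogue graph in which each staff
# response moves the virtual patient to a new state and raises or lowers
# an "agitation" level. All content here is hypothetical.
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    """One utterance by the virtual patient, plus the staff's options."""
    patient_line: str
    # Maps a staff response to (next node key, change in agitation).
    responses: dict = field(default_factory=dict)

NODES = {
    "start": DialogueNode(
        "Leave me alone! I don't want to talk to you.",
        {"acknowledge": ("calmer", -1), "command": ("escalated", +2)},
    ),
    "calmer": DialogueNode(
        "...I'm just really upset right now.",
        {"listen": ("resolved", -1)},
    ),
    "escalated": DialogueNode("I said GET OUT!", {}),   # terminal state
    "resolved": DialogueNode("Okay. Maybe we can talk.", {}),  # terminal state
}

def step(node_key, choice, agitation):
    """Advance the dialogue; returns (next node key, new agitation)."""
    next_key, delta = NODES[node_key].responses[choice]
    return next_key, agitation + delta

# One de-escalating playthrough of the hypothetical scenario.
key, agitation = "start", 5
key, agitation = step(key, "acknowledge", agitation)
key, agitation = step(key, "listen", agitation)
# key == "resolved", agitation == 3
```

In a real training build, the terminal state and final agitation level would feed the trainee's debrief; the graph structure itself is what lets researchers vary scenarios without changing code.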
Team:
Ming Tang, Nancy Daraiseh, Maurizio Macaluso, Krista Keehn, Harley Davis, Aaron Vaughn, Katheryn Haller, Joseph Staneck, Emily Oehler
Employee Safety Learning Lab, CCHMC
Extended Reality (XR) Lab, UC
Field of research: Virtual Reality, Safety Training, Therapeutic Crisis Intervention, Mental Health, Human Behavior Simulation
Our research team at the Live Well Collaborative created a Visual Impairment Simulation VR prototype in 2021 to simulate glaucoma vision and peripheral vision loss. Glaucoma comprises a group of disorders involving glaucomatous optic nerve damage and visual field loss. It is a significant cause of blindness in the United States and the most common cause of blindness among Black Americans. An estimated 1 million Americans over 65 years of age have experienced vision loss associated with glaucoma, and approximately 75 percent of persons who are legally blind because of glaucoma are over the age of 65. [1]
The prototype presents a virtual kitchen scenario that allows users to experience, in an immersive environment, the challenges faced by a person with visual impairment. Hand tracking on the Oculus Quest 2 was used to create interactions with virtual objects.
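The core visual effect, loss of the peripheral field, can be demonstrated outside VR by masking a rendered frame with a radial falloff. A minimal NumPy sketch follows; the preserved-field radius and falloff width are illustrative assumptions, not the prototype's actual shader parameters:

```python
import numpy as np

def glaucoma_mask(height, width, visual_radius=0.35, falloff=0.15):
    """Radial mask: 1.0 at the assumed gaze center, fading to 0.0 peripherally.

    visual_radius and falloff are normalized, hypothetical values chosen
    for illustration of tunnel vision, not clinical parameters.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    cy, cx = height / 2, width / 2
    # Normalized distance from the image center (the assumed gaze point).
    r = np.sqrt(((ys - cy) / height) ** 2 + ((xs - cx) / width) ** 2)
    # Linear falloff beyond the preserved central visual field.
    return np.clip(1.0 - (r - visual_radius) / falloff, 0.0, 1.0)

def apply_tunnel_vision(frame, mask):
    """Darken the periphery of an RGB frame of shape (H, W, 3)."""
    return (frame * mask[..., None]).astype(frame.dtype)

frame = np.full((100, 100, 3), 255, dtype=np.uint8)  # plain white test frame
mask = glaucoma_mask(100, 100)
out = apply_tunnel_vision(frame, mask)
# Center pixels stay bright; corners fall fully dark.
```

In a real-time engine the same idea would run per-frame in a post-processing shader, with the mask center following the headset's gaze rather than the image center.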
Team: Ming Tang, Ryan Tinney, Alejandro Robledo, Tosha Bapat, Linda Dunseath, Matt Anthony @ Live Well Collaborative.
Screen recording of VR prototype glaucoma scenarios in a virtual kitchen to study cooking activities.
Hand Tracking in VR.
[1] Pizzarello LD. The dimensions of the problem of eye disease among the elderly. Ophthalmology. 1987; 94:1191–5.