O4A conference & demo
We presented “Leverage technology to enhance the future of aging” and demonstrated the EVRTALK project at the O4A 2022 conference in Columbus, Ohio, on 10/20. 


Grant:
Under the leadership of Ming Tang, the XR-Lab is collaborating with the Cincinnati Children’s Hospital Medical Center (CCHMC) to develop a VR-based simulation to enhance employee safety training. This initiative involves creating a virtual hospital environment with interactive characters to facilitate research on diverse scenarios encountered during therapeutic crisis interventions. A key feature of this simulation is a VR dialogue between a staff member and a teenage patient with mental illness who exhibits aggressive behavior. The primary objective is to equip staff members with the skills to de-escalate tense situations effectively and adhere to appropriate protocols, thereby ensuring a safer and more controlled environment for staff and patients.
Team:
Field of research: Virtual Reality, Safety Training, Therapeutic Crisis Intervention, Mental Health, Human Behavior Simulation

Screenshots from the Quest 2 mobile VR headset.



Our research team at the Live Well Collaborative created a Visual Impairment Simulation VR prototype in 2021 to simulate glaucoma vision and peripheral vision loss. Glaucoma comprises a group of disorders characterized by optic nerve damage and visual field loss. It is a significant cause of blindness in the United States and is the most common cause of blindness among Black Americans. An estimated 1 million Americans over 65 years of age have experienced the loss of vision associated with glaucoma, and approximately 75 percent of persons who are legally blind because of glaucoma are over the age of 65.[1]
In the prototype, a virtual kitchen scenario allows users to experience the challenges faced by a visually impaired person in an immersive environment. Hand-tracking technology on the Oculus Quest 2 was used to create interactions with virtual objects.
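The core of a peripheral-vision-loss effect like the one described above is a radial mask that keeps the center of the view clear and darkens everything outside it. The project's actual implementation runs as a real-time shader in the headset; the NumPy sketch below is only an illustration of the masking idea, and the function names and falloff parameters are assumptions, not the project's code.

```python
import numpy as np

def tunnel_vision_mask(height, width, clear_radius=0.3, falloff=0.25):
    """Per-pixel brightness mask: 1.0 near the image center, fading
    to 0.0 in the periphery, approximating advanced-glaucoma tunnel
    vision. `clear_radius` and `falloff` are in normalized units."""
    ys, xs = np.mgrid[0:height, 0:width]
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    # Normalized radial distance from the image center.
    r = np.hypot((ys - cy) / height, (xs - cx) / width) * 2.0
    # Fully clear inside clear_radius, linear fade over `falloff`.
    return np.clip(1.0 - (r - clear_radius) / falloff, 0.0, 1.0)

def apply_glaucoma_filter(image, clear_radius=0.3, falloff=0.25):
    """Darken the periphery of an H x W x 3 float image in [0, 1]."""
    mask = tunnel_vision_mask(image.shape[0], image.shape[1],
                              clear_radius, falloff)
    return image * mask[..., None]
```

In a real headset pipeline the same falloff would be evaluated per-fragment against the gaze or lens center rather than the image center, but the geometry of the mask is the same.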
Team: Ming Tang, Ryan Tinney, Alejandro Robledo, Tosha Bapat, Linda Dunseath, Matt Anthony @ Live Well Collaborative.

Screen recording of VR prototype glaucoma scenarios in a virtual kitchen to study cooking activities.

Hand Tracking in VR.
[1] Pizzarello LD. The dimensions of the problem of eye disease among the elderly. Ophthalmology. 1987;94:1191–5.
As the director of the Extended Reality lab (XR-Lab), I am thrilled to report that our XR-Lab has moved into the new Digital Futures Building at UC.
Our lab will continue to broaden the scope of its collaborations, drawing on our academic and professional expertise in Virtual Reality, Augmented Reality, and Mixed Reality. We look forward to continuing our long-standing collaborative relationships with faculty at UC Digital Futures, the Criminology and Justice program at CECH, the Civil Engineering and Transportation programs at CEAS, the Live Well Collaborative, the Council on Aging, Cincinnati Insurance Company, and Cincinnati Children’s Hospital Medical Center.
Please visit our lab after August 2022 to check out the new lab space and facilities at the UC Digital Futures Building.

Location:
Rooms: 200, Smart Connected Wing, and 207, VR/AR Center, Digital Futures Building
3044 Reading Road, Cincinnati, OH 45206

Contact:
Ming Tang, tangmg@ucmail.uc.edu
Director of Extended Reality Lab, University of Cincinnati.
Instructor: Ming Tang, Director, Extended Reality Lab (XR-Lab); Associate Professor, SAID, DAAP, University of Cincinnati
This seminar course focuses on the intersection of architectural design and interior design with immersive visualization technologies, including Virtual Reality, Augmented Reality, Digital Twin, social VR, and real-time simulation. The class will explore new spatial experiences in the virtual realm and analyze human perception through hand tracking, body tracking, haptic simulation, and various sensory inputs. Students will learn both the theoretical framework and hands-on skills in XR development. The course will give students exposure to Oculus Quest, Teslasuit, and HoloLens technologies, as well as wearable sensors. Students are encouraged to propose individual or group research on the subject of future design with XR.
Hardware: Oculus Quest and HoloLens headsets were provided by the course.

Student Research Project
Digital Twin

AR for community engagement. Price Hill

References:
Recommended podcast on Metaverse
