Virtual Reality for caregiver training

Assess the effectiveness of using Virtual Reality for caregiver training

Urban Health Pathway Seed Grant. PI: Ming Tang. Partners: Council on Aging, LiveWell Collaborative. $19,844. 03.2021-03.2022.

Result: COA EVRTalk 

EVRTalk virtual reality caregiver training

 

This project investigates the effectiveness of using Virtual Reality to build empathy for the care recipient by allowing the caregiver to experience day-to-day life from the care recipient’s perspective. Ming Tang led a research team working with COA and the LiveWell Collaborative to develop and evaluate an expandable set of VR training modules designed to help train family and friends who are thrust into the caregiving role. Tang led the LWC team and designed the simulated decision trees, scenarios, and hand-tracking interactions in an immersive VR environment.

Team members: Ming Tang, Matt Anthony, Craig Vogel, Linda Dunseath, Alejandro Robledo, Tosha Bapat, Karly Camerer, Jay Heyne, Harper Lamb, Jordan Owens, Ruby Qiu, Matthew Spoleti, Lauren Southwood, Ryan Tinney, Keeton Yost, Dongrui Zhu

 

COA was awarded $25,000 from the CTA Foundation Grant in 2021.

Featured in UC News.

Paper published in IJAEC:

Ming Tang (2021). “Visual Perception: Eye-tracking and Real-time Walkthroughs in Architectural Design.” International Journal of Architecture, Engineering and Construction, 10(1), 1-9.

Visual Perception: Eye-tracking and Real-time Walkthroughs in Architectural Design

This paper discusses the application of Eye Tracking (ET) technologies as a new way for researchers to understand a person’s perception of a built environment regarding wayfinding and other spatial features. This method is useful for informing reviewers how an existing place or a proposed design performs in terms of user experience. Combining ET with a real-time walkthrough (RTW) and an analytical platform allows designers to make real-time changes and instantly see how these choices affect a user’s visual attention and interaction. The paper also presents a study of architectural features that uses simulated human behavioral cues and movement information as input parameters. The research is framed as a hybrid method that pursues an augmented architectural experience and wayfinding, and analyzes their performance using ET and RTW. While presenting their concepts through RTW, students used the Tobii Pro eye tracker and its analytical software to investigate the attractiveness of the proposed experience in relation to five spatial features: face, edge, intensity, blue-yellow contrast, and red-green contrast. The studio projects extended the study of psychological architecture by exploring, collecting, analyzing, and visualizing behavioral data, and by using the ET analysis to optimize designs presented through walking and driving simulations. ET allowed students in the transit hub design studio to investigate design iterations with respect to human perception in order to enhance spatial organization and navigation.
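The intensity, blue-yellow, and red-green contrast features named above resemble the color-opponency channels used in classic visual-saliency models. A minimal NumPy sketch of such channels (an illustrative approximation, not the Tobii Pro analysis pipeline):

```python
import numpy as np

def opponency_channels(rgb):
    """Compute intensity, red-green, and blue-yellow opponency maps
    for an RGB image array of shape (H, W, 3) with values in [0, 1].
    A simplified take on classic saliency-model color channels."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    intensity = (r + g + b) / 3.0          # average brightness
    red_green = r - g                      # red-green opponency
    blue_yellow = b - (r + g) / 2.0        # blue vs. yellow (mean of r, g)
    return intensity, red_green, blue_yellow

# toy example: a pure-red 2x2 image
img = np.zeros((2, 2, 3))
img[..., 0] = 1.0
i_map, rg_map, by_map = opponency_channels(img)
```

Regions where these maps change sharply tend to attract gaze, which is why such features are useful proxies when interpreting eye-tracking heatmaps.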

Authors: Ming Tang (University of Cincinnati).
Issue: Vol 10, No 1 (2021)
Pages: 1-9
Section: Research Paper
DOI: http://dx.doi.org/10.7492/IJAEC.2021.001

This research project was conducted in fall 2018 at the Urban Mobility Studio, supported by the UC Forward program at the University of Cincinnati. The studio reflections and proposals were provided by the graduate students: Alan Bossman, Shreya Jasrapuria, Grant Koniski, Jianna Lee, Josiah Ebert, Taylour Upton, Kevin Xu, Yining Fang, Ganesh Raman, Nicole Szparagowski, and Niloufar Kioumarsi. The thesis research was conducted by Lorrin Kline.

 

VR for Police Training

Active Shooter Simulation

Develop several fully immersive 3D VR active shooter scenarios that can run on cost-effective commercially available VR hardware.

Final Report for OCJS Project

Develop and Assess Active Shooter Virtual Reality Training for Ohio Law Enforcement. PI: J.C. Barnes. Co-PI: Ming Tang. Office of Criminal Justice Services. $50,000. 09.2020-09.2021 ($29,608).

Development of a Virtual Reality Augmented Violence Reduction Training System for Active and Mass Shooting incidents. PI: Ed Latessa. Co-PIs: J.C. Barnes, Ming Tang, Cory Haberman, Dan Gerard, Tim Sabransky. $10,000. Start-up fund. UC Digital Futures anchor tenant cohort.

A Shimmer GSR (galvanic skin response) sensor was used to measure physiological stress.
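GSR-based stress measures often count skin-conductance responses (SCRs): sudden rises in conductance above a small amplitude threshold. A minimal sketch of such peak counting (the threshold and sampling details are illustrative, not the project's actual protocol):

```python
def count_scr_peaks(gsr, threshold=0.05):
    """Count skin-conductance responses in a conductance trace (microsiemens).
    A sample counts as an SCR if it is a local maximum rising at least
    `threshold` above the lowest point seen since the previous peak.
    A simple proxy for physiological arousal; values are illustrative."""
    peaks = 0
    trough = gsr[0]  # running minimum since the last counted peak
    for prev, cur, nxt in zip(gsr, gsr[1:], gsr[2:]):
        trough = min(trough, cur)
        if prev < cur > nxt and cur - trough >= threshold:
            peaks += 1
            trough = cur  # reset the trough after counting a response
    return peaks

# toy trace with two clear rises
trace = [0.0, 0.1, 0.0, 0.2, 0.0]
n = count_scr_peaks(trace)  # two responses
```

In practice, sensor vendors and psychophysiology toolkits apply filtering and per-subject baselining before peak detection; this sketch only shows the core idea.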

Checklist

Using Checklists and Virtual Reality to Improve Police Investigations. Collaborative Research Advancement Grants. UC. $25,000. PI: Haberman. Co-PI: Tang, Barnes. Period: 07.2020-01.2022.

Team:

Ming Tang, Cory Haberman, J.C. Barnes, Cheryl Jonson, Dongrui Zhu, Heejin Lee, Jillian Desmond, Ruby Qiu, Snigdha Bhattiprolu, Rishyak Kommineni

Publications:

Cory P. Haberman, Ming Tang, J.C. Barnes, Clay Driscoll, Bradley J. O’Guinn, Calvin Proffit. “Using Virtual Reality Simulations to Study Initial Burglary Investigations.” American Society of Evidence-Based Policing 2023 Conference. Las Vegas, Nevada. 2023.

Design Process

To create simulated human behavior, both during the active shooting and in casual human dialogue, the team designed an AI system built on decision trees. Please watch the technique breakdown demo.
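A dialogue decision tree of this kind can be represented as nodes whose branches are keyed by the trainee's choice. A minimal sketch (the node names and lines are hypothetical, not the project's actual script):

```python
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    """One node in a branching dialogue/decision tree."""
    line: str                                  # what this node says
    choices: dict = field(default_factory=dict)  # choice label -> next node

def run(node, picks):
    """Walk the tree from `node`, following a list of choice labels,
    and return the resulting transcript of spoken lines."""
    transcript = [node.line]
    for pick in picks:
        node = node.choices[pick]
        transcript.append(node.line)
    return transcript

# hypothetical one-step exchange
greet = DialogueNode("Officer: Everyone okay in here?")
calm = DialogueNode("NPC: Yes, we're fine.")
panic = DialogueNode("NPC: He went down the hall!")
greet.choices = {"calm": calm, "panic": panic}
```

In a VR engine the same structure drives NPC animation and audio per node; the tree itself stays a plain data structure that designers can author and test outside the engine.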

Safety Vests

Grant: Assess the effectiveness of Type 2 and Type 3 safety vests for day and night use-Phase. Ohio Department of Transportation. PI: John Ash. Co-PIs: Ming Tang, Julian Wang. $337,366.31 ($191,458.16 in FY2020 and $145,908.15 in FY2021). Period: 02.2020-02.2022.

Ming Tang led the modeling team that constructed the virtual reality driving simulation and conducted eye-tracking data collection to measure drivers’ perception of the construction zone and the various vests, signage, and vehicles.

Work zones are an essential component of any state transportation agency’s construction and maintenance operations. As such, agencies apply numerous practices to keep their workers safe during construction operations. The Ohio Department of Transportation (ODOT) recently invested in several more advanced items to improve worker safety (and traveler safety, by hopefully reducing the number of crashes overall). Specifically, ODOT invested in Type 2 and 3 safety vests, halo lights, and reflectors on the back of dump trucks. In 2020, a team of researchers from the University of Cincinnati (UC) worked with the Ohio Department of Transportation to assess the effectiveness of safety vests for day and night use.

The simulation-based evaluation used measurements to create realistic retroreflective vests, lights, and other safety equipment in virtual scenarios. These items were then placed in different virtual work zone environments, each of which had different work zone setup conditions, traffic control, vests worn by workers, time of day/ambient lighting, etc. Through an eye-tracking experiment measuring participants’ gaze on workers in different virtual work zone scenarios and a driving simulator experiment in which participants drove through virtual work zones and were asked follow-up questions on worker conspicuity, subjective and objective measures of worker visibility were obtained.
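One common objective measure in such eye-tracking experiments is dwell time: how long gaze falls inside an area of interest (AOI) such as a worker's on-screen bounding box. A minimal sketch, assuming fixed-rate gaze samples and a rectangular AOI (parameter values are illustrative):

```python
def dwell_time(gaze_samples, aoi, sample_rate_hz=60):
    """Total time in seconds that gaze points fall inside a rectangular
    area of interest, e.g. a worker's bounding box in screen coordinates.
    Assumes gaze samples arrive at a fixed rate; 60 Hz is illustrative."""
    x0, y0, x1, y1 = aoi
    hits = sum(1 for x, y in gaze_samples
               if x0 <= x <= x1 and y0 <= y <= y1)
    return hits / sample_rate_hz

# toy data: three samples, two inside a 10x10 AOI at the origin
samples = [(5, 5), (50, 50), (6, 6)]
seconds = dwell_time(samples, aoi=(0, 0, 10, 10))
```

Comparing dwell time on workers across vest types and lighting conditions gives an objective complement to the subjective conspicuity questions described above.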

Virtual reality and eye tracking were used to evaluate the visibility of safety vests on highway construction sites.

To access copies of the final report, visit:  https://www.transportation.ohio.gov/programs/research-program/research-projects/02-research-projects  

This research was sponsored by the Ohio Department of Transportation and the Federal Highway Administration.

Computer Vision of Building Roof Analysis