The objective of this study is to investigate the safety of roadway workers under varying environmental and work zone conditions. To achieve this objective, a driving simulator-based experiment is proposed to evaluate drivers’ visual attention across various work zone scenarios using eye-tracking technology.
Grant.
Using Eye-Tracking to Study the Effectiveness of Visual Communication. UHP Discovery funding. University Honors Program. UC. $5,000. Faculty advisor. 2021.
Adekunle Adebisi (Ph.D. student at the College of Engineering and Applied Science) applied for and received a $3,200 Emerging Fellowship Award from the Academic Advisory Council for Signage Research and Education (AACSRE).
Eye-Tracking for Drivers’ Visual Behavior. Ming Tang, 2021-08-28.
Project “Stage” is a stand-alone rendering engine developed by Ming Tang that uses runtime asset loading as well as HDRI backdrop methods in UE4. It is a Windows application that lets users quickly load their FBX files into a virtual environment for first-person and third-person walk-throughs.
This stand-alone program allows users to load external 3D models instantly at runtime. The Stage environment includes a UI for loading FBX models during runtime, several HDRI lighting domes, and interactive characters. Stage promotes iterative design and encourages designers to explore creative potential through real-time feedback. The Bus Stop and Burning Man projects were both developed using Stage as a pre-visualization tool.
Stage allows students to: (1) present their 3D model to reviewers without waiting for renderings; no packaging time is required, and the design is ready instantly in a game environment; (2) control an avatar to navigate the space and explore the form through first-person or third-person views; (3) learn the importance of optimizing a model by loading design iterations in Stage and examining the frame rate and loading time; (4) test UV mapping and scene hierarchy; (5) test low-poly collision objects and the navigation mesh; and (6) have fun. There is a hidden Easter egg to be discovered.
Export the model from Rhino or 3ds Max as FBX. Create collision objects with the “UCX_” prefix. Use standard materials. Import into Stage and customize the materials. Note: you might need to clean up your mesh model in Rhino, or optimize it in 3ds Max, before exporting to FBX.
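The “UCX_” prefix follows Unreal Engine’s FBX import convention: a mesh named `UCX_<RenderMeshName>` (optionally with a numeric suffix for multiple hulls) is paired with its render mesh as a collision primitive. As a minimal sketch, a pre-export check of that naming could look like this (the helper function and mesh names are illustrative, not part of Stage):

```python
# Illustrative pre-export check for the "UCX_" collision naming convention.
# Pairs each collision mesh with its render mesh; reports orphans that
# Unreal would not match on import. Names here are hypothetical examples.
def pair_collision_meshes(mesh_names):
    renders = {n for n in mesh_names if not n.startswith("UCX_")}
    pairs, orphans = {}, []
    for name in mesh_names:
        if not name.startswith("UCX_"):
            continue
        base = name[len("UCX_"):]
        if base not in renders:
            # Allow numbered hulls such as UCX_Chair_01 -> Chair.
            stem, _, suffix = base.rpartition("_")
            if suffix.isdigit() and stem in renders:
                base = stem
            else:
                orphans.append(name)
                continue
        pairs.setdefault(base, []).append(name)
    return pairs, orphans

pairs, orphans = pair_collision_meshes(
    ["Chair", "UCX_Chair", "UCX_Chair_01", "UCX_Table"]
)
print(pairs)    # {'Chair': ['UCX_Chair', 'UCX_Chair_01']}
print(orphans)  # ['UCX_Table']  (no render mesh named "Table" exists)
```

Running a check like this before export catches collision meshes that would silently import without a parent.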
Tutorial 2: How the application was built in Unreal.
This project aims to investigate the effectiveness of using Virtual Reality to build empathy for the care recipient by allowing the caregiver to experience day-to-day life from the care recipient’s perspective. Ming Tang leads a research team working with COA and the LiveWell Collaborative to develop and evaluate an expandable set of VR training modules designed to help train family and friends who are thrust into the caregiving role. Tang led the LWC team and designed the simulated decision trees, scenarios, and hand-tracking technologies in an immersive VR environment.
Team members: Ming Tang, Matt Anthony, Craig Vogel, Linda Dunseath, Alejandro Robledo, Tosha Bapat, Karly Camerer, Jay Heyne, Harper Lamb, Jordan Owens, Ruby Qji, Matthew Spoleti, Lauren Southwood, Ryan Tinney, Keeton Yost, Dongrui Zhu
Virtual Reality for Caregiver Training. Ming Tang, 2021-03-27.
Develop and Assess Active Shooter Virtual Reality Training for Ohio Law Enforcement. PI: J.C. Barnes. Co-PI: Tang. Office of Criminal Justice Services. $50,000. 09.2020–09.2021 ($29,608).
Development of a Virtual Reality Augmented Violence Reduction Training System for Active and Mass Shooting incidents. PI: Ed Latessa. Co-PIs: J.C. Barnes, Ming Tang, Cory Haberman, Dan Gerard, Tim Sabransky. $10,000. Start-up fund. UC Digital Futures anchor tenant cohort.
A Shimmer GSR (galvanic skin response) sensor is used to measure physiological stress.
Checklist
Using Checklists and Virtual Reality to Improve Police Investigations. Collaborative Research Advancement Grants. UC. $25,000. PI: Haberman. Co-PI: Tang, Barnes. Period: 07.2020-01.2022.
To create simulated human behavior, whether during an active shooting or casual human dialogue, the team designed an AI system built around decision trees. Please watch the technical breakdown demo.
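The decision-tree idea can be sketched as a set of states in which each node maps an observed trainee action to the next node, with each node carrying the simulated character’s behavior. The sketch below is illustrative only; the node names and actions are invented, and this is not the team’s actual UE4 implementation:

```python
# Minimal sketch of a decision tree for a simulated character (NPC).
# Each node stores the NPC's behavior and a table of transitions keyed
# by the trainee's observed action. All names here are hypothetical.
class Node:
    def __init__(self, response, transitions=None):
        self.response = response            # NPC behavior in this state
        self.transitions = transitions or {}

    def next(self, action):
        # Unrecognized actions leave the NPC in its current state.
        return self.transitions.get(action, self)

# A tiny behavior tree for a bystander character.
calm = Node("answers questions")
nervous = Node("gives evasive answers")
flee = Node("runs for the exit")
idle = Node("stands idle", {"greet": calm, "shout": nervous})
nervous.transitions = {"calm_down": calm, "draw_weapon": flee}
calm.transitions = {"draw_weapon": flee}

# Walk the tree with a sequence of trainee actions.
state = idle
for action in ["shout", "calm_down", "draw_weapon"]:
    state = state.next(action)
print(state.response)  # runs for the exit
```

Branching dialogue can be modeled the same way, with each node holding a line of dialogue and transitions keyed by the trainee’s replies.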
VR for Police Training. Ming Tang, 2020-08-03.