This is a collective project in which 35 students from the VIZ-III course built a shared metaverse. Bus stops were designed in Rhino, 3ds Max, Revit, or SketchUp and brought into Unreal. Please download the demo zip file (~800 MB) here.
Develop and Assess Active Shooter Virtual Reality Training for Ohio Law Enforcement. PI: J.C. Barnes. Co-PI: Tang. Office of Criminal Justice Services. $50,000. Period: 09.2020-09.2021 ($29,608).
Development of a Virtual Reality Augmented Violence Reduction Training System for Active and Mass Shooting incidents. PI: Ed Latessa. Co-PIs: J.C. Barnes, Ming Tang, Cory Haberman, Dan Gerard, Tim Sabransky. $10,000. Start-up fund. UC Digital Futures anchor tenant cohort.
A Shimmer GSR sensor is used to measure physiological stress.
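As a rough illustration only, the sketch below shows one way such GSR data could be summarized into a simple stress proxy (skin-conductance responses per minute). The CSV export format, column names, sample rate, and threshold are assumptions for illustration, not the project's actual pipeline.

```python
# Minimal sketch: estimate a simple stress proxy from exported Shimmer GSR data.
# The CSV columns "timestamp_s" and "gsr_microsiemens" are hypothetical names.
import numpy as np
import pandas as pd
from scipy.signal import find_peaks

def count_scr_peaks(csv_path, min_rise_us=0.05, min_separation_s=1.0, sample_rate_hz=128):
    """Count skin-conductance responses (SCRs) as a rough stress indicator."""
    data = pd.read_csv(csv_path)
    gsr = data["gsr_microsiemens"].to_numpy()
    # Remove slow (tonic) drift with a moving-average baseline, keeping the phasic part.
    window = int(sample_rate_hz * 4)  # ~4 s baseline window
    baseline = np.convolve(gsr, np.ones(window) / window, mode="same")
    phasic = gsr - baseline
    # Peaks rising at least `min_rise_us` microsiemens, spaced >= min_separation_s apart.
    peaks, _ = find_peaks(phasic,
                          height=min_rise_us,
                          distance=int(sample_rate_hz * min_separation_s))
    duration_min = (data["timestamp_s"].iloc[-1] - data["timestamp_s"].iloc[0]) / 60.0
    return len(peaks), len(peaks) / duration_min  # total SCRs and SCRs per minute

if __name__ == "__main__":
    total, per_minute = count_scr_peaks("session_gsr.csv")
    print(f"{total} SCRs detected ({per_minute:.1f} per minute)")
```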
Checklist
Using Checklists and Virtual Reality to Improve Police Investigations. Collaborative Research Advancement Grants. UC. $25,000. PI: Haberman. Co-PI: Tang, Barnes. Period: 07.2020-01.2022.
To create simulated human behavior, both during an active shooting and in casual dialogue, the team designed an AI system built on decision trees. Please watch the technique-breakdown demo.
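The system itself is built in Unreal; purely as an illustration of the decision-tree idea, the Python sketch below hand-codes a tiny behavior tree for a simulated civilian. The node structure, stimuli, and action names are hypothetical, not the project's actual logic.

```python
# Illustrative sketch of a behavior decision tree for a simulated agent.
# Node names, stimuli, and actions are hypothetical; the actual system runs in Unreal.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Node:
    """A decision node tests the world state; a leaf node returns an action."""
    test: Optional[Callable[[dict], bool]] = None
    if_true: Optional["Node"] = None
    if_false: Optional["Node"] = None
    action: Optional[str] = None

    def decide(self, state: dict) -> str:
        if self.action is not None:          # leaf node
            return self.action
        branch = self.if_true if self.test(state) else self.if_false
        return branch.decide(state)

# Civilians flee or hide when shots are heard; otherwise they wander or chat.
civilian_tree = Node(
    test=lambda s: s["shots_heard"],
    if_true=Node(
        test=lambda s: s["exit_visible"],
        if_true=Node(action="flee_to_exit"),
        if_false=Node(action="hide_under_cover"),
    ),
    if_false=Node(
        test=lambda s: s["player_nearby"],
        if_true=Node(action="start_casual_dialogue"),
        if_false=Node(action="wander"),
    ),
)

print(civilian_tree.decide({"shots_heard": True, "exit_visible": False, "player_nearby": False}))
# -> hide_under_cover
```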
Grant: Assess the effectiveness of Type 2 and Type 3 safety vests for day and night use-Phase. Ohio Department of Transportation. PI: John Ash. Co-PIs: Ming Tang, Julian Wang. $337,366.31 ($191,458.16 in FY2020 and $145,908.15 in FY2021). Period: 02.2020-02.2022.
Ming Tang led the modeling team that constructed the virtual reality driving simulation and conducted eye-tracking data collection to measure drivers' perception of the construction zone and of various vests, signage, and vehicles.
Work zones are an essential component of any state transportation agency's construction and maintenance operations. As such, agencies apply numerous practices to keep their workers safe during construction operations. The Ohio Department of Transportation (ODOT) recently invested in several advanced items to improve worker safety (and traveler safety, by hopefully reducing the number of crashes overall). Specifically, ODOT invested in Type 2 and 3 safety vests, halo lights, and reflectors on the back of dump trucks. In 2020, a team of researchers from the University of Cincinnati (UC) worked with ODOT to assess the effectiveness of safety vests for day and night use.
The simulation-based evaluation used measurements to create realistic retroreflective vests, lights, and other safety equipment in virtual scenarios. These items were then placed in different virtual work zone environments, each of which had different work zone setup conditions, traffic control, vests worn by workers, time of day/ambient lighting, etc. Subjective and objective measures of worker visibility were obtained from two experiments: an eye-tracking experiment measuring participants' gaze on workers in different virtual work zone scenarios, and a driving simulator experiment in which participants drove through virtual work zones and answered follow-up questions on worker conspicuity.
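As a rough sketch of how such gaze data can be reduced to an objective conspicuity measure, the example below sums dwell time on labeled areas of interest (AOIs), such as workers wearing different vest types. The file format, column names, and AOI labels are assumptions for illustration, not the study's actual processing code.

```python
# Minimal sketch: total dwell time on worker areas of interest (AOIs) from
# exported gaze samples. Column names and AOI labels are hypothetical.
import pandas as pd

def dwell_time_per_aoi(gaze_csv, sample_rate_hz=60):
    """Sum the seconds the driver's gaze landed on each labeled AOI."""
    gaze = pd.read_csv(gaze_csv)               # assumed columns: time_s, aoi_label
    sample_duration = 1.0 / sample_rate_hz     # each sample covers this many seconds
    on_aoi = gaze[gaze["aoi_label"].notna()]   # keep only samples that hit an AOI
    dwell = on_aoi.groupby("aoi_label").size() * sample_duration
    return dwell.sort_values(ascending=False)

if __name__ == "__main__":
    # Prints seconds of gaze per AOI (e.g., workers in Type 2 vs. Type 3 vests) for one scenario.
    print(dwell_time_per_aoi("night_scenario_gaze.csv"))
```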
Using virtual reality and eye tracking to evaluate the safety of vests on highway construction sites.
ARCH 4002. Spring 2020
SAID DAAP, University of Cincinnati
Using a 10-mile rural stretch along I-90 in Lorain County, Ohio, as the site, this Rural Mobile Living studio presents a study of rural mobility with an emphasis on architecture as infrastructure and its connection to the means of transportation. Working closely with the Vehicle Design Studio in the School of Design, the research aimed to bring the potential of self-driving cars, smart technology, artificial intelligence, and machine learning into the architectural design process and to address problems such as poverty, lack of transportation, and underdeveloped infrastructure. Ultimately, the studio looks to build upon the strengths of both vehicle design and architectural methods and to explore possible design solutions for five rural scenarios: shared living, working homeless, digital nomad, disaster relief, and tourism/recreation.
Faculty: Ming Tang
Students: Nick Chism, Maddie Cooke, Amy Cui, Noah Nicolette, Travis Rebsch, Vu Tran Huy Phi, Kristian Van Wiel, David Wade, Jamie Waugaman, Adam Baca. SAID, DAAP.
A. Add Ming Tang as your friend. Friend code: "301687106".
B. Join a multiplayer session.
Make sure you use "Internet", not "LAN". Single-click the found session; do not double-click.
You should be able to use “Shift + Tab” to turn on the Steam overlay. Ask questions in the Steam chat room.
Choose the “Find games” option. Once you find an open session, double click the name to join the game.
Signs, in all their forms and manifestations, provide visual communication for wayfinding, commerce, and public dialogue and expression. Yet, how effectively a sign communicates and ultimately elicits a desired reaction begins with how well it attracts the visual attention of prospective viewers. This is especially the case for complex visual environments, both outside and inside of buildings. This paper presents the results of an exploratory research design to assess the use of eye-tracking (ET) technology to explore how placement and context affect the capture of visual attention. Specifically, this research explores the use of ET hardware and software in real-world contexts to analyze how visual attention is impacted by location and proximity to geometric edges, as well as elements of contrast, intensity against context, and facial features. Researchers also used data visualization and interpretation tools in augmented reality environments to anticipate human responses to alternative placement and design. Results show that ET methods, supported by the screen-based and wearable eye-tracking technologies, can provide results that are consistent with previous research of signage performance using static images in terms of cognitive load and legibility, and ET technologies offer an advanced dynamic tool for the design and placement of signage.
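As an illustration of one such ET-derived metric, the sketch below computes time to first fixation (TTFF) on a sign's area of interest, a common proxy for how quickly a sign captures visual attention. The fixation export format, column names, and AOI label are assumptions for illustration, not the study's actual analysis code.

```python
# Minimal sketch: time to first fixation (TTFF) on a sign's area of interest.
# Fixation file format and column names are hypothetical.
import pandas as pd

def time_to_first_fixation(fixation_csv, sign_aoi="storefront_sign"):
    """Return seconds from trial start until the first fixation inside the sign AOI."""
    fixations = pd.read_csv(fixation_csv)      # assumed columns: start_s, duration_s, aoi_label
    hits = fixations[fixations["aoi_label"] == sign_aoi].sort_values("start_s")
    if hits.empty:
        return None                            # the sign was never fixated
    return hits["start_s"].iloc[0] - fixations["start_s"].min()

if __name__ == "__main__":
    ttff = time_to_first_fixation("participant01_fixations.csv")
    print("Sign never fixated" if ttff is None else f"First fixation after {ttff:.2f} s")
```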
ACKNOWLEDGMENT
The research project is supported by the Strategic Collaborative/Interdisciplinary Award of the University of Cincinnati. Thanks also to Professor Christopher Auffrey and to the students of ARCH7014 (Fall 2019), ARCH8001 (Spring 2019), and ARCH4001 (Fall 2018) at the University of Cincinnati.
For more information on the wearable ET, screen-based ET, and VR-ET, please check out our research website, or contact Prof. Tang.