Training Simulation for ODOT Snow and Ice Drivers – Phases 1 and 2

Grant supported by the Ohio Department of Transportation.

PI: Jiaqi Ma. Co-PIs: Ming Tang, Julian Wang.

Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers – Phase 1

8/2018 – 3/2019

The purpose of this research is to conduct an in-depth analysis of ODOT’s current process for training snow and ice drivers and to provide recommendations on how to enhance the training experience for snow and ice drivers.

To accomplish this research, the scope of work should be divided into two phases. The scope of work should include, at a minimum, the activities noted below. Additional tasks may be included in the proposal by the UC team as appropriate to ensure achievement of the research objectives. ODOT’s decision to invest in Phase 2 will be based on the Phase 1 interim report and recommendations, and it will consider both ODOT’s ability to implement them and the expected cost-benefit.

The first phase of the research requires a comprehensive look at how ODOT currently trains its snow and ice drivers and a review of nationwide practices for snow and ice driver training. An analysis of past practices shall also be considered. During Phase 1, the UC team will work closely with ODOT Lorain County personnel. Recommendations will be developed by reviewing and documenting ODOT’s current and past practices, and then developing a matrix of choices and opportunities to enhance ODOT’s travel time recovery.

To understand the technologies available for training snow and ice drivers, we performed a preliminary literature search and found that the key technologies can be categorized into three types: driving simulator-based, VR-based, and AR-based. In the following section, we discuss these three types of simulation training and summarize related work previously conducted by the PIs.

“Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers”.

Funded by the Ohio Department of Transportation.

PI: Jiaqi Ma. Co-PIs: Ming Tang, Julian Wang.
Phase I: $39,249. ODOT 32391 / FHWA. 2018–2019
Phase II: $952,938. Grant #: 1014440. ODOT 32391 / FHWA. 2019–2021

SAID students: Sam Dezarn, Ganesh Raman, Dongrui Zhu, Jordan Sauer, Niloufar Kioumarsi.

Download the demo file (zip, 1 GB).

Instructions: “C” – switch camera view; “E” – get in/out of the car; “L” – turn the lights on/off; “Space bar” – brake; “W”, “A”, “S”, “D” – navigation.

Gamepad: Right Trigger to start the engine, Left Trigger to stop the engine.
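For quick reference, the control scheme above can also be summarized as a simple input map. The sketch below is a hypothetical Python dictionary mirroring the listed bindings; it is illustrative only and not the demo’s actual configuration file.

```python
# Hypothetical input map mirroring the demo controls listed above.
# Key and action names are illustrative, not the demo's actual configuration.
KEYBOARD_BINDINGS = {
    "C": "switch camera view",
    "E": "get in/out of car",
    "L": "toggle lights",
    "Space": "brake",
    "W": "drive forward",
    "A": "steer left",
    "S": "reverse",
    "D": "steer right",
}

GAMEPAD_BINDINGS = {
    "Right Trigger": "start engine",
    "Left Trigger": "stop engine",
}

def describe_controls():
    """Print a readable summary of the control scheme."""
    for device, bindings in (("Keyboard", KEYBOARD_BINDINGS),
                             ("Gamepad", GAMEPAD_BINDINGS)):
        for key, action in bindings.items():
            print(f"{device:8s} {key:14s} -> {action}")

if __name__ == "__main__":
    describe_controls()
```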

 

Magic School Bus project

An AR & VR project for a medical model: an animated heart, developed as part of the Magic School Bus project at the University of Cincinnati.

Eye-tracking analysis for train station design

Tests were conducted with a Tobii eye tracker for train station designs in Beijing. All heatmaps are here, and all gaze clusters are here. Fifteen train stations were designed by students and presented through the Unreal game engine. More information on the train station designs is available on the studio website.
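As a rough illustration of how such heatmaps can be built from raw gaze data, the sketch below accumulates normalized gaze samples into a 2D grid. It is a minimal Python example with illustrative bin sizes and synthetic data; it does not reproduce the Tobii export format or the actual analysis pipeline.

```python
import numpy as np

def gaze_heatmap(gaze_x, gaze_y, bins=(96, 54)):
    """Accumulate normalized gaze samples (0-1 range) into a 2D heatmap grid.

    gaze_x, gaze_y: arrays of normalized gaze coordinates from the eye tracker.
    Returns a (bins_y, bins_x) array of sample counts per grid cell.
    """
    x = np.clip(np.asarray(gaze_x), 0.0, 1.0) * (bins[0] - 1)
    y = np.clip(np.asarray(gaze_y), 0.0, 1.0) * (bins[1] - 1)
    heatmap = np.zeros((bins[1], bins[0]), dtype=float)
    np.add.at(heatmap, (y.astype(int), x.astype(int)), 1.0)
    return heatmap

# Example: 10,000 synthetic gaze samples clustered near the screen center.
rng = np.random.default_rng(0)
gx = rng.normal(0.5, 0.1, 10_000)
gy = rng.normal(0.5, 0.1, 10_000)
hm = gaze_heatmap(gx, gy)
print(hm.shape, hm.sum())
```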


HoloLens Installations

Two installations using Microsoft HoloLens:

Bubble

MOMO

 

 

Attention measurement of a construction sequence

Eye tracking was used to measure a viewer’s attention and gaze on a construction sequence. The construction model was made in Prof. Tang’s second-year SAID class.

 

Eye-tracking for wayfinding

A study on eye-tracking for wayfinding. The building on fire is the UC DAAP building. The colors represent fixation duration and gaze count.
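A minimal sketch of how fixation duration can be derived from raw gaze samples, using a simple dispersion-threshold (I-DT style) grouping. The thresholds, sampling rate, and data layout are illustrative assumptions, not the pipeline used in this study; gaze count for a region is then simply the number of detected fixations falling inside it.

```python
import numpy as np

def detect_fixations(t, x, y, max_dispersion=0.02, min_duration=0.1):
    """Group consecutive gaze samples into fixations (dispersion-threshold style).

    t: sample timestamps in seconds; x, y: normalized gaze coordinates (numpy arrays).
    A fixation is a run of samples whose bounding-box dispersion stays below
    max_dispersion and that lasts at least min_duration seconds.
    Returns a list of (start_time, duration, mean_x, mean_y) tuples.
    """
    fixations, start = [], 0
    for i in range(1, len(t)):
        xs, ys = x[start:i + 1], y[start:i + 1]
        if (xs.max() - xs.min()) + (ys.max() - ys.min()) > max_dispersion:
            # Window [start, i) stayed within the threshold; close it here.
            duration = t[i - 1] - t[start]
            if duration >= min_duration:
                fixations.append((t[start], duration,
                                  x[start:i].mean(), y[start:i].mean()))
            start = i
    # Flush the final window.
    duration = t[-1] - t[start]
    if duration >= min_duration:
        fixations.append((t[start], duration, x[start:].mean(), y[start:].mean()))
    return fixations

# Example: two seconds of synthetic 60 Hz gaze data dwelling on one spot.
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / 60)
x = 0.5 + rng.normal(0, 0.001, t.size)
y = 0.5 + rng.normal(0, 0.001, t.size)
for start_time, duration, mx, my in detect_fixations(t, x, y):
    print(f"fixation at ({mx:.3f}, {my:.3f}) lasting {duration:.2f} s")
```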

 


 

UI for mobile devices (MSB project)

This is a prototype UI page designed for the MSB project using mobile-device VR and AR. The idea is to develop a mobile-based AR tracking system, similar to Merge Cube or HP Reveal.
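As an illustration of the marker-detection step behind such a system, the sketch below uses OpenCV’s ArUco module in Python. This is an assumption for demonstration only; the actual MSB prototype may rely on a different tracking stack (e.g., Merge Cube’s SDK, Vuforia, or ARKit/ARCore).

```python
import cv2

# Minimal marker-detection loop (requires opencv-contrib-python >= 4.7).
# The marker dictionary and camera index are illustrative assumptions.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # webcam standing in for a phone camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        # Each detected marker ID could anchor a 3D model, as a Merge Cube does.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("AR marker tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```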

VR medical model

A VR project for a medical model: an animated heart, part of the Magic School Bus project at the University of Cincinnati.

Virtual Assistant for Boeing is shortlisted for 2018 Crystal Cabin Award

Our project has been shortlisted for a 2018 Crystal Cabin Award! Students and faculty at the University of Cincinnati and the Live Well Collaborative developed the Virtual Assistant, Boeing Onboard, in the spring of 2017. Boeing Onboard is a virtual assistant combined with a holographic interface that all passengers can access onboard planes. Through augmented reality and wearable glasses, Boeing Onboard can provide passengers with valuable information, such as safety demonstrations, in-flight entertainment, and web browsing. Boeing Onboard is an in-flight concierge service connecting passengers to all the resources and information they need for the ultimate travel experience.

 

Bubble – HoloLens Test

HoloLens test. Ming Tang, Mara Marcu. DAAP, UC.