Eye-tracking devices measure eye position and eye movement, allowing documentation of how elements of the environment draw viewers' attention. In recent years, there have been a number of significant breakthroughs in eye-tracking technology, with a focus on new hardware and software applications. Wearable eye-tracking glasses, together with advances in video capture and virtual reality (VR), provide advanced tracking capability at greatly reduced prices. Given these advances, eye trackers are becoming important research tools in the fields of visual systems, design, psychology, wayfinding, cognitive science and marketing, among others. Currently, eye-tracking technologies are not readily available at UC, perhaps because of a lack of familiarity with this new technology and how it can be used for teaching and research, or the perception of a steep learning curve for its application.
It has become clear to our UC faculty team that research and teaching will significantly benefit from utilizing these cutting-edge tools. It is also clear that a collective approach to acquiring the eye-tracking hardware and software, and to training faculty in its application, will ultimately result in greater faculty collaboration, with the consequent benefits of interdisciplinary research and teaching.
The primary goal of the proposed project is to provide interdisciplinary groups of UC faculty and students with new research and teaching tools that benefit from cutting-edge eye-tracking technologies. The project will enhance the knowledge base among faculty and allow new perspectives on how to integrate eye-tracking technology. It will promote interdisciplinarity across the broader UC communities.
- “Eye-tracking technology”. Provost Group / Interdisciplinary Award. $19,940. PI: Tang. Co-PIs: Auffrey, Hildebrandt. UC. Faculty team: Adam Kiefer, Michael A. Riley, Julian Wang, Jiaqi Ma, Juan Antonio Islas Munoz, Joori Suh. 2018
- DAAP matching grant: $11,000.
Software: Tobii Pro Lab, Tobii VR analysis
Test with Tobii Pro Glasses
Grant supported by the Ohio Department of Transportation.
PI: Jiaqi Ma. Co-PIs: Ming Tang, Julian Wang.
Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers – Phase 1
8/2018 – 3/2019
The purpose of this research is to conduct an in-depth analysis of ODOT’s current process for training snow and ice drivers and provide recommendations on how to enhance the training of those drivers.
To accomplish this research, the scope of work is divided into two phases. The scope of work should include, at a minimum, the activities noted below. Additional tasks may be included in the proposal by the UC team as appropriate to ensure achievement of the research objectives. ODOT’s decision to invest in Phase 2 will be based on the Phase 1 interim report and recommendations, and it will consider both ODOT’s ability to implement and the expected cost-benefit.
The first phase of the research requires a comprehensive look at how ODOT currently trains its snow and ice drivers and a review of nationwide practices for snow and ice driver training. Past practices will also be analyzed. During Phase 1, the UC team will work closely with ODOT Lorain County personnel. The recommendations will be developed by reviewing and documenting ODOT’s current and past practices, and then developing a matrix of choices and opportunities to enhance ODOT’s travel time recovery.
In order to understand the available technologies for training snow and ice drivers, we performed a preliminary literature search and found that the key technologies can be categorized into three types: driving simulator-based, VR-based, and AR-based. In the following section, we discuss these three simulation training types, along with related prior work conducted by the PIs.
“Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers”.
Funded by Ohio Department of Transportation
PI: Jiaqi Ma. Co-PIs: Ming Tang, Julian Wang.
Phase I: $39,249. ODOT 32391 / FHWA. 2018-2019
Phase II: $952,938. Grant #: 1014440. ODOT 32391 / FHWA. 2019-2021
SAID students: Sam Dezarn, Ganesh Raman, Dongrui Zhu, Jordan Sauer, Niloufar Kioumarsi.
Download the demo file (.zip, 1 GB)
Instructions: “C” switches camera view; “E” gets in/out of the car; “L” turns the lights on/off; spacebar brakes; “W”, “A”, “S”, “D” for navigation.
Gamepad: Right Trigger to start the engine, Left Trigger to stop the engine.
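The control scheme above can be summarized as a simple key-to-action mapping. The sketch below is purely illustrative (the demo itself handles input inside the game engine, not through this code); the specific W/A/S/D action names are assumptions beyond the generic "navigation" in the instructions.

```python
# Illustrative mapping of the demo's documented controls.
# Not the demo's actual engine code; action names for W/A/S/D are assumed.
KEYBOARD_BINDINGS = {
    "C": "switch camera view",
    "E": "get in/out of car",
    "L": "turn lights on/off",
    "SPACE": "brake",
    "W": "navigate forward",    # assumed direction
    "A": "navigate left",       # assumed direction
    "S": "navigate backward",   # assumed direction
    "D": "navigate right",      # assumed direction
}

GAMEPAD_BINDINGS = {
    "RIGHT_TRIGGER": "start engine",
    "LEFT_TRIGGER": "stop engine",
}

def describe(control: str) -> str:
    """Return the action bound to a keyboard key or gamepad control."""
    name = control.upper()
    if name in KEYBOARD_BINDINGS:
        return KEYBOARD_BINDINGS[name]
    if name in GAMEPAD_BINDINGS:
        return GAMEPAD_BINDINGS[name]
    return "unbound"
```

For example, `describe("c")` returns `"switch camera view"` and `describe("left_trigger")` returns `"stop engine"`.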
Augmented Reality for Digital Fabrication. Projects from SAID, DAAP, UC. Fall 2018.
Hololens. Fologram, Grasshopper.
Faculty: Ming Tang, RA, Associate Professor, University of Cincinnati
Students: Alexandra Cole, Morgan Heald, Andrew Pederson, Lauren Venesy, Daniel Anderi, Collin Cooper, Nicholas Dorsey, John Garrison, Gabriel Juriga, Isaac Keller, Tyler Kennedy, Nikki Klein, Brandon Kroger, Kelsey Kryspin, Laura Lenarduzzi, Shelby Leshnak, Lauren Meister, De’Sean Morris, Robert Peebles, Yiying Qiu, Jordan Sauer, Jens Slagter, Chad Summe, David Torres, Samuel Williamson, Dongrui Zhu, Todd Funkhouser.
Project team leads: Jordan Sauer, Yiying Qiu, Robert Peebles, David Torres.
Videos of work in progress