
Eye Tracking Technology

1. Wearable Eye-tracking technology


This wearable ET device includes various components, such as illuminators, cameras, and a data collection and processing unit for image detection, 3D eye modeling, and gaze mapping algorithms. Compared to a screen-based ET device, the most significant differences of the wearable device are its binocular coverage and a field of view (FOV) and head tilt that affect the glasses-mounted eye tracker. It also avoids potential experimental bias resulting from the display size or pixel dimensions of a screen. As with screen-based ET, the images captured by the wearable ET cameras are used to identify the glints on the cornea and the pupil. This information, together with a 3D eye model, is then used to estimate the gaze vector and gaze point for each participant.
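Tobii's 3D eye model and gaze-mapping algorithms are proprietary, but the basic idea of turning pupil and glint detections into gaze points can be illustrated with a much simpler, hypothetical approach: after calibration, pupil-to-glint vectors are mapped to gaze coordinates with a polynomial regression. The Python sketch below uses made-up function names and synthetic calibration values; it is not the Tobii pipeline.

```python
import numpy as np

def fit_gaze_mapping(pupil_glint_vecs, calib_points):
    """Fit a 2nd-order polynomial mapping from pupil-to-glint vectors (x, y)
    to the known calibration gaze points, via least squares."""
    x, y = pupil_glint_vecs[:, 0], pupil_glint_vecs[:, 1]
    # Polynomial feature matrix: 1, x, y, xy, x^2, y^2
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, calib_points, rcond=None)
    return coeffs  # shape (6, 2): one column per gaze coordinate

def estimate_gaze(pupil_glint_vec, coeffs):
    """Map a single pupil-to-glint vector to an estimated gaze point."""
    x, y = pupil_glint_vec
    features = np.array([1.0, x, y, x * y, x**2, y**2])
    return features @ coeffs

# Synthetic 9-point calibration grid (illustrative numbers only)
rng = np.random.default_rng(0)
calib_points = np.array([[cx, cy] for cy in (0.1, 0.5, 0.9) for cx in (0.1, 0.5, 0.9)])
pupil_glint_vecs = calib_points * 0.05 + rng.normal(0, 1e-4, calib_points.shape)

coeffs = fit_gaze_mapping(pupil_glint_vecs, calib_points)
print(estimate_gaze(pupil_glint_vecs[4], coeffs))  # close to [0.5, 0.5]
```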

After a standard ET calibration and verification procedure, participants were instructed to walk in a defined space while wearing the glasses. The time of interest (TOI) was set to 60 seconds, recording defined start and end events along with the visual occurrences over that period. Data were collected for both pre-conscious viewing (the first three seconds) and conscious viewing (after three seconds).
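As a concrete illustration of how the TOI split can be analyzed, the following sketch (Python/pandas, with hypothetical column names and made-up fixation values rather than the actual Tobii Pro Lab export schema) separates fixations into pre-conscious and conscious phases and summarizes fixation count and total duration per area of interest (AOI).

```python
import pandas as pd

# Hypothetical fixation export for one participant within a 60-second TOI;
# column names and values are illustrative, not the Tobii Pro Lab schema.
fixations = pd.DataFrame({
    "start_s":    [0.4, 1.1, 2.6, 4.0, 9.5, 31.2],     # onset relative to TOI start
    "duration_s": [0.20, 0.35, 0.30, 0.45, 0.25, 0.60],
    "aoi":        ["sign", "door", "sign", "window", "sign", "door"],
})

CUTOFF_S = 3.0  # pre-conscious viewing = first three seconds of the TOI

fixations["phase"] = fixations["start_s"].apply(
    lambda t: "pre-conscious" if t < CUTOFF_S else "conscious"
)

# Fixation count and total fixation duration per AOI, split by viewing phase
summary = (fixations
           .groupby(["phase", "aoi"])["duration_s"]
           .agg(fixation_count="count", total_duration_s="sum")
           .reset_index())
print(summary)
```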


We also conducted screen-based eye tracking and compared the results.

2. Screen-based Eye-tracking technology

This method was beneficial for informing reviewers how an existing place or a proposed design was performing in terms of user experience. Moreover, while the fundamental visual elements that attract human attention and trigger conscious viewing are well-established and sometimes incorporated into signage design and placement, signs face an additional challenge because they must compete for viewers’ visual attention in the context of the visual elements of the surrounding built and natural environments. As such, tools and methods are needed that can assist “contextually-sensitive” design and placement by assessing how signs in situ capture the attention of their intended viewers.

3. VR-Based Eye-tracking technology

Eye-tracking technology enables new forms of interactions in VR, with benefits to hardware manufacturers, software developers, end users and research professionals.

Paper:

Tang, M. "Analysis of Signage Using Eye-Tracking Technology." Interdisciplinary Journal of Signage and Wayfinding, 02.2020.

Tang, M. and Auffrey, C. "Advanced Digital Tools for Updating Overcrowded Rail Stations: Using Eye Tracking, Virtual Reality, and Crowd Simulation to Support Design Decision-Making." Urban Rail Transit, December 19, 2018.

Eye Tracking Workshop 1 at DAAP

Workshop: Eye tracking with Tobii Pro Glasses

Location: CGC 4425E, DAAP Building
Time: 11:00 am, 09.25.2019
Offered by Ming Tang, RA, LEED AP, DAAP, UC.
Sponsored by the 2018 Provost Group / Interdisciplinary Award

Web access (UC login needed): Tobii Pro Eyetracker workshop source files

online resources


Content covered:

Connect to the WLAN (password: "TobiiGlasses") and run the Tobii Pro Glasses Controller. The workshop covers the basic UI and workflow of the following software:


Video tutorial 1 (capture): http://ming3d.com/videotutorial/videonew/Tutorial1_capture.mp4

 

Video tutorial 2 (analysis): http://ming3d.com/videotutorial/videonew/Tutorial2_analysis.mp4

Reference

Tang, M. and Auffrey, C. "Advanced Digital Tools for Updating Overcrowded Rail Stations: Using Eye Tracking, Virtual Reality, and Crowd Simulation to Support Design Decision-Making." Urban Rail Transit, December 19, 2018.

An interdisciplinary approach to using eye-tracking technology for design and behavioral analysis

Eye-tracking devices measure eye position and eye movement, allowing documentation of how environmental elements draw the attention of viewers. In recent years, there have been a number of significant breakthroughs in eye-tracking technology, with a focus on new hardware and software applications. Wearable eye-tracking glasses, together with advances in video capture and virtual reality (VR), provide advanced tracking capability at greatly reduced prices. Given these advances, eye trackers are becoming important research tools in the fields of visual systems, design, psychology, wayfinding, cognitive science, and marketing, among others. Currently, eye-tracking technologies are not readily available at UC, perhaps because of a lack of familiarity with this new technology and how it can be used for teaching and research, or the perception of a steep learning curve for its application.

It has become clear to our UC faculty team that research and teaching will significantly benefit from utilizing these cutting-edge tools. It is also clear that a collective approach to acquiring the eye-tracking hardware and software, and to training faculty on its application, will ultimately result in greater faculty collaboration, with the consequent benefits for interdisciplinary research and teaching.

The primary goals of the proposed project are to provide new tools for research and teaching that benefit from cutting-edge eye-tracking technologies, involving interdisciplinary groups of UC faculty and students. The project will enhance the knowledge base among faculty and allow new perspectives on how to integrate eye-tracking technology. It will promote interdisciplinarity across the broader UC community.

Grant:

  • "Eye-tracking technology." Provost Group / Interdisciplinary Award, $19,940. PI: Tang. Co-PIs: Auffrey, Hildebrandt. UC. Faculty team: Adam Kiefer, Michael A. Riley, Julian Wang, Jiaqi Ma, Juan Antonio Islas Munoz, Joori Suh. 2018.
  • DAAP matching grant: $11,000.

Hardware: Tobii Pro Glasses, Tobii Pro VR

Software: Tobii Pro Lab, Tobii VR analysis

Test with Tobii Pro Glasses

 

Eye-tracking analysis for train station

Some tests were done with the Tobii eye tracker for train station designs in Beijing. All heatmaps are here, and all gaze clusters are here. Fifteen train stations were designed by students and presented through the Unreal game engine. More information on the train station designs is available on the studio website.
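The heatmaps and gaze clusters referenced above were produced in Tobii Pro Lab; as a rough approximation of how individual gaze samples aggregate into such a heatmap, the sketch below (Python, with synthetic gaze points and a simple Gaussian blur, not Tobii's actual algorithm) bins gaze coordinates over a screenshot and smooths the result.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

def gaze_heatmap(gaze_xy, img_w, img_h, bins=100, sigma=3):
    """Bin gaze points (x, y in pixels) into a 2D histogram and blur it,
    approximating an attention heatmap over a rendered screenshot."""
    heat, _, _ = np.histogram2d(
        gaze_xy[:, 1], gaze_xy[:, 0],              # rows = y, columns = x
        bins=bins, range=[[0, img_h], [0, img_w]]
    )
    return gaussian_filter(heat, sigma=sigma)

# Synthetic gaze samples for illustration; real data would come from the tracker export
rng = np.random.default_rng(1)
gaze = rng.normal(loc=[960, 540], scale=[200, 120], size=(2000, 2))

heat = gaze_heatmap(gaze, img_w=1920, img_h=1080)
plt.imshow(heat, extent=[0, 1920, 1080, 0], cmap="hot", alpha=0.8)
plt.title("Aggregate gaze heatmap (all participants)")
plt.show()
```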


Eye-tracking for wayfinding

A study on eye tracking for wayfinding. The building on fire is the UC DAAP building. The colors represent fixation duration and gaze count.
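As a simple sketch of how such color coding can be produced, the example below (Python/matplotlib, with hypothetical AOI names and made-up metric values) normalizes per-AOI fixation duration and maps it to RGB colors that could be used to tint overlays in a wayfinding visualization.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import colors

# Hypothetical per-AOI wayfinding metrics (illustrative values only)
aois = ["exit sign", "stairwell door", "corridor map", "elevator"]
fixation_duration_s = np.array([4.2, 1.1, 2.7, 0.4])   # total dwell time per AOI
gaze_count = np.array([18, 6, 11, 2])                  # number of fixations per AOI

def metric_to_rgb(values, cmap_name="hot"):
    """Normalize a metric to [0, 1] and map it to RGB colors,
    e.g. for coloring AOI overlays by fixation duration or gaze count."""
    norm = colors.Normalize(vmin=values.min(), vmax=values.max())
    cmap = plt.get_cmap(cmap_name)
    return cmap(norm(values))[:, :3]

for name, count, rgb in zip(aois, gaze_count, metric_to_rgb(fixation_duration_s)):
    print(f"{name}: gaze count {count}, duration color RGB {np.round(rgb, 2)}")
```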

 
