
Eye-Tracking for Drivers’ Visual Behavior

Impacts of Work Zone Traffic Signage Devices and Environment Complexity on Drivers’ Visual Behavior and Worker Safety

Ph.D. student: Adekunle Adebisi, CEAS – Civil & Arch Eng & Const Mgmt

Undergraduate student: Nathan Deininger

Faculty: Ming Tang

The objective of this study is to investigate the safety of roadway workers under varying environmental and work zone conditions. To this end, a driving simulator-based experiment is proposed to evaluate drivers’ visual attention across various work zone scenarios using eye-tracking technology.

Grants

  • Using Eye-Tracking to Study the Effectiveness of Visual Communication. UHP Discovery funding, University Honors Program, UC. $5,000. Faculty advisor. 2021.
  • Adekunle Adebisi (Ph.D. student at the College of Engineering and Applied Science) applied for and received a $3,200 Emerging Fellowship Award from the Academic Advisory Council for Signage Research and Education (AACSRE).

 


Paper accepted at CAADRIA conference

Ming Tang’s paper From agent to avatar: Integrate avatar and agent simulation in the virtual reality for wayfinding has been accepted at the Association for Computer-Aided Architectural Design Research in Asia (CAADRIA) 2018 conference in Beijing, China. The paper describes a study that uses immersive virtual reality (VR) technology to analyze user behavior related to wayfinding, integrating it with multi-agent simulation and space syntax. Starting from a theoretical framework, the author discusses the constraints of agent-based simulation (ABS) and space syntax in constructing micro-level interactions within a simulated environment. The author then focuses on how cognitive behavior and spatial knowledge can be achieved with a player-controlled avatar responding to other computer-controlled agents in a VR environment. The multi-phase approach begins by defining the Avatar Agent VR system (AAVR), which captures an avatar’s movement in real time, forms spatial data from it, and then visualizes the data with various representation methods. Combined with space syntax and ABS, AAVR can examine avatars’ wayfinding behavior in relation to gender, spatial recognition level, and spatial features such as light, sound, and architectural elements.
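The core data-capture idea — sampling an avatar’s position over time and turning the trace into spatial data for visualization — can be illustrated with a minimal sketch. This is not the AAVR implementation; the class, cell size, and sample coordinates below are hypothetical, standing in for whatever logging the actual VR system performs.

```python
# Illustrative sketch (not the AAVR system itself): bin an avatar's
# sampled floor-plane positions into an occupancy grid, a simple way
# to turn a wayfinding trace into data for a heatmap visualization.
from dataclasses import dataclass, field

@dataclass
class TrajectoryGrid:
    width: float            # floor-plan extent along x, in meters (assumed)
    depth: float            # floor-plan extent along y, in meters (assumed)
    cell: float = 1.0       # grid cell size in meters
    counts: dict = field(default_factory=dict)  # (ix, iy) -> visit count

    def record(self, x: float, y: float) -> None:
        """Record one sampled avatar position inside the floor plan."""
        if 0 <= x < self.width and 0 <= y < self.depth:
            key = (int(x // self.cell), int(y // self.cell))
            self.counts[key] = self.counts.get(key, 0) + 1

    def hottest_cell(self):
        """Return the most-visited cell, or None if nothing was recorded."""
        if not self.counts:
            return None
        return max(self.counts, key=self.counts.get)

# Hypothetical position samples, as a VR engine might emit each frame.
grid = TrajectoryGrid(width=10.0, depth=10.0, cell=2.0)
for x, y in [(1.0, 1.0), (1.5, 1.2), (8.0, 8.0)]:
    grid.record(x, y)
print(grid.hottest_cell())  # prints (0, 0): two samples fall in that cell
```

In a real setup the per-cell counts would feed a heatmap or be cross-referenced with space syntax measures; here the grid simply identifies where the avatar dwelt longest.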

Check out the full paper here:

Tang, M. From agent to avatar: Integrate avatar and agent simulation in the virtual reality for wayfinding. Proceedings of the 23rd International Conference of the Association for Computer-Aided Architectural Design Research in Asia (CAADRIA). Beijing, China. 2018.