Article in IJSW

Ming Tang’s paper “Analysis of Signage Using Eye-Tracking Technology” was published in the Interdisciplinary Journal of Signage and Wayfinding, February 2020.

Abstract

Signs, in all their forms and manifestations, provide visual communication for wayfinding, commerce, and public dialogue and expression. Yet, how effectively a sign communicates and ultimately elicits a desired reaction begins with how well it attracts the visual attention of prospective viewers. This is especially the case for complex visual environments, both outside and inside of buildings. This paper presents the results of an exploratory research design to assess the use of eye-tracking (ET) technology to explore how placement and context affect the capture of visual attention. Specifically, this research explores the use of ET hardware and software in real-world contexts to analyze how visual attention is impacted by location and proximity to geometric edges, as well as elements of contrast, intensity against context, and facial features. Researchers also used data visualization and interpretation tools in augmented reality environments to anticipate human responses to alternative placement and design. Results show that ET methods, supported by screen-based and wearable eye-tracking technologies, can provide results that are consistent with previous research on signage performance using static images in terms of cognitive load and legibility, and that ET technologies offer an advanced dynamic tool for the design and placement of signage.


ACKNOWLEDGMENT
The research project is supported by the Strategic Collaborative/Interdisciplinary Award of the University of Cincinnati. Thanks also to Professor Christopher Auffrey and to students from ARCH7014 (Fall 2019), ARCH8001 (Spring 2019), and ARCH4001 (Fall 2018) at the University of Cincinnati.

For more information on the wearable ET, screen-based ET, and VR-ET, please check out our research website, or contact Prof. Tang.

 

Digital Twin of Ohio Highways: Training Simulation for Snowplow Drivers

Ming Tang led the team that constructed six large-scale digital twins of Ohio highways for snowplow drivers. The road models, which total 70 miles and span three Ohio counties, were built from real-site GIS and topographic (TOPO) data.

“Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers”. Phase 2. Ohio Department of Transportation. PI: John Ash. Co-PI: Ming Tang, Frank Zhou, Mehdi Norouzi, $952,938. Grant #: 1014440. ODOT 32391 / FHWA Period: 01.2019-03.2022.

“Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers”. Phase 1. Ohio Department of Transportation. PI: Jiaqi Ma. Co-PI: Ming Tang, Julian Wang. $39,249. ODOT 32391 / FHWA. Period: 01.2018-01.2019.

 

 

Lorain County, District 3.

Hocking Mountain, District 10.

City of Columbus, South (A).

City of Columbus, West (B).

City of Columbus, North (C).

 

Publications

Raman, M., Tang, M. 3D Visualization Development of Urban Environments for Simulated Driving Training and VR Development in Transportation Systems. ASCE ICTD 2023 Conference, Austin, TX. 06.2023.

Paper at Artificial Realities Conference

 

Cyber-Physical Experiences: Architecture as Interface

Turan Akman and Ming Tang’s paper “Cyber-Physical Experiences: Architecture as Interface” was presented at the Artificial Realities: Virtual as an Aesthetic Medium for Architectural Ideation Symposium in Lisbon, Portugal, 2019.

Abstract:

Conventionally, architects have relied on qualities of elements such as materiality, light, solids and voids, etc. to break out of the static nature of space and enhance the way users experience and perceive architecture. Even though some of these elements and methods helped create more dynamic spaces, architecture is still bound by conventional constraints of the discipline. With the introduction of technologies such as augmented reality (AR), it is becoming easier to blend digital and physical realities and create new types of spatial qualities and experiences, especially when AR is combined with virtual reality (VR) early in the design process. Even though these emerging technologies cannot replace the primary and conventional qualitative elements in architecture, they can be used to supplement and enhance the experience and qualities architecture provides.

To explore how AR can enhance the way architecture is experienced and perceived, and how VR can be used to enhance the effects of these AR additions, the authors proposed a hybrid museum that integrated AR with conventional analog methods (e.g., materiality, light) to mediate spatial experiences. To evaluate the proposed space, the authors also created a VR walkthrough and collected quantifiable data on the spatial effects of these AR additions.

Akman, T., Tang, M. Cyber-Physical Experiences: Architecture as Interface. Artificial Realities: Virtual as an Aesthetic Medium for Architectural Ideation Symposium, Lisbon, Portugal. 2019.

 

An interdisciplinary approach to using eye-tracking technology for design and behavioral analysis

A Tobii eye-tracking workshop was offered at CGC, DAAP, UC in 2019.

Eye-tracking devices measure eye position and eye movement, allowing documentation of how environmental elements draw the attention of viewers. In recent years, there have been a number of significant breakthroughs in eye-tracking technology, with a focus on new hardware and software applications. Wearable eye-tracking glasses, together with advances in video capture and virtual reality (VR), provide advanced tracking capability at greatly reduced prices. Given these advances, eye trackers are becoming important research tools in the fields of visual systems, design, psychology, wayfinding, cognitive science, and marketing, among others. Currently, eye-tracking technologies are not readily available at UC, perhaps because of a lack of familiarity with this new technology and how it can be used for teaching and research, or the perception of a steep learning curve for its application.
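To illustrate the kind of data these devices produce, raw gaze samples are commonly reduced to fixations with a dispersion-threshold (I-DT) filter, the general approach behind the fixation filters in analysis tools such as Tobii Pro Lab. The sketch below is a minimal, generic version of that idea; the sample rate, thresholds, and data are illustrative assumptions, not vendor defaults.

```python
# Minimal dispersion-threshold (I-DT) fixation filter.
# A fixation is a window of gaze samples whose spread stays small
# for at least a minimum duration. Thresholds here are hypothetical.

def dispersion(window):
    """Spread of a gaze window: (max x - min x) + (max y - min y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=0.05, min_duration=0.1, dt=1/60):
    """samples: list of (x, y) gaze points in normalized screen coordinates,
    recorded at a fixed interval dt (seconds).
    Returns a list of (start_index, end_index, centroid_x, centroid_y)."""
    fixations = []
    min_len = int(min_duration / dt)  # samples needed for a fixation
    win_start = 0
    while win_start < len(samples):
        end = win_start + min_len
        if end > len(samples):
            break
        if dispersion(samples[win_start:end]) <= max_dispersion:
            # Grow the window while the dispersion stays under threshold.
            while end < len(samples) and \
                    dispersion(samples[win_start:end + 1]) <= max_dispersion:
                end += 1
            window = samples[win_start:end]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((win_start, end - 1, cx, cy))
            win_start = end  # continue after this fixation
        else:
            win_start += 1  # slide past a moving (saccade) sample
    return fixations
```

For example, a synthetic trace that holds steady at one point and then sweeps across the screen yields a single fixation covering the steady samples, with its centroid at that point. Real pipelines add noise filtering and velocity-based (I-VT) alternatives, but the dispersion idea above is the core.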

It has become clear to our UC faculty team that research and teaching will significantly benefit from utilizing these cutting-edge tools. It is also clear that a collective approach to acquiring the eye-tracking hardware and software, and training faculty on its application will ultimately result in greater faculty collaboration with its consequent benefits of interdisciplinary research and teaching.

The primary goals of the proposed project are to provide new tools for research and teaching that benefit from cutting-edge eye-tracking technologies involving interdisciplinary groups of UC faculty and students. The project will enhance the knowledge base among faculty and allow new perspectives on how to integrate eye-tracking technology. It will promote interdisciplinarity across the broader UC community.

Grant:

  • “Eye-tracking technology”. Provost Group / Interdisciplinary Award. $19,940. PI: Tang. Co-PI: Auffrey, Hildebrandt. UC. Faculty team: Adam Kiefer, Michael A. Riley, Julian Wang, Jiaqi Ma, Juan Antonio Islas Munoz, Joori Suh. 2018.
  • DAAP matching grant, $11,000.

Hardware: Tobii Pro Glasses, Tobii Pro VR

Software: Tobii Pro Lab, Tobii VR analysis

Test with Tobii Pro Glasses

Hybrid Construction

A hybrid construction using a HoloLens AR model overlaid on the physical structure. The second half of the video was captured through the MS HoloLens. However, due to the low visibility of the holographic image in direct sunlight, we were not able to use the AR model to guide installation. Research to be continued.

Installation. SAID, DAAP, University of Cincinnati
Base structure by first-year SAID students.
Add-on structure + Augmented Reality by ARCH3014 students.

GA: Robert Peebles, Lauren Meister, Damario Walker-Brown, Jordan Sauer, Daniel Anderi. Faculty: Ming Tang

Video captured with a 360° camera, MS HoloLens, and Fologram. Check out the full installation images here.