Rural Mobile Living

Image credit: Portable Disaster Relief, winner of the 2020 DAAPworks Director's DAAPcares Award. Students: Noah Nicolette, Jamie Waugaman, Travis Rebsch.

ARCH 4002. Spring 2020
SAID DAAP, University of Cincinnati

Using a ten-mile rural stretch along I-90 in Lorain County, Ohio as its site, the Rural Mobile Living studio investigates rural mobility, with an emphasis on architecture as infrastructure and its connection to means of transportation. Working closely with the Vehicle Design Studio in the School of Design, the research aims to bring the potential of self-driving cars, smart technology, artificial intelligence, and machine learning into the architectural design process, and to address problems such as poverty, lack of transportation, and under-developed infrastructure. Ultimately, the studio builds on the strengths of both vehicle design and architectural methods to explore design solutions for five rural scenarios: shared living, working homeless, digital nomad, disaster relief, and tourism/recreation.

Faculty: Ming Tang
Students: Nick Chism, Maddie Cooke, Amy Cui, Noah Nicolette, Travis Rebsch, Vu Tran Huy Phi, Kristian Van Wiel, David Wade, Jamie Waugaman, Adam Baca. SAID, DAAP.

Award:

Student award: 2020 DAAPworks Director's DAAPcares Award, for Portable Disaster Relief. Students: Noah Nicolette, Jamie Waugaman, Travis Rebsch.

Collaborator: Vehicle Design studio. Juan Antonio Islas Munoz, School of Design, DAAP.

Acknowledgment

Thanks to Autodesk for supporting this studio with the cloud-based BIM 360 platform.

Demo

  1. Download our game here: "DCM.zip" (1.2 GB). Password: "daapworks@2020".
  2. Unzip the archive and run the .exe file.

 

How to use the interface

  • Press M to toggle the menu.
  • Use the W, A, S, D keys or the arrow keys to move/drive.
  • Press C to switch the camera between first-person and third-person views.
  • Press the space bar to stop a car.
  • Press E to get in or out of a car.
  • Press F to toggle flying mode, then use Q and Z to fly up and down (host server or single-player mode only).
  • Walk into the "glowing green box" to teleport.
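For readers curious how such a control scheme is typically wired up, the bindings above could be expressed as engine input mappings. The fragment below is a hypothetical sketch in Unreal-style `DefaultInput.ini` syntax; the engine choice and all action/axis names are assumptions for illustration, not taken from the actual project.

```ini
; Hypothetical input-mapping fragment mirroring the controls above.
; Action mappings fire once per key press; axis mappings report a
; continuous value scaled by Scale.
+ActionMappings=(ActionName="ToggleMenu",Key=M)
+ActionMappings=(ActionName="SwitchCamera",Key=C)
+ActionMappings=(ActionName="StopCar",Key=SpaceBar)
+ActionMappings=(ActionName="EnterExitCar",Key=E)
+ActionMappings=(ActionName="ToggleFlyMode",Key=F)
+AxisMappings=(AxisName="MoveForward",Key=W,Scale=1.0)
+AxisMappings=(AxisName="MoveForward",Key=S,Scale=-1.0)
+AxisMappings=(AxisName="MoveRight",Key=D,Scale=1.0)
+AxisMappings=(AxisName="MoveRight",Key=A,Scale=-1.0)
+AxisMappings=(AxisName="FlyUp",Key=Q,Scale=1.0)
+AxisMappings=(AxisName="FlyUp",Key=Z,Scale=-1.0)
```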

For Multi-player game

A. Set up Steam on your computer

  1. Create a Steam account and install Steam on your computer.
  2. Run the Steam program.
  3. Add Ming Tang as a friend. Friend code: "301687106".

B. Join a multiplayer session.

  1. Choose the "Find Games" option and make sure "Internet" is selected, not "LAN".
  2. Once an open session is found, single-click its name (not double-click) to join the game.
  3. Use "Shift + Tab" to open the Steam overlay; ask questions in the Steam chat room.

Article in IJSW Journal

Ming Tang's paper "Analysis of Signage using Eye-Tracking Technology" was published in the Interdisciplinary Journal of Signage and Wayfinding, February 2020.

Abstract

Signs, in all their forms and manifestations, provide visual communication for wayfinding, commerce, and public dialogue and expression. Yet, how effectively a sign communicates and ultimately elicits a desired reaction begins with how well it attracts the visual attention of prospective viewers. This is especially the case for complex visual environments, both outside and inside of buildings. This paper presents the results of an exploratory research design to assess the use of eye-tracking (ET) technology to explore how placement and context affect the capture of visual attention. Specifically, this research explores the use of ET hardware and software in real-world contexts to analyze how visual attention is impacted by location and proximity to geometric edges, as well as elements of contrast, intensity against context, and facial features. Researchers also used data visualization and interpretation tools in augmented reality environments to anticipate human responses to alternative placement and design. Results show that ET methods, supported by the screen-based and wearable eye-tracking technologies, can provide results that are consistent with previous research of signage performance using static images in terms of cognitive load and legibility, and ET technologies offer an advanced dynamic tool for the design and placement of signage.

Issue

Acknowledgment
The research project is supported by the Strategic Collaborative/Interdisciplinary Award of the University of Cincinnati. Thanks for the support from Professor Christopher Auffrey and from the students of ARCH 7014 (Fall 2019), ARCH 8001 (Spring 2019), and ARCH 4001 (Fall 2018) at the University of Cincinnati.

For more information on the wearable ET, screen-based ET, and VR-ET, please check out our research website, or contact Prof. Tang.

 

Digital Twin of Ohio Highways: Training Simulation for Snowplow Drivers

Ming Tang led the team that constructed six large-scale digital twins of Ohio highways for snowplow-driver training. The road models, totaling 70 miles across three Ohio counties, were built from real-site GIS and topographic (TOPO) data.

“Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers”. Phase 2. Ohio Department of Transportation. PI: John Ash. Co-PI: Ming Tang, Frank Zhou, Mehdi Norouzi, $952,938. Grant #: 1014440. ODOT 32391 / FHWA Period: 01.2019-03.2022.

“Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers”. Phase-1. Ohio Department of Transportation. PI: Jiaqi Ma. Co-PI: Ming Tang, Julian Wang. $39,249. ODOT 32391 / FHWA. Period: 01.2018- 01.2019.

 

 

Lorain County. District 3.

Hocking Mountain. District 10.

City of Columbus-South. A

City of Columbus-West. B

City of Columbus-North. C

Publications

Raman, M., Tang, M. 3D Visualization Development of Urban Environments for Simulated Driving Training and VR Development in Transportation Systems. ASCE ICTD 2023 Conference. Austin, TX. 06. 2023

Paper at Artificial Realities Conference

 

Cyber-Physical Experiences: Architecture as Interface

Turan Akman and Ming Tang's paper "Cyber-Physical Experiences: Architecture as Interface" was presented at the Artificial Realities: Virtual as an Aesthetic Medium for Architectural Ideation Symposium in Lisbon, Portugal, 2019.

Abstract:

Conventionally, architects have relied on qualities of elements such as materiality, light, solids and voids, etc. to break out of the static nature of space and enhance the way users experience and perceive architecture. Even though some of these elements and methods helped create more dynamic spaces, architecture is still bound by conventional constraints of the discipline. With the introduction of technologies such as augmented reality (AR), it is becoming easier to blend digital and physical realities and create new types of spatial qualities and experiences, especially when AR is combined with virtual reality (VR) early in the design process. Even though these emerging technologies cannot replace the primary and conventional qualitative elements in architecture, they can be used to supplement and enhance the experience and qualities architecture provides.

To explore how AR can enhance the way architecture is experienced and perceived, and how VR can be used to enhance the effects of these AR additions, the authors proposed a hybrid museum that integrated AR with conventional analog methods (e.g., materiality, light) to mediate spatial experiences. To evaluate the proposed space, the authors also created a VR walkthrough and collected quantifiable data on the spatial effects of these AR additions.

Akman, T., Tang, M. Cyber-Physical Experiences: Architecture as Interface. Artificial Realities: Virtual as an Aesthetic Medium for Architectural Ideation Symposium, Lisbon, Portugal. 2019

 

An interdisciplinary approach to using eye-tracking technology for design and behavioral analysis

Eye-tracking Tobii workshop offered at CGC, DAAP, UC. 2019.

Eye-tracking devices measure eye position and eye movement, allowing documentation of how environmental elements draw the attention of viewers. In recent years, there have been a number of significant breakthroughs in eye-tracking technology, with a focus on new hardware and software applications. Wearable eye-tracking glasses, together with advances in video capture and virtual reality (VR), provide advanced tracking capability at greatly reduced prices. Given these advances, eye trackers are becoming important research tools in the fields of visual systems, design, psychology, wayfinding, cognitive science, and marketing, among others. Currently, eye-tracking technologies are not readily available at UC, perhaps because of a lack of familiarity with this new technology and how it can be used for teaching and research, or the perception of a steep learning curve for its application.

It has become clear to our UC faculty team that research and teaching will significantly benefit from utilizing these cutting-edge tools. It is also clear that a collective approach to acquiring the eye-tracking hardware and software, and training faculty on its application will ultimately result in greater faculty collaboration with its consequent benefits of interdisciplinary research and teaching.

The primary goals of the proposed project are to provide new tools for research and teaching that benefit from cutting-edge eye-tracking technologies, involving interdisciplinary groups of UC faculty and students. The project will enhance the knowledge base among faculty and allow new perspectives on how to integrate eye-tracking technology. It will promote interdisciplinarity across the broader UC community.

Grant:

  • "Eye-tracking technology". Provost Group/Interdisciplinary Award. $19,940. PI: Tang. Co-PI: Auffrey, Hildebrandt. UC. Faculty team: Adam Kiefer, Michael A. Riley, Julian Wang, Jiaqi Ma, Juan Antonio Islas Munoz, Joori Suh. 2018
  • DAAP matching grant: $11,000.

Hardware: Tobii Pro Glasses, Tobii Pro VR

Software: Tobii Pro Lab, Tobii VR analysis

Test with Tobii Pro Glasses