Paper at Artificial Realities Conference

 

Cyber-Physical Experiences: Architecture as Interface

Turan Akman and Ming Tang’s paper, Cyber-Physical Experiences: Architecture as Interface, was presented at the Artificial Realities: Virtual as an Aesthetic Medium for Architectural Ideation Symposium in Lisbon, Portugal, in 2019.

Abstract:

Conventionally, architects have relied on the qualities of elements such as materiality, light, and solids and voids to break out of the static nature of space and enhance the way users experience and perceive architecture. Even though some of these elements and methods have helped create more dynamic spaces, architecture is still bound by the conventional constraints of the discipline. With the introduction of technologies such as augmented reality (AR), it is becoming easier to blend digital and physical realities and to create new types of spatial qualities and experiences, especially when AR is combined with virtual reality (VR) early in the design process. Even though these emerging technologies cannot replace the primary, conventional qualitative elements of architecture, they can supplement and enhance the experiences and qualities architecture provides.

To explore how AR can enhance the way architecture is experienced and perceived, and how VR can be used to enhance the effects of these AR additions, the authors proposed a hybrid museum that integrates AR with conventional analog methods (e.g., materiality, light) to mediate spatial experiences. To evaluate the proposed space, the authors also created a VR walkthrough and collected quantifiable data on the spatial effects of the AR additions.

Akman, T., Tang, M. Cyber-Physical Experiences: Architecture as Interface. Artificial Realities: Virtual as an Aesthetic Medium for Architectural Ideation Symposium, Lisbon, Portugal, 2019.

 

An interdisciplinary approach to using eye-tracking technology for design and behavioral analysis

Eye-tracking Tobii workshop offered at CGC, DAAP, UC. 2019.

Eye-tracking devices measure eye position and eye movement, allowing documentation of how environmental elements draw the attention of viewers. In recent years, there have been a number of significant breakthroughs in eye-tracking technology, with a focus on new hardware and software applications. Wearable eye-tracking glasses, together with advances in video capture and virtual reality (VR), provide advanced tracking capability at greatly reduced prices. Given these advances, eye trackers are becoming important research tools in the fields of visual systems, design, psychology, wayfinding, cognitive science, and marketing, among others. Currently, eye-tracking technologies are not readily available at UC, perhaps because of a lack of familiarity with this new technology and how it can be used for teaching and research, or because of the perceived steep learning curve for its application.
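To illustrate the kind of analysis these devices enable, raw gaze samples are typically reduced to fixations before attention is measured. Below is a minimal dispersion-threshold (I-DT) fixation-detection sketch in Python. It is illustrative only: the sample format and thresholds are assumptions, not the Tobii Pro Lab export format or API.

```python
def dispersion(window):
    """Dispersion of a window of (x, y) gaze samples:
    (max x - min x) + (max y - min y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(gaze, max_dispersion=0.05, min_length=5):
    """Group consecutive normalized gaze samples into fixations.

    A window is a fixation when its dispersion stays below
    max_dispersion and it spans at least min_length samples.
    Returns a list of (start_index, end_index, centroid) tuples.
    """
    fixations = []
    start = 0
    while start + min_length <= len(gaze):
        end = start + min_length
        if dispersion(gaze[start:end]) > max_dispersion:
            start += 1          # no fixation here; slide the window
            continue
        # Grow the window while dispersion stays within threshold.
        while end < len(gaze) and dispersion(gaze[start:end + 1]) <= max_dispersion:
            end += 1
        xs = [p[0] for p in gaze[start:end]]
        ys = [p[1] for p in gaze[start:end]]
        fixations.append((start, end - 1,
                          (sum(xs) / len(xs), sum(ys) / len(ys))))
        start = end
    return fixations
```

For example, six samples clustered at one point, followed by six at another, yield two fixations separated by a saccade. Real wearable-tracker data would also carry timestamps and validity codes, which a production pipeline must handle.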

It has become clear to our UC faculty team that research and teaching will significantly benefit from utilizing these cutting-edge tools. It is also clear that a collective approach to acquiring the eye-tracking hardware and software, and training faculty on its application will ultimately result in greater faculty collaboration with its consequent benefits of interdisciplinary research and teaching.

The primary goal of the proposed project is to provide new tools for research and teaching that benefit from cutting-edge eye-tracking technologies, involving interdisciplinary groups of UC faculty and students. The project will enhance the knowledge base among faculty and open new perspectives on how to integrate eye-tracking technology. It will also promote interdisciplinarity across the broader UC community.

Grant:

  • “Eye-tracking technology”. Provost Group / Interdisciplinary Award. $19,940. PI: Tang. Co-PIs: Auffrey, Hildebrandt. UC. Faculty team: Adam Kiefer, Michael A. Riley, Julian Wang, Jiaqi Ma, Juan Antonio Islas Munoz, Joori Suh. 2018
  • DAAP matching grant. $11,000.

Hardware: Tobii Pro Glasses, Tobii Pro VR

Software: Tobii Pro Lab, Tobii VR analysis

Test with Tobii Pro Glasses

Hybrid Construction

A hybrid construction using a HoloLens AR model overlaid on the physical structure. The second half of the video was captured through the MS HoloLens. However, due to the low visibility of the holographic image under sunlight, we were not able to use the AR model to guide the installation. Research to be continued.

Installation. SAID, DAAP, University of Cincinnati
Base structure by first-year SAID students.
Add-on structure + Augmented Reality by ARCH3014 students.

GA: Robert Peebles, Lauren Meister, Damario Walker-Brown, Jordan Sauer, Daniel Anderi. Faculty: Ming Tang

Video captured by a 360 camera, MS HoloLens, and Fologram. Check out the full installation images here.

 

Uptown Cincinnati Urban Mobility Studio

Studio Brief

Following the success of the Fall 2018 Urban Mobility studio, the Spring 2019 studio uses Cincinnati's Uptown and the proposed Smart Corridor as its focus area, presenting a study of urban mobility with an emphasis on simulated human behavior cues and movement information as input parameters. The research is defined as a hybrid method that seeks logical architectural/urban forms and analyzes their performance. As part of a seven-course cluster supported by UC Forward, the studio project extends the urban mobility study by exploring, collecting, analyzing, and visualizing spatial information and generating computational forms through virtual reality, eye-tracking, and stress-analysis technologies.

The course project was presented at the Uptown Innovation Transportation Corridor Forum on 04.31.2019, which showcased students’ smart transportation projects from courses in transportation engineering, urban planning, and architecture. Please check out the Uptown Corridor storymap webpage for the outcomes of other UC courses.

SAID faculty: Ming Tang. NCARB, RA, LEED AP.

SAID Students: Alan Bossman, Shreya Jasrapuria, Grant Koniski, Jianna Lee, Josiah Ebert, Taylour Upton, Kevin Xu, Yining Fang, Ganesh Raman, Nicole Szparagowski. TA: Niloufar Kioumarsi

Faculty team: DAAP SOP: Na Chen, Xinhao Wang; DAAP SAID: Ming Tang; CEAS Civil Engineering: Heng Wei, Jiaqi Ma. Download the final report (113 pages, PDF).

Selected student projects

Final report of SAID student projects (PDF, 5 MB). Check out more rendering images at the course library.

Example of VR Walkthrough (Windows OS)

Designed by Tylour Upton, MArch, SAID, DAAP, UC. Download the real-time walkthrough here (2 GB ZIP file).

Unzip the files, then double-click the .exe file to run it under Windows.

Walkthrough Instruction:

  • Navigation: A, S, W, D
  • Fly: F (toggle on/off)
  • Fly up: Q
  • Fly down: Z
  • First-person camera control: C (toggle on/off)
  • Jump: space bar
  • Get in/out of the truck: E
  • Drive the truck: A, S, W, D
  • Turn on the truck lights: L

 

AR based Digi_Fab

Augmented Reality for Digital Fabrication.  Projects from SAID, DAAP, UC. Fall 2018.

HoloLens, Fologram, Grasshopper.

Faculty: Ming Tang, RA, Associate Professor, University of Cincinnati

Students: Alexandra Cole, Morgan Heald, Andrew Pederson, Lauren Venesy, Daniel Anderi, Collin Cooper, Nicholas Dorsey, John Garrison, Gabriel Juriga, Isaac Keller, Tyler Kennedy, Nikki Klein, Brandon Kroger, Kelsey Kryspin, Laura Lenarduzzi, Shelby Leshnak, Lauren Meister, De’Sean Morris, Robert Peebles, Yiying Qiu, Jordan Sauer, Jens Slagter, Chad Summe, David Torres, Samuel Williamson, Dongrui Zhu, Todd Funkhouser.

Project team leads: Jordan Sauer, Yiying Qiu, Robert Peebles, David Torres.

 

Videos of work in progress