Posts

Thesis: Layered Space

This is the thesis book of my graduate student Adam Sambuco: 

Layered Space

Toward an Architecture of Superimposition

by Adam J. Sambuco
University of Cincinnati, 2018

Degree: Master of Architecture

Thesis Chair: Ming Tang

Historically, the physical nature of architecture has caused it to remain functionally static despite evolving theories, materials, and technologies. The design of spaces and the actions of occupants are fundamentally limited by the laws of physics. This thesis and its associated project explore and present ways in which architectural spaces can incorporate extended reality to enhance the design and use of buildings in ways that were not previously possible. Due to their part-physical, part-virtual nature, superimposed spaces can change over time, on demand, or contextually based on their inhabitants. Extended reality can assist with wayfinding, socialization, organization, personalization, contextualization, and more. This thesis asserts that it is essential for architects to familiarize themselves with this technology, exploring new methods of design and presentation for such radically different end products.

It is with this in mind that this document establishes the basic functionality, terminology, and history of extended reality before moving on to more modern capabilities. After a glimpse into the near future of XR and a look at its relationship to architecture, the philosophical basis for treating the virtual as real is explored. Having established its history, functionality, and reality, the idea of spatial superimposition is then explored through the lenses of visitor, designer, and presenter. My previous work is then covered, touching on how XR technology will become normalized in society and investigating an approach to XR renovations that brings virtual mansions to the masses. Finally, my thesis project, an XR-enabled mediatheque in downtown Dallas, is introduced, and my processes of creation, experimentation, and presentation are detailed so that others might learn from and build on them. Despite its large scope and cutting-edge subject matter, this work scrutinizes only a small portion of the changes that extended reality will undoubtedly bring to architecture and greater society.

View the full thesis book (168 pages, 14 MB).

VR show in 2018 DAAPworks

A VR show of Prof. Ming Tang’s architecture studio at UC DAAP, an exhibition mixing virtual reality and augmented reality.

Faculty: Ming Tang, Xiaoying Meng

Students: Gabriel Berning, Bhattiprolu Chamundi Saila Snigdha, Owen Blodgett, Mason Boling, Tyler Dunn, Michael Greer, Isaac Keller, Anna Kick, Connor Kramer, Nathan Mohamedali, Aashna Sharad Poddar, Yiying Qiu, Jordan Sauer, Edward Simpson, Dongrui Zhu

Location: CGC Computer Lab (4425 E), DAAP, UC.

Project: Train Station in Beijing, China. Studio brief.

Exhibition time: 04.24-04.27.

Please check out the 15 project posters here.

Project featured in UC Magazine

Flight of the future

UC students, faculty and industry leaders converge at the Live Well Collaborative to create innovative, internationally recognized technology for Boeing.

The project was nominated as a finalist for the 2018 Crystal Cabin Award and exhibited at the Aircraft Interior Expo at the Hamburg Messe, Germany, 04.10-12.2018.

By Jac Kern, UC Magazine

DAAP professor Ming Tang’s specialty in design visualization, using interactive media like VR and AR to communicate a design concept, made him a perfect fit for this project.

“Sometimes you need a really strong visual to sell an idea,” Tang explains. “We quickly set up a pipeline involving students with graphic design, 3-D modeling and animation skills, scripting and programming as well as user interface. The team assembled some very big ideas into a model people can see and even interact with in VR and AR.”

Read the full article here.

Paper accepted at CAADRIA conference

Ming Tang’s paper From agent to avatar: Integrate avatar and agent simulation in the virtual reality for wayfinding has been accepted at the Association for Computer-Aided Architectural Design Research in Asia (CAADRIA) 2018 conference in Beijing, China. The paper describes a study that uses immersive virtual reality (VR) technology to analyze user behavior related to wayfinding and integrates it with multi-agent simulation and space syntax. Starting with a theoretical framework, the author discusses the constraints of agent-based simulation (ABS) and space syntax in constructing micro-level interactions within a simulated environment. The author then focuses on how cognitive behavior and spatial knowledge can be achieved with a player-controlled avatar in response to other computer-controlled agents in a VR environment. The multi-phase approach starts by defining the Avatar Agent VR system (AAVR), which captures an avatar’s movement in real time to form spatial data and then visualizes that data with various representation methods. Combined with space syntax and ABS, AAVR can examine avatars’ wayfinding behavior in relation to gender, spatial recognition level, and spatial features such as light, sound, and architectural simulations.
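
As a rough, non-authoritative sketch of the capture-and-visualize step described above, the snippet below bins logged avatar floor positions into a 2D occupancy heat-map. The file name, column layout, grid resolution, and use of NumPy/Matplotlib are illustrative assumptions, not the actual AAVR implementation.

```python
# Minimal sketch (illustrative only): turn logged avatar positions into an
# occupancy heat-map. Assumes a hypothetical CSV log with columns time, x, y
# giving floor-plane coordinates in meters, sampled at a fixed rate.
import numpy as np
import matplotlib.pyplot as plt

# Load the assumed position log exported from a VR session.
log = np.genfromtxt("avatar_positions.csv", delimiter=",", names=True)

# Bin positions into a 2D grid; at a fixed sampling rate, the sample count in
# each cell is a proxy for dwell time.
heatmap, xedges, yedges = np.histogram2d(log["x"], log["y"], bins=50)

# Render the heat-map over the floor-plan extents.
plt.imshow(heatmap.T, origin="lower",
           extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]], cmap="hot")
plt.colorbar(label="samples (dwell-time proxy)")
plt.xlabel("x (m)")
plt.ylabel("y (m)")
plt.title("Avatar occupancy heat-map")
plt.show()
```

The same binned grid could also be overlaid on a floor plan or compared against space syntax measures, but those steps are omitted here.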

Check out the full paper here:

Tang, M. From agent to avatar: Integrate avatar and agent simulation in the virtual reality for wayfinding. Proceedings of the 23rd International Conference of the Association for Computer-Aided Architectural Design Research in Asia (CAADRIA). Beijing, China. 2018.

Virtual DAAP

Virtual DAAP extends the discussion to the technological constraints of VR, such as field of vision, peripheral vision, and vestibular indices. The multi-phase approach starts by defining the immersive VR system, which captures a real agent’s movement within a digital environment to form raw data in the cloud, and then visualizes that data with heat-maps and path networks. Combined with these graphs, survey data is also used to compare agents’ wayfinding behavior in relation to gender, spatial recognition level, and spatial features such as light, sound, and architectural simulations.
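
As a similarly hedged sketch of the comparison step, the snippet below computes each participant's walked path length from sampled positions and summarizes it by a survey-derived group label. The trajectories, participant IDs, and grouping variable are made-up placeholders rather than the project's actual data or pipeline.

```python
# Minimal sketch (illustrative only): compare walked path lengths between
# participant groups derived from survey data.
import numpy as np

# Hypothetical trajectories: participant id -> sampled (x, y) positions in meters.
trajectories = {
    "p01": np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.5], [3.5, 1.5]]),
    "p02": np.array([[0.0, 0.0], [0.5, 2.0], [1.0, 4.0], [1.0, 6.5]]),
}
# Hypothetical survey-derived grouping (e.g., self-reported spatial recognition level).
groups = {"p01": "high", "p02": "low"}

def path_length(points):
    """Sum of straight-line distances between consecutive samples."""
    return float(np.linalg.norm(np.diff(points, axis=0), axis=1).sum())

# Aggregate path lengths per group and report a simple summary.
by_group = {}
for pid, pts in trajectories.items():
    by_group.setdefault(groups[pid], []).append(path_length(pts))

for label, lengths in sorted(by_group.items()):
    print(f"{label}: mean path length = {np.mean(lengths):.2f} m (n={len(lengths)})")
```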

More information about Virtual DAAP.