Emotional Architecture in the Digital Age

Emotional Architecture in the Digital Age: Generative AI and Virtual Environments for Human-Centric Design

ARCH 7036-04 / ARCH 5051-04. Elective Theory Seminar, SAID, DAAP, Spring 2024

Faculty: Ming Tang, Professor of Architecture and Interior Design, DAAP; Director of the Extended Reality Lab (XR-Lab)

Gen-AI with Stable Diffusion, by Ming Tang

In his articles The Case for a Feeling Architecture and Architecture Is a Deeply Emotional Experience, Jacob DiCrescenzo emphasizes the importance of designing architecture that prioritizes human emotions over abstract concepts. He advocates for a paradigm shift toward emotional responsiveness in design, creating spaces that enrich occupants’ well-being and experience. By integrating insights from neuroscience, architects can craft impactful, memorable environments, harnessing emotion as a core element of human cognition to make buildings more meaningful and engaging. In Emotional Architecture, Arushi Malhotra describes emotional architecture as a two-fold interaction: cognitive, where users interpret space, and emotional, where they react to it. By carefully manipulating elements such as light, color, and sound, architects can create environments that foster positive emotional responses, thereby enhancing user experience and potentially influencing mood and productivity.

On the other hand, the architecture of illusion, as explored by Richard Haas [1], M.C. Escher, and AI-generated images at Craiyon, creates a world where perspective and spatial perception are manipulated to challenge the viewer’s understanding of reality. These works distort conventional architectural forms, constructing scenes that defy gravity, continuity, and logic. They invite viewers into spaces that feel simultaneously familiar and impossible, where each element appears coherent on its own, yet together they form an intricate, mind-bending puzzle. Through optical illusion, they create a captivating experience that blurs the line between the real and the surreal.

Course Brief:

This seminar focuses on the intersection of emotional architecture, generative AI, and immersive virtual environments to understand how spatial design impacts human perception and emotional response. Students will leverage cutting-edge technologies to examine how elements like color, light, sound, and spatial layout shape neural responses, eye fixation patterns, and overall emotional experiences within architectural spaces.

Through AI-driven simulations and virtual reality, participants will study how specific architectural features influence occupants’ psychological and physiological reactions. Using eye-tracking technology and biometric feedback, students will gain valuable insights into user interactions and emotional responses to various spatial configurations, developing skills to design spaces that resonate emotionally and enhance human well-being.
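As a rough illustration of the analysis side of this workflow, the sketch below aggregates exported gaze fixations into a heatmap that can be overlaid on a rendered view of a space. It is a minimal sketch only: the file name fixations.csv, the columns x, y, and duration_ms, and the screenshot scene.png are assumptions for illustration, and real Tobii exports and VR coordinate mappings vary by setup.

```python
# Minimal sketch: aggregating exported gaze fixations into a heatmap.
# Assumes a hypothetical CSV export ("fixations.csv") with normalized
# columns "x", "y" (0-1, screen space) and "duration_ms"; actual Tobii
# export formats and column names differ by tool and configuration.
import numpy as np
import pandas as pd
from scipy.ndimage import gaussian_filter
import matplotlib.pyplot as plt

WIDTH, HEIGHT = 1920, 1080          # render resolution of the VR capture

fix = pd.read_csv("fixations.csv")  # hypothetical export file

# Accumulate fixation durations into a pixel grid.
heat = np.zeros((HEIGHT, WIDTH))
xs = (fix["x"].clip(0, 1) * (WIDTH - 1)).astype(int).to_numpy()
ys = (fix["y"].clip(0, 1) * (HEIGHT - 1)).astype(int).to_numpy()
np.add.at(heat, (ys, xs), fix["duration_ms"].to_numpy())

# Smooth so each fixation contributes a soft blob rather than a single pixel.
heat = gaussian_filter(heat, sigma=40)

# Overlay on a screenshot of the scene (hypothetical "scene.png").
plt.imshow(plt.imread("scene.png"), extent=(0, WIDTH, HEIGHT, 0))
plt.imshow(heat, cmap="jet", alpha=0.5, extent=(0, WIDTH, HEIGHT, 0))
plt.axis("off")
plt.savefig("gaze_heatmap.png", dpi=150, bbox_inches="tight")
```

Regions where the smoothed durations concentrate indicate where attention dwells, which can then be compared across lighting, color, or layout variants of the same space.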

This course will explore how architectural experiences can evoke and enhance a range of emotions, including happiness, sadness, fear, surprise, love, excitement, calmness, awe, confusion, trust, pride, relief, compassion, loneliness, and anticipation. Additionally, students will build technical skills in visualizing architectural spaces in VR and assessing emotional responses, preparing them to create impactful, emotion-centered environments.

 

Gen-AI with Stable Diffusion, by Ming Tang

Skills covered:

Unreal Engine 5, Generative AI tools

Key Topics:

  • Theories of emotional architecture and human-centered design.
  • Exploration of architectural drawings by Lebbeus Woods and Zaha Hadid, examining style and human emotion.
  • Applications of generative AI in creating adaptive, responsive spatial features.
  • Use of Unreal Engine to create virtual reality and immersive media for simulating and testing architectural impact.
  • Techniques in eye-tracking and biometric measurement to assess engagement and emotional response.
  • Analysis of the influence of spatial elements (color, light, sound, material textures) on mood and neural activity.
  • Ethical considerations and implications of designing for emotional impact.

Eye-tracking heatmap of CVG airport. Tobii VR. 

Learning Outcomes: By the end of this course, students will:

  1. Understand the foundational theories of emotional architecture and its relevance in contemporary design.
  2. Apply generative AI tools to create and adapt architectural designs based on emotional and perceptual data.
  3. Conduct virtual reality, eye-tracking, and biometric experiments to gauge human responses to spatial features.
  4. Critically analyze and discuss the emotional impact of environmental factors in architecture.
  5. Produce a generative AI-based design project incorporating findings on human emotional responses to enhance user experience in virtual and real spaces.

Assessment:
Assessment will be based on a combination of participation in VR-based labs, a midterm report on emotional architecture theory, and a final project integrating generative AI with VR to present an emotionally engaging spatial design prototype.

Course Prerequisites:
This seminar is designed for advanced undergraduate or graduate students in architecture, design, or related fields. Familiarity with basic architectural design concepts and introductory experience in 3D modeling or AI is recommended but not required.

Tool: Unreal Engine 5
Description: Used for creating immersive walkthrough and drive-through experiences in virtual environments.
Example: CVG Airport Renovation; 2023 elective seminar.

Tool: Gen-AI
Description: Utilizes Stable Diffusion with ComfyUI, ControlNet, and a 360-degree LoRA for generating design concepts and spatial visuals.
Example: Blink Exhibition; Spatial Computing ARCH Design Studio; AI and Architecture.

Tool: Eye-Tracking
Description: Tobii VR system for tracking user gaze and interaction within virtual environments to assess wayfinding and emotional response.
Example: Eye-tracking study on signage and wayfinding at CVG.
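To make the Gen-AI workflow above concrete, here is a minimal sketch of an image-conditioned generation step, assuming the Hugging Face diffusers library and public Stable Diffusion and ControlNet checkpoints. The checkpoint names, the input file plan_sketch.png, and the prompt are illustrative only; the seminar's actual ComfyUI graph and 360-degree LoRA are configured separately.

```python
# Minimal sketch: generating a spatial concept image from a line drawing
# with Stable Diffusion + ControlNet via the Hugging Face diffusers library.
# Checkpoints and the input sketch ("plan_sketch.png") are illustrative.
import torch
from PIL import Image
from diffusers import StableDiffusionControlNetPipeline, ControlNetModel

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-scribble", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

# The conditioning image constrains composition; the prompt steers
# material, light, and mood, the levers of emotional response.
condition = Image.open("plan_sketch.png").convert("RGB")
result = pipe(
    prompt="sunlit atrium with warm timber surfaces, calm and welcoming mood",
    negative_prompt="blurry, distorted geometry",
    image=condition,
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]
result.save("concept_atrium.png")
```

Varying only the prompt's material, light, and mood keywords while keeping the conditioning sketch fixed is one way to produce comparable variants for the emotional-response studies described above.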

Weekly Schedule

Week 1. Architecture & Emotions: introduction to emotional architecture theory and its impact on design; exploration of case studies.
Week 2. Gen-AI: using generative AI tools such as Stable Diffusion, ControlNet, and ComfyUI to create spatial concepts based on emotional responses.
Week 3. Virtual Reality: introduction to VR fundamentals, the Unreal Engine UI, and GitHub for project collaboration.
Week 4. VR Content: building VR content in Unreal Engine, focusing on modeling, materials, and lighting to enhance emotional impact. Deliverable: Concept.
Week 5. VR Interactions: developing interactive elements in VR that respond to user movements and actions, enhancing immersion and engagement.
Week 6. Eye-Tracking: introduction to eye-tracking technology to analyze user focus and interaction within VR environments.
Weeks 7-15. Working Time: independent project work to refine VR environments, incorporating feedback loops with eye-tracking data to optimize emotional impact. Deliverable: Prototype.
Week 16. Final Presentation: presentation of the final VR environment, demonstrating emotional and interactive elements to a panel or peers. Deliverable: Final Presentation.

Reading

Appendix

Unreal student projects. CVG

Stable Diffusion with ControlNet

Stable Fast 3D with ComfyUI

 

 

UHP seminar 2025

ARCH 3051 Spring 2025

UHP. Extended Reality (XR) Technologies for Real-World Problem Solving

Call for UC Honors Students.

Ming Tang
Director, Extended Reality Lab (XR-Lab)
Professor, School of Architecture and Interior Design, College of Design, Architecture, Art, and Planning, University of Cincinnati
Institute for Research in Sensing (IRiS); Industry 4.0 & 5.0 Institute (I45I)
Class Time: Tue/Thu 12:30–1:50 pm, 4425 E. CGC Lab at DAAP / on Teams
Office Hours: Tue/Thu 11:00 am–noon; Friday 11:00 am–noon on Teams

Course Description:

This seminar course delves into the practical applications of Extended Reality (XR) technologies to address real-world challenges, emphasizing the convergence of human-computer interaction (HCI) and immersive visualization tools such as Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), and the metaverse. Students will create novel embodied experiences and explore human perception, emotion, and reaction through analysis. Through ideation and prototyping, the course provides a solid foundation in theoretical frameworks and hands-on skills essential for XR development. The course aims to leverage gamification for skill and knowledge acquisition and immersive storytelling to foster empathy. Additionally, students will gain experience with advanced VR technology at UC’s Digital Futures building, with opportunities to undertake individual or collaborative projects tackling global challenges, prioritizing innovation, empathy, and inclusivity.

Software: Unreal Engine. This course is designed for non-STEM students, with no prior programming or 3D modeling experience required. AI tools and digital assets will be provided to aid content creation. STEM students are also welcome and encouraged to further develop their programming and modeling skills through Unreal Engine.

Hardware: Oculus Quest 2, Quest 3, and Quest Pro.

Course Goals:

By the end of this course, students will have developed a strong foundation in XR theories and research methodologies, with core topics including:

  • Historical and Theoretical Foundations of XR: Understanding the evolution of XR within the contexts of digital innovation and inclusivity.
  • Project Development: Creating individual projects that address global challenges through XR applications. Past projects have explored areas such as immersive storytelling, drone delivery simulations, music visualization, medical training, and digital heritage preservation.
  • XR for Knowledge and Skill Acquisition: Recognizing the potential of XR to enhance technical and interpersonal skills.
  • Applications in Social Impact: Investigating the use of XR to foster empathy and address issues such as PTSD, xenophobia, anxiety, mental health, and other pressing social challenges.

Through experiential learning and reflective practices, students will demonstrate proficiency in key XR development skills, including:

  • Ideation and Concept Development: Using storyboarding and visualization techniques to develop concepts.
  • Content Creation with Generative AI: Leveraging AI tools to produce digital content.
  • VR Experience Design: Focusing on gamification to enhance learning outcomes.
  • Entrepreneurship in XR: Exploring business and innovation opportunities within the XR industry.

Required Text(s): None
Recommended Text(s): Selected online lectures and readings

Students are encouraged to explore XR products and publications throughout their research. These resources will serve as both inspiration and motivation. The instructor will support students in experimenting with techniques beyond the standard curriculum, fostering a deeper and more innovative engagement with XR.

Weekly Schedule

Week 1. XR History and Theory. Skills: Unreal Engine UI, GitHub.
Week 2. VR for Knowledge and Skill Acquisition. Skills: visualization.
Week 3. Digital Assets. Skills: Gen-AI, 3D asset library. Deliverable: Concept.
Week 4. Storytelling. Skills: UX & interactions.
Week 5. Empathy & Inclusion.
Week 6. Independent Work or Additional Topics.
Weeks 7-15. Working Time. Deliverable: Prototype.
Week 16. Final Presentation. Deliverable: Final Presentation.

Details for Each Week

  • Week 1: Introduction to the history and theory of XR. Students will gain foundational skills in Unreal Engine’s UI and GitHub for version control.
  • Week 2: Focus on using VR for skill acquisition. Students will develop visualization skills.
  • Week 3: Create digital assets using generative AI tools and a free 3D asset library. By the end of the week, students will submit a concept as the deliverable.
  • Week 4: Introduction to storytelling in VR. Focus on interactive design and user interface.
  • Week 5: Explore topics of empathy and inclusion within VR environments.
  • Week 6: Potentially a flex week for independent work or exploring additional topics based on student needs.
  • Weeks 7-15: Dedicated time for project work, where students will focus on developing their prototypes.
  • Week 16: Final presentations, where students showcase their completed projects.

Last year's (2024) student projects

Reading List

Tang, Ming, Mikhail Nikolaenko, Evv Boerwinkle, Samuel Obafisoye, Aayush Kumar, Mohsen Rezayat, Sven Lehmann, and Tamara Lorenz. “Evaluation of the Effectiveness of Traditional Training vs. Immersive Training: A Case Study of Building Safety & Emergency Training.” Paper presented at the International Conference on eXtended Reality (XR SALENTO 2024), Lecce, Italy, September 4-9, 2024. Published in the Springer conference proceedings.

Tang, M., Nored, J., Anthony, M., Eschmann, J., Williams, J., Dunseath, L. (2024). VR-Based Empathy Experience for Nonprofessional Caregiver Training. In: De Paolis, L.T., Arpaia, P., Sacco, M. (eds) Extended Reality. XR Salento 2024. Lecture Notes in Computer Science, vol 15028. Springer, Cham. https://doi.org/10.1007/978-3-031-71704-8_28 

Jong, Steffi de. “The Simulated Witness: Empathy and Embodiment in VR Experiences of Former Nazi Concentration and Extermination Camps.” History and Memory: Studies in Representation of the Past 35, no. 1 (March 22, 2023): COV4–COV4. https://muse.jhu.edu/pub/3/article/885269/pdf 

Yildirim, Caglar, and D. Fox Harrell. “On the Plane: A Roleplaying Game for Simulating Ingroup-Outgroup Biases in Virtual Reality.” In 2022 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 207–9, 2022. https://doi.org/10.1109/AIVR56993.2022.00041.

Doyle, Denise. “Empathy VR and the BeAnotherLab Collective.” Virtual Creativity 10, no. 2 (December 1, 2020): 191–200. https://doi.org/10.1386/vcr_00032_1.

García-Betances, Rebeca I., María Teresa Arredondo Waldmeyer, Giuseppe Fico, and María Fernanda Cabrera-Umpiérrez. “A Succinct Overview of Virtual Reality Technology Use in Alzheimer’s Disease.” Frontiers in Aging Neuroscience 7 (May 12, 2015): 80. https://doi.org/10.3389/fnagi.2015.00080.

Gillespie, Ciaran. “Virtual Humanity—Access, Empathy and Objectivity in VR Film Making.” Global Society 34, no. 2 (April 2, 2020): 145–62. https://doi.org/10.1080/13600826.2019.1656173.

Ho-yin Lai, Frank. “Supporting People with Dementia and Their Caregivers’ Everyday Occupations through Home Hazard Identification and Virtual Reality-Based Training.” Alzheimer’s & Dementia 16, no. S6 (December 2020): e036163. https://doi.org/10.1002/alz.036163.

Mouri, Nene, Makoto Sasaki, Taichi Yagimaki, Marie Murakami, Kazuko Igari, and Keiichi Sasaki. “Development of a Training Simulator for Caregivers’ Toothbrushing Skill Using Virtual Reality.” Advanced Biomedical Engineering 12 (2023): 91–100. https://doi.org/10.14326/abe.12.91.

Russell, Vincent, Rachel Barry, and David Murphy. “HAVE Experience: An Investigation into VR Empathy for Panic Disorder.” In 2018 IEEE Games, Entertainment, Media Conference (GEM), 1–9, 2018. https://doi.org/10.1109/GEM.2018.8516461.

Salminen, Mikko, Simo Järvelä, Antti Ruonala, Ville J. Harjunen, Juho Hamari, Giulio Jacucci, and Niklas Ravaja. “Evoking Physiological Synchrony and Empathy Using Social VR With Biofeedback.” IEEE Transactions on Affective Computing 13, no. 2 (April 2022): 746–55. https://doi.org/10.1109/TAFFC.2019.2958657.

Trevena, Lee, Jeni Paay, and Rachael McDonald. “VR Interventions Aimed to Induce Empathy: A Scoping Review.” Virtual Reality 28, no. 2 (March 19, 2024): 80. https://doi.org/10.1007/s10055-024-00946-9.

Wang, Yanyun (Mia), Chen (Crystal) Chen, Michelle R Nelson, and Sela Sar. “Walk in My Shoes: How Perspective-Taking and VR Enhance Telepresence and Empathy in a Public Service Announcement for People Experiencing Homelessness.” New Media & Society 26, no. 7 (July 1, 2024): 3931–50. https://doi.org/10.1177/14614448221108108.

Gallery show at Blink

I’m excited that my work is part of a group exhibition of UC DAAP faculty and student work at the 2024 Cincinnati Blink. The theme of the exhibition is “Artistic Intelligence with AI.”

Thanks to Midwestcon, DisruptionNowDAAP, and the 1819 Innovation Hub for their support.

October 18-20, Cincinnati, OH.

More AI + student work is available at my Spatial Computing ARCH Design Studio.

GRO

Virtual Reality Training on Issues of Youth Firearm Possession.

PI: Tang. $20,000. 8/5/2024-8/4/2025.

Funded by the God.Restoring.Order (GRO) Community, this research project will develop two VR scenarios that simulate environments designed to educate youth on applying critical skills in risky situations.

Team: Ming Tang (XR-Lab); Aaron Mallory (GRO).

The XR-Lab is excited to collaborate with the GRO community to leverage cutting-edge XR technologies to develop a virtual reality (VR) training app that enhances the curriculum by reinforcing key skills through immersive VR activities. Together, we will assess the feasibility of integrating VR technology into the GRO’s training program, engaging users with a compelling narrative while equipping them with practical knowledge for real-world application.

 

        


Redefining Space in the Age of Spatial Computing

ARCH 4001. Fall 2024


Ming Tang
Professor, Registered Architect (NCARB), LEED AP. Director, Extended Reality Lab, UC Digital Futures. Leadership Committee, Institute for Research in Sensing (IRiS).
School of Architecture and Interior Design, College of Design, Architecture, Art, and Planning, University of Cincinnati

In the age of spatial computing, architecture transcends the physical boundaries of bricks and mortar, evolving into an expansive realm that merges physical space with computational technologies. This studio embraces the transformative potential of AI in the architectural design process, expanding physical spaces into the metaverse and positioning architecture at the forefront of our society’s digital revolution. By serving as a platform for spatial experience, architecture redefines our interaction with space and with each other.

This studio invites students to collectively harness the power of AI to create virtual architectural spaces that inspire, connect, and elevate the human spatial experience. Students design a convention center to host various activities, from large conference events like SIGGRAPH and trade shows like CES to small lectures and social events. They research the spatial layout of a convention center and explore the possibilities of migrating these activities to a fully 3D virtual space in the metaverse. The studio also explores the intersections of AI, digital twin, and XR technologies with the metaverse, enabling students to push the boundaries of traditional architectural design. Through this exploration, students gain insights into how digital advancements can enhance and transform spatial experiences, fostering innovation and creativity in the architectural field.

Program

  • Design a large-scale virtual convention center that includes a large auditorium, exhibition space for trade shows, and meeting rooms of various sizes for events, daily use, and collaboration.

Manifesto: Embracing Generative AI in Architecture

In this unprecedented era of generative AI and spatial computing, architecture has transcended its historical limitations of bricks, steel, and mortar. It has evolved into an expansive realm where computation, artificial intelligence, and human creativity converge to shape new dimensions of space, both physical and virtual. This manifesto serves as a call to arms for architects, designers, and creators to embrace this transformative potential, pushing the boundaries of what architecture can be in a world defined by digital revolution.

Architecture is no longer a static entity. It is dynamic, fluid, and capable of adapting to the complexities of the modern world. Generative AI empowers us to reimagine the design process, enabling the creation of spaces that respond, learn, and evolve with their users. AI is not merely a tool; it is a collaborator—infusing intelligence into the heart of design, guiding form, function, and experience in ways that were once unimaginable.

In this new era, physical spaces no longer exist in isolation. They extend into the virtual, into the metaverse, where architecture is redefined as an experience rather than a mere structure. The integration of AI-driven generative systems allows for the seamless blending of these realms, giving rise to environments that are immersive, interactive, and deeply connected to the human condition. Architecture becomes a platform for communication, collaboration, and creativity—transcending time, space, and materiality.

This manifesto invites students, designers, and visionaries to harness the power of AI in the design process. Together, we will forge virtual and hybrid spaces that inspire, connect, and elevate the human experience. We will create architecture that is inclusive, responsive, and capable of transforming the way we interact with space and with each other. In doing so, we position architecture at the forefront of society’s digital transformation, ensuring that it remains not only relevant but essential to our collective future.

Generative AI is not just the future of architecture—it is the present. It is a movement, a paradigm shift that empowers us to craft spaces that are as intelligent as they are beautiful. Let us embrace this moment and seize the opportunity to redefine the built environment, both real and virtual, for generations to come.

Student projects

 

Reference: AI-assisted design workflow