SENSE: Spatial Experiences for Narrative and Sensory Emotions
Studio Overview
This design studio invites students to explore the complex relationship between architecture, human emotion, and experiential design through the conceptualization and design of a Museum of Emotion. Students will investigate how spatial design can evoke, mediate, and communicate emotional states—moving beyond functionality to create environments that resonate on a deep psychological and sensory level. By integrating neuroscience, art, culture, and digital technologies, students will develop speculative proposals for a museum that acts not only as a cultural institution but also as a space of introspection, empathy, and transformation.
Studio Objectives
Understand and interpret the spatial, sensory, and material qualities that influence human emotional responses.
Translate research on emotion into architectural language (form, light, material, scale, sequence, etc.).
Design immersive environments that express or evoke specific emotional states.
Engage interdisciplinary methods (AI, Extended Reality, digital media) to inform spatial experience.
Critically assess cultural, ethical, and therapeutic dimensions of designing for emotion.
Key Questions
How can architectural elements—light, space, materiality, proportion—evoke emotional responses?
What is the role of immersive and interactive technology (VR/AR, AI, biometric feedback) in shaping emotional experiences?
How do cultural, personal, and neurophysiological factors affect emotional perception of space?
How can architecture foster emotional literacy, empathy, and collective memory?
Program
Each student (or team) will design a Museum of Emotion on a site of their choice. The museum must include:
Core Zones (Required):
Emotion Lab: Interactive gallery presenting scientific and technological perspectives on emotion.
Rooms of Emotion: A minimum of three immersive emotional environments (e.g., joy, fear, sadness, awe, love, anger).
Memory Archive: A participatory or data-driven installation space where emotional memories are recorded, interpreted, and displayed.
Cultural Expressions Gallery: A rotating exhibition space focused on how different cultures represent and process emotions.
Optional Programs (Student-Defined):
Workshop or educational spaces
Performance or therapeutic spaces
Café or gathering zone
Outdoor sensory garden or emotional path
Design Tools and Methods
Precedent studies of museums, memorials, and immersive installations
Digital modeling and rendering (with emphasis on atmosphere and mood)
Use of AI-assisted simulations, AIGC, and VR walkthroughs
This paper explores the evolving relationship between Digital Twins (DT) and the Metaverse, two foundational yet often conflated paradigms in digital architecture. While DTs function as mirrored models of real-world systems—integrating IoT, BIM, and real-time analytics to support decision-making—Metaverses are typically fictional, immersive, multi-user environments shaped by social, cultural, and speculative narratives. Through several research projects, the team investigates the divergence between DTs and Metaverses through the lens of their purpose, data structure, immersion, and interactivity, while highlighting areas of convergence driven by emerging technologies in Artificial Intelligence (AI) and Extended Reality (XR). The study examines how these technologies—including Large Language Models (LLMs)—are blurring the traditional boundaries between the two paradigms. By analyzing their divergent purposes, data structures, and interactivity modes, as well as hybrid applications (e.g., data-integrated virtual environments and AI-driven collaboration), this study seeks to define the opportunities and challenges of this integration for architectural design, decision-making, and immersive user experiences. Our research spans multiple projects utilizing XR and AI to develop DTs and the Metaverse. The team assesses the capabilities of AI in DT environments, such as reality capture and smart building management, and concurrently evaluates metaverse platforms for online collaboration and architectural education, focusing on features that facilitate multi-user engagement. The paper presents evaluations of several virtual environment development pipelines, comparing traditional BIM+IoT workflows with novel approaches such as Gaussian Splatting and generative AI for content creation.
The team further explores the integration of Large Language Models (LLMs) in both domains, for example as virtual agents or LLM-powered Non-Player Characters (NPCs) that enable autonomous interaction and enhance user engagement within spatial environments. Finally, the paper argues that the once-distinct boundaries between DTs and the Metaverse are becoming increasingly porous. Hybrid digital spaces—such as virtual buildings with data-integrated twins and immersive, social metaverses—demonstrate this convergence. As digital environments mature, architects are uniquely positioned to shape these dual-purpose ecosystems, leveraging AI, XR, and spatial computing to fuse data-driven models with immersive, user-centered experiences.
Keywords: metaverse; digital twin; extended reality; AI
High-Fidelity Digital Human Modeling and Visualization
This research investigates advanced methods for constructing high-fidelity digital humans, with a focus on both technical innovation and applied use in immersive environments. The project integrates generative artificial intelligence, image-based modeling, and visualization pipelines to advance realism, interactivity, and usability in human–computer interaction.
Aim 1. Conversational Digital Agents Driven by Large Language Models (LLMs). The first aim is to utilize large language models (LLMs) as the core engine for conversational digital humans. By embedding LLM-driven reasoning into virtual agents, the project seeks to create responsive, adaptive, and context-aware “talking agents.” These agents will simulate naturalistic dialogue, provide interactive guidance, and adapt to user needs across diverse scenarios such as education, healthcare training, and collaborative design.
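The interaction pattern described in Aim 1 can be sketched as a simple context-aware dialogue loop. The sketch below is illustrative only: the `llm_reply` backend is a stub standing in for whatever LLM service the project actually uses, and the class and persona names are hypothetical, not part of the research codebase.

```python
# Minimal sketch of an LLM-driven "talking agent" dialogue loop.
# llm_reply() is a PLACEHOLDER for a real LLM call (hosted or local);
# it is stubbed here so the control flow runs without a model.

class TalkingAgent:
    def __init__(self, persona: str):
        # The persona string conditions every reply (e.g., a museum docent).
        self.history = [{"role": "system", "content": persona}]

    def llm_reply(self, messages) -> str:
        # Stub: echoes the last user turn. A real backend would send the
        # full message history to a chat-completion endpoint.
        last_user = messages[-1]["content"]
        return f"(aware of {len(messages)} prior turns) You said: {last_user}"

    def say(self, user_text: str) -> str:
        # Append the user turn, query the model, and retain the reply so
        # the agent stays context-aware across the conversation.
        self.history.append({"role": "user", "content": user_text})
        reply = self.llm_reply(self.history)
        self.history.append({"role": "assistant", "content": reply})
        return reply

agent = TalkingAgent("You are a calm museum guide in the Rooms of Emotion.")
print(agent.say("What does the 'awe' room feel like?"))
print(agent.say("And the 'fear' room?"))
```

Keeping the accumulated history in every call is what lets the agent "adapt to user needs": each reply is conditioned on the entire exchange, not just the latest utterance.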
Aim 2. Photorealistic Skin and Visual Fidelity Through Scanned Data. The second aim focuses on the visual accuracy of digital humans. High-resolution image scans will be processed to reconstruct human skin with detailed fidelity, including surface textures, translucency, and micro-geometric variations. The resulting models are capable of 4K photorealistic rendering, significantly enhancing realism in simulation and visualization. This fidelity is crucial for applications where nuanced perception—such as empathy, trust, or attentiveness—depends on subtle visual cues.
Significance. By combining intelligent conversational capabilities with photorealistic appearance, this research advances the next generation of digital humans. The outcomes will support applications in extended reality (XR), therapeutic and clinical training, collaborative design education, and digital twin environments, where the authenticity of both interaction and appearance directly influences user engagement and effectiveness.
Faculty: Ming Tang, Professor in Architecture and Interior Design, DAAP. Director of the Extended Reality Lab (XR-Lab).
Gen-AI with Stable Diffusion, by Ming Tang
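One way generative AI feeds this kind of work is by conditioning text-to-image prompts on a target emotional state. The sketch below is a hypothetical helper, not the workflow behind the image above: the modifier vocabulary is invented for illustration, and the resulting prompt would be handed to a text-to-image model such as Stable Diffusion.

```python
# Sketch: composing emotion-conditioned prompts for a text-to-image model
# such as Stable Diffusion. The modifier vocabulary is ILLUSTRATIVE, not
# a recipe taken from the studio.

EMOTION_MODIFIERS = {
    "awe": "vast scale, shaft of light, monumental concrete, low vantage point",
    "calm": "soft diffuse daylight, warm wood, muted palette, still water",
    "fear": "narrow corridor, harsh shadows, cold artificial light, distorted perspective",
}

def build_prompt(space: str, emotion: str) -> str:
    # Combine a base spatial description with modifiers tuned toward the
    # target emotional response; unknown emotions fall back to no modifiers.
    modifiers = EMOTION_MODIFIERS.get(emotion, "")
    return f"{space}, {modifiers}, architectural interior, photorealistic"

print(build_prompt("museum atrium", "awe"))
```

Separating the spatial description from the emotional conditioning makes it easy to generate matched image sets: the same atrium rendered as awe-inspiring, calming, or fearful, for side-by-side comparison in studio reviews.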
In his articles The Case for a Feeling Architecture and Architecture Is a Deeply Emotional Experience, Jacob DiCrescenzo emphasizes the importance of designing architecture that prioritizes human emotions over abstract concepts. He advocates for a paradigm shift toward emotional responsiveness in design, creating spaces that enrich occupants’ well-being and experience. By integrating insights from neuroscience, architects can craft impactful, memorable environments, harnessing emotion as a core element of human cognition to make buildings more meaningful and engaging. In Emotional Architecture, Arushi Malhotra describes emotional architecture as a two-fold interaction: cognitive, where users interpret space, and emotional, where they react to it. By carefully manipulating elements such as light, color, and sound, architects can create environments that foster positive emotional responses, thereby enhancing user experience and potentially influencing mood and productivity.
On the other hand, the architecture of illusion, as explored by Richard Haas [1] and M.C. Escher, and in AI-generated images at Craiyon, creates a world where perspective and spatial perception are manipulated to challenge the viewer’s understanding of reality. These artists employ techniques that distort conventional architectural forms, constructing scenes that defy gravity, continuity, and logic. Their work invites viewers into spaces that feel simultaneously familiar and impossible, where each element appears coherent individually, yet together they form an intricate, mind-bending puzzle. Through the use of optical illusions, they create a captivating experience that blurs the line between the real and the surreal.
Course Brief:
This seminar focuses on the intersection of emotional architecture, generative AI, and immersive virtual environments to understand how spatial design impacts human perception and emotional response. Students will leverage cutting-edge technologies to examine how elements like color, light, sound, and spatial layout shape neural responses, eye fixation patterns, and overall emotional experiences within architectural spaces.
Through AI-driven simulations and virtual reality, participants will study how specific architectural features influence occupants’ psychological and physiological reactions. Using eye-tracking technology and biometric feedback, students will gain valuable insights into user interactions and emotional responses to various spatial configurations, developing skills to design spaces that resonate emotionally and enhance human well-being.
This course will explore how architectural experiences can evoke and enhance a range of emotions, including happiness, sadness, fear, surprise, love, excitement, calmness, awe, confusion, trust, pride, relief, compassion, loneliness, and anticipation. Additionally, students will build technical skills in visualizing architectural spaces in VR and assessing emotional responses, preparing them to create impactful, emotion-centered environments.
Theories of emotional architecture and human-centered design.
Exploration of architectural drawings by Lebbeus Woods and Zaha Hadid, examining style and human emotion.
Applications of generative AI in creating adaptive, responsive spatial features
Use of Unreal Engine to create virtual reality and immersive media for simulating and testing architectural impact
Techniques in eye-tracking and biometric measurement to assess engagement and emotional response
Analyzing the influence of spatial elements—color, light, sound, material textures—on mood and neural activity
Ethical considerations and implications of designing for emotional impact
Eye-tracking heatmap of CVG airport. Tobii VR.
Learning Outcomes: By the end of this course, students will:
Understand the foundational theories of emotional architecture and its relevance in contemporary design.
Apply generative AI tools to create and adapt architectural designs based on emotional and perceptual data.
Conduct virtual reality, eye-tracking, and biometrics experiments to gauge human responses to spatial features.
Critically analyze and discuss the emotional impact of environmental factors in architecture.
Produce a generative AI-based design project incorporating findings on human emotional responses to enhance user experience in virtual and real spaces.
Assessment:
Assessment will be based on a combination of participation in VR-based labs, a midterm report on emotional architecture theory, and a final project integrating generative AI with VR to present an emotionally engaging spatial design prototype.
Course Prerequisites:
This seminar is designed for advanced undergraduate or graduate students in architecture, design, or related fields. Familiarity with basic architectural design concepts and introductory experience in 3D modeling or AI is recommended but not required.
Emotional Architecture in the Digital Age. Ming Tang. Published 2024-11-03; updated 2025-04-02.