This paper explores the evolving relationship between Digital Twins (DTs) and the Metaverse, two foundational yet often conflated paradigms in digital architecture. While DTs function as mirrored models of real-world systems, integrating IoT, BIM, and real-time analytics to support decision-making, Metaverses are typically fictional, immersive, multi-user environments shaped by social, cultural, and speculative narratives. Through several research projects, the team investigates the divergence between DTs and Metaverses through the lens of their purpose, data structure, immersion, and interactivity, while highlighting areas of convergence driven by emerging technologies in Artificial Intelligence (AI) and Extended Reality (XR).
This study examines how emerging technologies such as AI, XR, and Large Language Models (LLMs) are blurring the traditional boundaries between DTs and the Metaverse in digital architecture. By analyzing their divergent purposes, data structures, and interactivity modes, as well as hybrid applications (e.g., data-integrated virtual environments and AI-driven collaboration), the study seeks to define the opportunities and challenges this integration poses for architectural design, decision-making, and immersive user experiences.
Our research spans multiple projects that use XR and AI to develop DTs and the Metaverse. The team assesses the capabilities of AI in DT environments, such as reality capture and smart building management. Concurrently, the team evaluates metaverse platforms for online collaboration and architectural education, focusing on features that facilitate multi-user engagement. The paper presents evaluations of various virtual-environment development pipelines, comparing traditional BIM+IoT workflows with novel approaches such as Gaussian Splatting and generative AI for content creation.
The team further explores the integration of LLMs in both domains, for example as virtual agents or LLM-powered Non-Player Characters (NPCs), enabling autonomous interaction and enhancing user engagement within spatial environments. Finally, the paper argues that the once-distinct boundaries between DTs and the Metaverse are becoming increasingly porous. Hybrid digital spaces, such as virtual buildings with data-integrated twins and immersive, social metaverses, demonstrate this convergence. As digital environments mature, architects are uniquely positioned to shape these dual-purpose ecosystems, leveraging AI, XR, and spatial computing to fuse data-driven models with immersive, user-centered experiences.
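The idea of an LLM-powered NPC described above can be sketched in a few lines. The class, method names, and the stand-in `echo_llm` function below are illustrative assumptions, not the team's actual implementation; in practice the injected callable would wrap a real LLM endpoint inside the game engine.

```python
from dataclasses import dataclass, field

@dataclass
class SpatialNPC:
    """A minimal LLM-driven non-player character for a virtual environment."""
    name: str
    persona: str                       # system-style prompt describing the NPC's role
    history: list = field(default_factory=list)

    def respond(self, visitor_utterance: str, llm) -> str:
        # Build the prompt from the persona plus the running dialogue history,
        # then delegate text generation to the injected LLM callable.
        self.history.append(("visitor", visitor_utterance))
        prompt = self.persona + "\n" + "\n".join(
            f"{speaker}: {text}" for speaker, text in self.history)
        reply = llm(prompt)
        self.history.append((self.name, reply))
        return reply

# Placeholder standing in for a real LLM endpoint.
def echo_llm(prompt: str) -> str:
    return "Welcome to the gallery. The atrium is straight ahead."

guide = SpatialNPC(name="Guide", persona="You are a virtual museum tour guide.")
print(guide.respond("Where is the atrium?", echo_llm))
```

Keeping the LLM behind a plain callable makes the NPC engine-agnostic: the same agent logic can drive a character in a DT dashboard or a multi-user metaverse scene.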
Keywords: metaverse; digital twin; extended reality; AI
Faculty: Ming Tang, Professor in Architecture and Interior Design, DAAP. Director of the Extended Reality Lab (XR-Lab).
Gen-AI with Stable Diffusion, by Ming Tang
In his articles The Case for a Feeling Architecture and Architecture Is a Deeply Emotional Experience, Jacob DiCrescenzo emphasizes the importance of designing architecture that prioritizes human emotion over abstract concepts. He advocates a paradigm shift toward emotional responsiveness in design, creating spaces that enrich occupants’ well-being and experience. By integrating insights from neuroscience, architects can craft impactful, memorable environments, harnessing emotion as a core element of human cognition to make buildings more meaningful and engaging. In Emotional Architecture, Arushi Malhotra describes emotional architecture as a two-fold interaction: cognitive, where users interpret space, and emotional, where they react to it. By carefully manipulating elements such as light, color, and sound, architects can create environments that foster positive emotional responses, enhancing user experience and potentially influencing mood and productivity.
On the other hand, the architecture of illusion, as explored by Richard Haas [1] and M.C. Escher, and in AI-generated images from Craiyon, creates a world where perspective and spatial perception are manipulated to challenge the viewer’s understanding of reality. These artists employ techniques that distort conventional architectural forms, constructing scenes that defy gravity, continuity, and logic. Their work invites viewers into spaces that feel simultaneously familiar and impossible, where each element appears coherent individually, yet together they form an intricate, mind-bending puzzle. Through optical illusions, they create a captivating experience that blurs the line between the real and the surreal.
Course Brief:
This seminar focuses on the intersection of emotional architecture, generative AI, and immersive virtual environments to understand how spatial design impacts human perception and emotional response. Students will leverage cutting-edge technologies to examine how elements like color, light, sound, and spatial layout shape neural responses, eye fixation patterns, and overall emotional experiences within architectural spaces.
Through AI-driven simulations and virtual reality, participants will study how specific architectural features influence occupants’ psychological and physiological reactions. Using eye-tracking technology and biometric feedback, students will gain valuable insights into user interactions and emotional responses to various spatial configurations, developing skills to design spaces that resonate emotionally and enhance human well-being.
This course will explore how architectural experiences can evoke and enhance a range of emotions, including happiness, sadness, fear, surprise, love, excitement, calmness, awe, confusion, trust, pride, relief, compassion, loneliness, and anticipation. Additionally, students will build technical skills in visualizing architectural spaces in VR and assessing emotional responses, preparing them to create impactful, emotion-centered environments.
Theories of emotional architecture and human-centered design.
Exploration of architectural drawings by Lebbeus Woods and Zaha Hadid, examining style and human emotion.
Applications of generative AI in creating adaptive, responsive spatial features
Use Unreal Engine to create virtual reality and immersive media for simulating and testing architectural impact
Techniques in eye-tracking and biometric measurement to assess engagement and emotional response
Analyzing the influence of spatial elements—color, light, sound, material textures—on mood and neural activity
Ethical considerations and implications of designing for emotional impact
Eye-tracking heatmap of CVG airport. Tobii VR.
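The kind of aggregation behind such a fixation heatmap can be sketched briefly. The function name, bin sizes, and sample fixations below are illustrative assumptions rather than the Tobii pipeline itself; real gaze data would come from the headset's SDK.

```python
import numpy as np

def fixation_heatmap(fixations, width, height, bins=(40, 30)):
    """Accumulate (x, y, duration) gaze fixations into a 2D density grid.

    Each fixation adds its dwell time (seconds) to the grid cell it falls
    in; the grid is normalized to [0, 1] for colormapping.
    """
    grid = np.zeros(bins)
    for x, y, duration in fixations:
        i = min(int(x / width * bins[0]), bins[0] - 1)
        j = min(int(y / height * bins[1]), bins[1] - 1)
        grid[i, j] += duration
    peak = grid.max()
    return grid / peak if peak > 0 else grid

# Three hypothetical fixations on a 1920x1080 view: two clustered on the
# left (e.g., signage), one brief glance to the right.
demo = fixation_heatmap(
    [(300, 500, 0.8), (310, 510, 0.4), (1600, 400, 0.2)], 1920, 1080)
print(demo.max())  # 1.0 at the most-dwelled cell
```

Weighting cells by dwell time rather than raw fixation counts is what makes the hot spots in such a heatmap correspond to sustained visual attention, not just passing glances.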
Learning Outcomes: By the end of this course, students will:
Understand the foundational theories of emotional architecture and its relevance in contemporary design.
Apply generative AI tools to create and adapt architectural designs based on emotional and perceptual data.
Conduct virtual reality, eye-tracking, and biometrics experiments to gauge human responses to spatial features.
Critically analyze and discuss the emotional impact of environmental factors in architecture.
Produce a generative AI-based design project incorporating findings on human emotional responses to enhance user experience in virtual and real spaces.
Assessment:
Assessment will be based on a combination of participation in VR-based labs, a midterm report on emotional architecture theory, and a final project integrating generative AI with VR to present an emotionally engaging spatial design prototype.
Course Prerequisites:
This seminar is designed for advanced undergraduate or graduate students in architecture, design, or related fields. Familiarity with basic architectural design concepts and introductory experience in 3D modeling or AI is recommended but not required.
Emotional Architecture in the Digital Age. Ming Tang, 2024-11-03.
Book: Design in the Age of Extended Reality. Ming Tang, 2022-03-16.
The objective of this study is to investigate the safety of roadway workers under varying environmental and work zone conditions. To achieve this objective, a driving-simulator experiment is proposed to evaluate drivers’ visual attention across various work zone scenarios using eye-tracking technology.
Grant.
Using Eye-Tracking to Study the Effectiveness of Visual Communication. UHP Discovery funding, University Honors Program, UC. $5,000. Faculty advisor, 2021.
Adekunle Adebisi (Ph.D. student in the College of Engineering and Applied Science) applied for and received a $3,200 Emerging Fellowship Award from the Academic Advisory Council for Signage Research and Education (AACSRE).
Eye-Tracking for Drivers’ Visual Behavior. Ming Tang, 2021-08-28.