Posts

Paper on AI, XR, Metaverse, and Digital Twins

Metaverse and Digital Twins in the Age of AI and Extended Reality

Tang, Ming, Mikhail Nikolaenko, Ahmad Alrefai, and Aayush Kumar. 2025. “Metaverse and Digital Twins in the Age of AI and Extended Reality.” Architecture 5, no. 2: 36. https://doi.org/10.3390/architecture5020036

 

This paper explores the evolving relationship between Digital Twins (DTs) and the Metaverse, two foundational yet often conflated paradigms in digital architecture. While DTs function as mirrored models of real-world systems—integrating IoT, BIM, and real-time analytics to support decision-making—Metaverses are typically fictional, immersive, multi-user environments shaped by social, cultural, and speculative narratives. Through several research projects, the team investigates the divergence between DTs and Metaverses through the lens of purpose, data structure, immersion, and interactivity, while highlighting areas of convergence driven by emerging technologies in Artificial Intelligence (AI) and Extended Reality (XR).

This study examines how emerging technologies—such as AI, XR, and Large Language Models (LLMs)—are blurring the traditional boundaries between DTs and the Metaverse in digital architecture. By analyzing their divergent purposes, data structures, and modes of interactivity, as well as hybrid applications (e.g., data-integrated virtual environments and AI-driven collaboration), the study seeks to define the opportunities and challenges this integration poses for architectural design, decision-making, and immersive user experiences. Our research spans multiple projects that use XR and AI to develop DTs and the Metaverse. The team assesses the capabilities of AI in DT environments, such as reality capture and smart building management, and concurrently evaluates metaverse platforms for online collaboration and architectural education, focusing on features that facilitate multi-user engagement. The paper presents evaluations of several virtual environment development pipelines, comparing traditional BIM+IoT workflows with novel approaches such as Gaussian Splatting and generative AI for content creation.

The team further explores the integration of LLMs in both domains—for example, virtual agents and LLM-powered Non-Player Characters (NPCs) that enable autonomous interaction and enhance user engagement within spatial environments. Finally, the paper argues that the once-distinct boundaries between DTs and the Metaverse are becoming increasingly porous. Hybrid digital spaces—such as virtual buildings with data-integrated twins and immersive, social metaverses—demonstrate this convergence. As digital environments mature, architects are uniquely positioned to shape these dual-purpose ecosystems, leveraging AI, XR, and spatial computing to fuse data-driven models with immersive, user-centered experiences.
 
Keywords: metaverse; digital twin; extended reality; AI
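As a rough illustration of the LLM-powered NPCs discussed in the paper, the sketch below shows one minimal way an NPC might ground a language model's replies in its spatial context and short conversation history. All names here (`SpatialNPC`, `fake_llm`) are hypothetical; in practice the `llm` callable would wrap a real chat-completion endpoint.

```python
from dataclasses import dataclass, field

@dataclass
class SpatialNPC:
    """A minimal NPC that grounds LLM prompts in persona and location."""
    name: str
    role: str
    location: str
    memory: list = field(default_factory=list)  # short-term dialogue history

    def build_prompt(self, user_utterance: str) -> str:
        # Combine persona, spatial context, and the last few turns
        # into a single prompt for the language model.
        history = "\n".join(self.memory[-4:])
        return (
            f"You are {self.name}, a {self.role} located in {self.location}.\n"
            f"Recent conversation:\n{history}\n"
            f"Visitor: {user_utterance}\n"
            f"{self.name}:"
        )

    def respond(self, user_utterance: str, llm) -> str:
        # llm is any callable mapping a prompt string to a reply string.
        reply = llm(self.build_prompt(user_utterance))
        self.memory.append(f"Visitor: {user_utterance}")
        self.memory.append(f"{self.name}: {reply}")
        return reply

# Stand-in for a hosted LLM API, used here so the sketch runs offline.
def fake_llm(prompt: str) -> str:
    return "Welcome! The gallery you are looking for is on the second floor."

npc = SpatialNPC(name="Ada", role="virtual docent",
                 location="the atrium of a campus digital twin")
print(npc.respond("Where is the student gallery?", fake_llm))
```

Because the NPC only needs a prompt-to-text callable, the same loop works whether the backing model is local or a remote service, and the spatial context string can be regenerated as the avatar moves through the environment.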

XR and Gen-AI Technologies in Design

 

  

Left: VR training on welding, Samantha Frickel.  Right: Cinematic universes. Carson Edwards

Extended Reality and Generative-AI in Human-Centered Design

UHP + Architecture Seminar

Student work from the University of Cincinnati’s Honors Seminar and Architecture Design Seminar. This video showcases multiple innovative projects that intersect emerging technologies such as AIGC and XR with human-centered design. The projects include a wide range of demonstrations in the following two categories:

Training
The first category centers on Virtual Reality-based training applications designed to simulate real-world tasks and enhance learning through immersive experiences. These projects include simulations for welding, firefighter robotics, and driving, as well as instructional environments such as baby car seat installation. Each scenario provides a controlled, repeatable setting for learners to gain confidence and skills in safety-critical and technical domains, demonstrating the practical potential of XR technologies in professional training and education. Digital 3D content creation was augmented by AIGC tools such as Rodin, Meshy, and Tripo.

Future Environment
This group of projects explores imaginative and speculative environments through immersive technologies. Students and researchers have developed experiences ranging from fictional music spaces, virtual zoos, and animal shelters to emotionally responsive architectural designs and future cityscapes. These environments often incorporate interactive elements, such as Augmented Reality on mobile devices or real-time simulations of natural phenomena like flooding. Advanced material simulation is also a focus, including cloth and other soft fabrics that respond dynamically to user interaction. 2D content creation was augmented by AIGC tools such as Midjourney and Stable Diffusion.

These two interdisciplinary seminars investigate the application of Extended Reality (XR) technologies, including Virtual Reality (VR) and Augmented Reality (AR), to real-world challenges. Students examined how to integrate human-computer interaction with immersive digital tools to create embodied, interactive experiences that enhance user engagement and understanding.

In parallel, the courses explored a comprehensive design methodology—spanning research, ideation, prototyping, and evaluation—framed through the lens of generative AI and immersive virtual environments. Projects emphasized the role of AI-assisted content creation and immersive media in advancing human-centered design practices within either a fictional metaverse or a reality-based digital twin.

The student work presented reflects a research-driven approach to spatial design, focusing on how digital scenarios influence human perception, emotional response, and cognitive engagement. XR was explored as a medium for fostering empathy, delivering emotional impact, and enhancing the acquisition of knowledge and skills.

Credit: UHP students: Amanda Elizabeth, Logan Daugherty, Valerie Dreith, Samantha Frickel, Aakash Jeyakanthan, Aayush Kumar, TJ Mueller, Rohit Ramesh, Megan Sheth, Ayush Verma; Architecture students: Brady Bolton, Erik Mathys, Keai Perdue, Gustavo Reyes, Maria Vincenti, Nikunj Deshpande, Carson Edwards, Bhaskar Kalita, Sreya Killamshetty, Japneet Kour, Gaurang Pawar, Shruthi Sundararajan.

 


Student Projects Gallery Show in 2024

NCBDS conference

Paper “Designing the Future of Retail: Cross-Disciplinary Collaboration in Industrial Design and Architecture Design” published in the proceedings of the 40th National Conference on the Beginning Design Student (NCBDS). North Carolina State University, Raleigh, NC. 2025.

Yong-Gyun Ghim, Ming Tang, University of Cincinnati

 

Abstract

The significance of design’s cross-disciplinary nature has increased alongside technological advancements as emerging technologies present new opportunities and challenges for complex socio-technical systems. Systems thinking has drawn attention to design as a holistic approach to tackling complex systems by examining the interrelationships between elements. This also necessitates cross-disciplinary collaboration to address the multifaceted nature of the problems comprehensively. These aspects of systems thinking further emphasize its importance in design education to help navigate the current era of technological innovation. The future of retail exemplifies this interconnected complexity in the context of emerging technologies because introducing them – such as robotics, artificial intelligence, and mixed reality – into retail environments requires a holistic consideration of the entire system, encompassing physical spaces, service processes, and human interactions.

This study examines a 15-week collaborative studio project between industrial design and architecture. By leveraging a systems thinking approach, the project facilitated cross-disciplinary collaboration to develop future retail concepts, enabling students to integrate their expertise and address the interconnectedness of artifacts, environments, and human interactions. Both disciplines followed a structured design process encompassing research, system design, space and robot design, visualization, and validation, while collaboration was organized around four key steps: planning, learning, prototyping, and communication. The project also involved collaboration with a supermarket chain, providing opportunities for onsite observations, employee interviews, and discussions with industry professionals. Students developed futuristic concepts for retail operations and customer experiences by leveraging the integration of mobile service robots, adaptive spaces, and mixed reality. Industrial design students focused on designing a product-service system of supermarket robots based on their redefinition of customer shopping experience and employee workflow, proposing an automated grocery order fulfillment system. Architecture students designed adaptive retail spaces that seamlessly blur the boundaries between physical and digital worlds, exploring how the Metaverse and mixed-reality interfaces can augment retail spaces and shopping experiences through dynamic, immersive interactions with digital avatars and robots. This cross-disciplinary collaboration resulted in holistic and integrative solutions for complex systems, presented through immersive VR experiences or animated scenarios.

This study’s contribution to design education is threefold. First, it proposes a systems thinking approach with cross-disciplinary collaboration for designing future retail experiences, demonstrating its effectiveness in addressing and designing complex socio-technical systems. Second, it offers insights into how industrial design and architecture can be integrated to create novel user experiences in digital transformation. Lastly, by examining the design and collaboration processes and reflecting on the opportunities and challenges, this study offers insights for its application to future studio courses. Given the increased complexity and dynamics between disciplines, thorough pre-planning and flexibility are critical for success.

Keywords:

Cross-disciplinary collaboration, Design education, Industrial design, Architecture, Future of retail

Project:  Future Service, Retail, Metaverse, and Robotics

 

GenAI+AR Siemens

Automatic Scene Creation for Augmented Reality Work Instructions Using Generative AI. Siemens. PI: Ming Tang; Co-PI: Tianyu Jiang. $25,000. UC. 4/1/2024–12/31/2024.

Students: Aayush Kumar, Mikhail Nikolaenko, Dylan Hutson.

Sponsor: Siemens through UC MME Industry 4.0/5.0 Institute

This project investigates the integration of LLM-based generative AI with HoloLens-based training.
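One plausible shape for such a pipeline is to prompt the LLM to emit work instructions as structured JSON, then validate and order the steps before handing them to the AR runtime. The sketch below illustrates only the parsing/validation stage; the schema, the `llm_response` text, and the anchor names are illustrative assumptions, not the project's actual format.

```python
import json

# Hypothetical schema: the LLM is asked to return a task as JSON, where
# each step names a scene anchor (a tracked point in the HoloLens scene)
# and the instruction text to display there. Sample response for testing:
llm_response = """
{
  "task": "Replace the filter cartridge",
  "steps": [
    {"order": 2, "anchor": "cartridge_slot",
     "instruction": "Pull the old cartridge straight out."},
    {"order": 1, "anchor": "housing_cap",
     "instruction": "Unscrew the housing cap counterclockwise."},
    {"order": 3, "anchor": "cartridge_slot",
     "instruction": "Insert the new cartridge until it clicks."}
  ]
}
"""

def parse_work_instructions(raw: str) -> list:
    """Validate LLM output and return AR instruction steps in order."""
    data = json.loads(raw)
    steps = sorted(data["steps"], key=lambda s: s["order"])
    for step in steps:
        # Each step must name an anchor so the AR runtime knows where
        # to place the label or animation in the physical scene.
        if not step.get("anchor") or not step.get("instruction"):
            raise ValueError(f"Malformed step: {step}")
    return steps

for step in parse_work_instructions(llm_response):
    print(f'{step["order"]}. [{step["anchor"]}] {step["instruction"]}')
```

Validating the model's output before scene placement matters because generative output is not guaranteed to be well-formed; a schema check like this lets the application re-prompt rather than render a broken instruction overlay.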