Paper published in the journal Architecture

Metaverse and Digital Twins in the Age of AI and Extended Reality

Tang, Ming, Mikhail Nikolaenko, Ahmad Alrefai, and Aayush Kumar. 2025. “Metaverse and Digital Twins in the Age of AI and Extended Reality.” Architecture 5, no. 2: 36. https://doi.org/10.3390/architecture5020036

This paper explores the evolving relationship between Digital Twins (DTs) and the Metaverse, two foundational yet often conflated paradigms in digital architecture. While DTs function as mirrored models of real-world systems, integrating IoT, BIM, and real-time analytics to support decision-making, Metaverses are typically fictional, immersive, multi-user environments shaped by social, cultural, and speculative narratives. Through several research projects, the team investigates the divergence between DTs and Metaverses through the lens of purpose, data structure, immersion, and interactivity, while highlighting areas of convergence driven by emerging technologies in Artificial Intelligence (AI) and Extended Reality (XR). The study examines how AI, XR, and Large Language Models (LLMs) are blurring the traditional boundaries between DTs and the Metaverse in digital architecture. By analyzing their divergent purposes, data structures, and interactivity modes, as well as hybrid applications (e.g., data-integrated virtual environments and AI-driven collaboration), the study defines the opportunities and challenges of this integration for architectural design, decision-making, and immersive user experiences. Our research spans multiple projects that use XR and AI to develop DTs and Metaverses. The team assesses the capabilities of AI in DT environments, such as reality capture and smart building management, and evaluates metaverse platforms for online collaboration and architectural education, focusing on features that facilitate multi-user engagement. The paper presents evaluations of various virtual environment development pipelines, comparing traditional BIM+IoT workflows with novel approaches such as Gaussian Splatting and generative AI for content creation. The team further explores the integration of LLMs in both domains, such as virtual agents and LLM-powered Non-Player Characters (NPCs), enabling autonomous interaction and enhancing user engagement within spatial environments. Finally, the paper argues that the once-distinct boundaries between DTs and the Metaverse are becoming increasingly porous. Hybrid digital spaces, such as virtual buildings with data-integrated twins and immersive, social metaverses, demonstrate this convergence. As digital environments mature, architects are uniquely positioned to shape these dual-purpose ecosystems, leveraging AI, XR, and spatial computing to fuse data-driven models with immersive, user-centered experiences.
 
Keywords:  metaverse; digital twin; extended reality; AI
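The abstract mentions LLM-powered NPCs that enable autonomous interaction within spatial environments, but the paper's implementation is not reproduced in this post. The following is a minimal, hypothetical sketch of that pattern, assuming a placeholder call_llm() function standing in for any chat-completion backend and an invented VirtualSpace scene description; it only illustrates how an NPC's dialogue can be grounded in spatial and twin data, not the authors' actual pipeline.

# Hypothetical sketch: an LLM-driven NPC grounded in a virtual scene.
# call_llm is a placeholder for whatever chat-completion backend is used.

from dataclasses import dataclass, field

def call_llm(messages: list[dict]) -> str:
    """Placeholder for a chat-completion call (hosted or local LLM)."""
    raise NotImplementedError("Plug in your LLM client here.")

@dataclass
class VirtualSpace:
    name: str
    description: str                                 # what the visitor can see
    live_data: dict = field(default_factory=dict)    # e.g., sensor readings from a twin

@dataclass
class NPC:
    persona: str
    space: VirtualSpace
    history: list = field(default_factory=list)

    def system_prompt(self) -> str:
        # Ground the agent in the surrounding scene and any live twin data.
        return (
            f"You are {self.persona}, a guide inside '{self.space.name}'. "
            f"Scene: {self.space.description}. "
            f"Live data: {self.space.live_data}. "
            "Answer visitors briefly and refer to what is around them."
        )

    def respond(self, visitor_utterance: str) -> str:
        # Keep a running dialogue history so the NPC stays in context.
        self.history.append({"role": "user", "content": visitor_utterance})
        messages = [{"role": "system", "content": self.system_prompt()}] + self.history
        reply = call_llm(messages)
        self.history.append({"role": "assistant", "content": reply})
        return reply

# Example usage (names and readings are illustrative only):
# lobby = VirtualSpace("DAAP Lobby Twin", "a two-story atrium with a gallery wall",
#                      {"occupancy": 14, "temperature_c": 21.5})
# guide = NPC("a friendly docent", lobby)
# print(guide.respond("What is this space used for?"))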

AE studio 2025

Spring 2025. Instructors: Ming Tang, Samira Sarabandikachyani

Final Gallery Show. Room 7270, DAAP, 04/24. 2:30-4:30 pm. 

This studio provided a comprehensive introduction to architectural design. Students developed proposals for a restaurant in Over-the-Rhine (OTR), Cincinnati, Ohio.

Infinite Loop

Exhibition “Views of Cincinnati & Ohio Valley”

Location: Elevar Gallery, 555 Carr St., Cincinnati.
Opening hours: May 1 to June 27, 2025; Mon-Thu 9 am-5 pm, Fri 9 am-6:30 pm

The exhibition is supported by the Creative Asian Society and the ArtsWave Impact grant.

“Infinite Loop” reflects my interpretation of the Ohio Valley—an ever-shifting landscape shaped by both deep geological time and layers of human history. Inspired by the region’s porous limestone caves, exposed rock formations, and the powerful erosive force of the Ohio River, the sculpture evokes the continuous movement and natural evolution embedded in this terrain. The form, looping without a clear beginning or end, draws from the valley’s complex strata—both literal and metaphorical. It echoes the industrial legacy of Cincinnati: a city built along railroads, powered by migration, and continually transformed by waves of innovation, creativity, and technology. Each undulating surface captures a sense of motion and continuity, speaking to the rhythms of the river and the resilience of a city in flux. By blending references to natural erosion, flood, and industrial infrastructure, Infinite Loop invites reflection on how we shape—and are shaped by—the landscapes we inhabit. It is a meditation on flow, transformation, and the unbroken cycles that define both the Ohio River and the city of Cincinnati itself.

360° video


Project demonstrations: XR and Gen-AI Technologies in Design

 

  

Left: VR training on welding, by Samantha Frickel. Right: Cinematic universes, by Carson Edwards.

Extended Reality and Generative AI in Human-Centered Design

UHP + Architecture Seminar

Date: Monday, April 28, 2025
Time: 1:00 – 3:00 PM
Location: Second Floor Lobby, UC Digital Futures Building

You are cordially invited to the final presentations showcasing exemplary student work from the University of Cincinnati’s Honors Seminar and Architecture Design Seminar. These curated demonstrations feature innovative projects at the intersection of emerging technologies and human-centered design. The projects include a wide range of demonstrations in the following two categories:

Future Environment
This group of projects explores imaginative and speculative environments through immersive technologies. Students and researchers have developed experiences ranging from fictional music spaces, virtual zoos, and animal shelters to emotionally responsive architectural designs and future cityscapes. These environments often incorporate interactive elements, such as Augmented Reality on mobile devices or real-time simulations of natural phenomena like flooding. Advanced material simulation is also a focus, including simulating cloth and other soft fabrics that respond dynamically to user interaction.

Training
The second category centers on Virtual Reality-based training applications designed to simulate real-world tasks and enhance learning through immersive experiences. These projects include simulations for welding, firefighter robotics, and driving, as well as instructional environments such as baby car seat installation. Each scenario provides a controlled, repeatable setting for learners to gain confidence and skills in safety-critical and technical domains, demonstrating the practical potential of XR technologies in professional training and education.

These two interdisciplinary seminars investigated the application of Extended Reality (XR) technologies, including Virtual Reality (VR) and Augmented Reality (AR), to real-world challenges. Students examined how human-computer interaction can be integrated with immersive digital tools to create embodied, interactive experiences that enhance user engagement and understanding.

In parallel, the courses explored comprehensive design methodology—spanning research, ideation, prototyping, and evaluation—framed through the lens of generative AI and immersive virtual environments. Projects emphasized the role of AI-assisted content creation and immersive media in advancing human-centered design practices with either a fictional metaverse or reality-based digital twins. 

The student work presented reflects a research-driven approach to spatial design, focusing on how digital scenarios influence human perception, emotional response, and cognitive engagement. XR was explored as a medium for fostering empathy, delivering emotional impact, and enhancing the acquisition of knowledge and skills.

We invite you to engage in these forward-thinking investigations and celebrate the creativity and scholarly inquiry of our students.

 


Student Projects Gallery Show in 2024

SMART-DT

SMART-DT: Scalable Multi-Agent Reinforcement Learning and Collaborative AI for Digital Twin Platform of Infrastructure and Facility Operations.

Principal Investigators:

  • Prof. Sam Anand, Department of Mechanical Engineering, CEAS
  • Prof. Ming Tang, Extended Reality Lab, Digital Futures, DAAP

Grant: $40,000. UC Industry 4.0/5.0 Institute Consortium Research Project: 03.2025-01.2026

In this phase, we will develop a scalable digital twin that integrates machine, factory, and city-level data with AI-driven real-time decision-making. The key questions we aim to answer are:

  • Can a high-fidelity Digital Twin (DT) be efficiently built using only image and video data?
  • How can multiple specialized Large Language Model (LLM) agents—at machine, factory, and city levels—collaborate to generate relevant insights?
  • How effective is synthetic data from a Digital Twin for object detection and process recognition?
  • Does combining traditional Machine Learning (ML) with Large Language Models (LLMs) improve decision-making in complex manufacturing operations?

The project’s primary goal is to create a scalable, cloud-based digital twin that enhances operational efficiency through AI-driven insights. Additional technical objectives include:

  • Using advanced reality capture techniques (e.g., Gaussian Splatting) to build a Digital Twin from images and videos and simulate fault scenarios at factory and data center levels.
  • Integrating an IIoT data framework to track material flow, process handling, operational metrics, and equipment status for seamless cloud-based analysis.
  • Developing a synthetic data capture system using a simulated drone within the Digital Twin to train reinforcement learning models for fault prediction.
  • Designing a multi-agent AI system combining LLMs, machine learning, and reinforcement learning to enable dynamic communication, prediction, and diagnostics in the factory (a minimal illustrative sketch follows this list).
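The multi-agent design in the last objective is still at the proposal stage; the sketch below shows one plausible shape for it, assuming a placeholder ask_llm() call and invented machine-, factory-, and city-level agents. It only illustrates how level-specific agents might each summarize their own telemetry and hand findings to a coordinator; the actual SMART-DT architecture may differ.

# Hypothetical sketch of level-specific LLM agents feeding a coordinator.
# ask_llm is a stand-in for any LLM backend; telemetry values are made up.

def ask_llm(prompt: str) -> str:
    """Placeholder for an LLM call shared by all agents."""
    raise NotImplementedError("Connect an LLM client here.")

class LevelAgent:
    """Agent responsible for one level of the digital twin (machine/factory/city)."""

    def __init__(self, level: str, role_hint: str):
        self.level = level
        self.role_hint = role_hint

    def analyze(self, telemetry: dict) -> str:
        # Each agent sees only its own level's data and reports findings.
        prompt = (
            f"You monitor the {self.level} level of a digital twin. {self.role_hint}\n"
            f"Telemetry: {telemetry}\n"
            "Report anomalies and one recommended action in two sentences."
        )
        return ask_llm(prompt)

class Coordinator:
    """Merges per-level findings into a single operational summary."""

    def __init__(self, agents: list):
        self.agents = agents

    def report(self, telemetry_by_level: dict) -> str:
        findings = [
            f"[{a.level}] {a.analyze(telemetry_by_level.get(a.level, {}))}"
            for a in self.agents
        ]
        prompt = (
            "Combine these level reports into a prioritized summary for the operator:\n"
            + "\n".join(findings)
        )
        return ask_llm(prompt)

# Example usage with made-up readings:
# agents = [
#     LevelAgent("machine", "Focus on spindle vibration and temperature."),
#     LevelAgent("factory", "Focus on throughput and material flow."),
#     LevelAgent("city", "Focus on energy demand and grid constraints."),
# ]
# print(Coordinator(agents).report({
#     "machine": {"vibration_mm_s": 7.2, "temp_c": 81},
#     "factory": {"throughput_per_hr": 112, "queue_len": 9},
#     "city": {"grid_load_pct": 93},
# }))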

 

Last year’s project: IIoT for legacy and intelligent factory machines with XR and LLM feedback, along with a Digital Twin demonstration of real-time IoT for architecture/building applications using Omniverse.