Paper in Architecture Journal

Metaverse and Digital Twins in the Age of AI and Extended Reality

Tang, Ming, Mikhail Nikolaenko, Ahmad Alrefai, and Aayush Kumar. 2025. “Metaverse and Digital Twins in the Age of AI and Extended Reality.” Architecture 5, no. 2: 36. https://doi.org/10.3390/architecture5020036

This paper explores the evolving relationship between Digital Twins (DTs) and the Metaverse, two foundational yet often conflated paradigms in digital architecture. While DTs function as mirrored models of real-world systems, integrating IoT, BIM, and real-time analytics to support decision-making, Metaverses are typically fictional, immersive, multi-user environments shaped by social, cultural, and speculative narratives. Through several research projects, the team investigates the divergence between DTs and Metaverses through the lens of purpose, data structure, immersion, and interactivity, while highlighting areas of convergence driven by emerging technologies in Artificial Intelligence (AI) and Extended Reality (XR). The study examines how AI, XR, and Large Language Models (LLMs) are blurring the traditional boundaries between the two paradigms. By analyzing their divergent purposes, data structures, and interactivity modes, as well as hybrid applications (e.g., data-integrated virtual environments and AI-driven collaboration), it seeks to define the opportunities and challenges this integration poses for architectural design, decision-making, and immersive user experiences.

Our research spans multiple projects that use XR and AI to develop DTs and the Metaverse. The team assesses the capabilities of AI in DT environments, such as reality capture and smart building management, and concurrently evaluates metaverse platforms for online collaboration and architectural education, focusing on features that facilitate multi-user engagement. The paper presents evaluations of several virtual-environment development pipelines, comparing traditional BIM+IoT workflows with novel approaches such as Gaussian Splatting and generative AI for content creation. It further explores the integration of LLMs in both domains, for example as virtual agents or LLM-powered Non-Player Characters (NPCs) that enable autonomous interaction and enhance user engagement within spatial environments. Finally, the paper argues that the once-distinct boundaries between DTs and the Metaverse are becoming increasingly porous. Hybrid digital spaces, such as virtual buildings with data-integrated twins and immersive, social metaverses, demonstrate this convergence. As digital environments mature, architects are uniquely positioned to shape these dual-purpose ecosystems, leveraging AI, XR, and spatial computing to fuse data-driven models with immersive, user-centered experiences.
 
Keywords: metaverse; digital twin; extended reality; AI
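The paper does not reproduce implementation details here, but the LLM-powered NPC concept it describes can be illustrated with a short, hypothetical sketch. Everything below (the OpenAI client, the model name, the persona prompt, and the scene data) is an assumption for illustration, not the paper's implementation:

```python
# Minimal sketch of an LLM-powered NPC for a spatial environment.
# Assumes the OpenAI Python SDK; persona and scene data are invented
# for illustration and are not from the paper.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical scene context a digital-twin backend might supply.
SCENE_STATE = {"room": "Studio 4400", "occupancy": 12, "temp_c": 22.5}

history = [
    {"role": "system",
     "content": ("You are a virtual docent inside a campus digital twin. "
                 f"Current sensor readings: {SCENE_STATE}. "
                 "Answer visitor questions about the space concisely.")}
]

def npc_reply(user_text: str) -> str:
    """Append the visitor's utterance and return the NPC's response."""
    history.append({"role": "user", "content": user_text})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    print(npc_reply("How warm is it in this studio right now?"))
```

Grounding the system prompt in live sensor state is one plausible way a DT backend could keep such an agent's answers tied to real building data rather than pure fiction.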

Papers at XR Conference

Two papers were presented and published at the 2024 International Conference on eXtended Reality (XR SALENTO 2024).

Tang, Ming, Mikhail Nikolaenko, Evv Boerwinkle, Samuel Obafisoye, Aayush Kumar, Mohsen Rezayat, Sven Lehmann, and Tamara Lorenz. “Evaluation of the Effectiveness of Traditional Training vs. Immersive Training: A Case Study of Building Safety & Emergency Training.” Paper presented at the International Conference on eXtended Reality (XR SALENTO 2024), Lecce, Italy, September 4-9, 2024. Published in the Springer conference proceedings.

Virtual Reality (VR) has revolutionized training across healthcare, manufacturing, and service sectors by offering realistic simulations that enhance engagement and knowledge retention. However, assessments that evaluate the effectiveness of VR training are still sparse. We therefore examine VR’s effectiveness in emergency preparedness and building safety, comparing it to traditional training methods. The goal is to evaluate how the unique opportunities VR enables affect skill and knowledge development, using digital replicas of building layouts for immersive training experiences. To that end, the research evaluates VR training’s advantages and develops performance metrics by comparing virtual performance with actions in physical reality, using wearable technology for performance data collection and surveys for insights. Participants, split into VR and online groups, underwent a virtual fire drill to test emergency response skills. Findings indicate that VR training heightens perceived urgency and realism, even though knowledge and skill acquisition were similar to that achieved with more traditional lecture-style training. VR participants reported higher stress and rated the training as more effective, highlighting VR’s immersive benefits. The study supports previous notions of VR’s potential in training while also emphasizing the need for careful consideration of its cognitive load and technological demands.
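The study compares groups using wearable and survey data. As a purely hypothetical illustration of one such comparison (the metric, values, and group sizes below are invented, not the paper's data), a Welch's t-test could contrast evacuation times between the VR and online groups:

```python
# Hypothetical comparison of VR vs. online training groups; the data
# and metric (evacuation time in seconds) are invented for illustration.
from scipy import stats

vr_times = [48.2, 52.1, 45.9, 50.3, 47.7, 55.0]      # VR-trained participants
online_times = [61.5, 58.8, 64.2, 57.1, 60.9, 66.3]  # lecture/online group

# Welch's t-test: does not assume equal variance between groups.
t_stat, p_value = stats.ttest_ind(vr_times, online_times, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.4f}")
```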

 

Tang, M., Nored, J., Anthony, M., Eschmann, J., Williams, J., Dunseath, L. (2024). VR-Based Empathy Experience for Nonprofessional Caregiver Training. In: De Paolis, L.T., Arpaia, P., Sacco, M. (eds) Extended Reality. XR Salento 2024. Lecture Notes in Computer Science, vol 15028. Springer, Cham. https://doi.org/10.1007/978-3-031-71704-8_28 

This paper presents the development of a virtual reality (VR) system designed to simulate various caregiver training scenarios, with the aim of fostering empathy by providing visual and emotional representations of the caregiver’s experience. The COVID-19 pandemic has significantly reduced the availability of qualified home health workers and increased the need for family members to assume caregiving roles, particularly for older adults, who are at high risk of severe complications and death. More than six million people aged 65 and older require long-term care, and two-thirds of these individuals receive all their care exclusively from family caregivers. Many caregivers are unprepared for the physical and emotional demands of caregiving, often exhibiting clinical signs of depression and elevated stress levels.

The VR system, EVRTalk, developed by a multi-institutional team, addresses this gap by providing immersive training experiences. It incorporates theories of empathy and enables caregivers to switch roles with care recipients, navigating common scenarios such as medication management, hallucinations, incontinence, end-of-life conversations, and caregiver burnout. Research demonstrates that VR can enhance empathy, understanding, and communication skills among caregivers. The development process included creating believable virtual characters and interactive scenarios to foster empathy and improve caregiving practices. Initial evaluations using surveys showed positive feedback, indicating that VR training can reduce stress and anxiety for caregivers and improve care quality.

Future steps involve using biofeedback to measure physiological responses and further investigating the ethical implications of VR in caregiving training. The ultimate goal is to deploy VR training in homes, providing family caregivers with the tools and knowledge to manage caregiving responsibilities more effectively, thereby enhancing the quality of life for both caregivers and care recipients.

 

GenAI+AR Siemens

Automatic Scene Creation for Augmented Reality Work Instructions Using Generative AI. Siemens. PI: Ming Tang. Co-PI: Tianyu Jiang. $25,000. UC. 4/1/2024-12/31/2024

Students: Aayush Kumar, Mikhail Nikolaenko, Dylan Hutson.

Sponsor: Siemens through UC MME Industry 4.0/5.0 Institute

Investigates the integration of LLM-based generative AI with HoloLens-based training to automatically generate scene content for augmented reality work instructions; a minimal sketch of the idea follows.
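The sketch below assumes an LLM accessed through the OpenAI SDK; the prompt, JSON schema, model name, and part identifiers are invented for illustration and are not the project's actual pipeline. It shows one way an LLM might be asked to emit a structured scene description that a HoloLens client could parse:

```python
# Hypothetical sketch: ask an LLM to draft an AR work-instruction scene
# as JSON. Schema, prompt, and task are invented for illustration.
import json
from openai import OpenAI

client = OpenAI()

PROMPT = """Produce JSON for an AR work instruction with fields:
steps: list of {text, anchor_part, arrow_direction}.
Task: replace the filter cartridge on pump unit P-101."""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": PROMPT}],
    response_format={"type": "json_object"},  # force parseable JSON
)

scene = json.loads(response.choices[0].message.content)
for i, step in enumerate(scene.get("steps", []), start=1):
    print(i, step["text"])
```

In a deployed pipeline, the parsed steps would be mapped onto tracked anchors in the HoloLens scene rather than printed.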

P&G Metaverse

Title: Leveraging Metaverse Platforms for Enhanced Global Professional Networking – Phase 1

Ming Tang, Principal Investigator

Amount: $32,416

Funding: P&G Digital Accelerator.

Goal: Metaverse technologies for global engagement, human-computer interaction, digital humans, and immersive visualization.

  • P&G Team: Elsie Urdaneta, Sam Azeba, Paula Saldarriaga, JinnyLe HongOanh
  • UC Team: Ming Tang. Students: Nathaniel Brunner, Ahmad Alrefai, Sid Urankar

In Phase 1 of the “Leveraging Metaverse Platforms for Enhanced Global Professional Networking” project, conducted in partnership with P&G, the XR-Lab carried out a pilot study to identify and analyze the top five metaverse platforms best suited for professional networking and conferencing.

Example of XR-Lab in Hyperspace

Paper: SpaceXR at HCI 2024

“SpaceXR: Virtual Reality and Data Mining for Astronomical Visualization” was published in the proceedings of the 26th HCI International Conference, Washington DC, USA, 29 June – 4 July 2024.
Authors: Mikhail Nikolaenko, Ming Tang

 

Abstract

This paper presents the “SpaceXR” project, which integrates data science, astronomy, and Virtual Reality (VR) technology to deliver an immersive and interactive educational tool. It is designed to cater to a diverse audience, including students, academics, space enthusiasts, and professionals, offering an easily accessible platform through VR headsets. The VR application offers a data-driven representation of celestial bodies, including the planets and the Sun within our solar system, guided by data from the NASA and Gaia databases. It empowers users with interactive capabilities encompassing scaling, time manipulation, and object highlighting. Potential applications span from elementary educational contexts, such as teaching the star system in astronomy courses, to advanced astronomical research scenarios, like analyzing spectral data of celestial objects identified by Gaia and NASA. By adhering to emerging software development practices and employing a variety of conceptual frameworks, the project yields a fully immersive, precise, and user-friendly 3D VR application that relies on a real, publicly available database to map celestial objects.
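Although the paper does not publish its code, the data-mining side can be sketched in a few lines. This hypothetical example (the ADQL query, column choices, and filters are assumptions, not the app's actual code) pulls nearby bright stars from the public Gaia archive via astroquery and converts them to Cartesian coordinates that a VR engine could place in a scene:

```python
# Hypothetical sketch: query nearby stars from the Gaia archive and map
# them to 3D positions. Query and scaling are illustrative only.
import numpy as np
from astroquery.gaia import Gaia

# Bright, high-parallax (nearby) stars from Gaia DR3.
job = Gaia.launch_job(
    "SELECT TOP 100 source_id, ra, dec, parallax "
    "FROM gaiadr3.gaia_source "
    "WHERE parallax > 50 AND phot_g_mean_mag < 10"
)
stars = job.get_results()

ra = np.radians(stars["ra"])          # degrees -> radians
dec = np.radians(stars["dec"])
dist_pc = 1000.0 / stars["parallax"]  # parallax (mas) -> distance (parsecs)

# Spherical -> Cartesian, e.g., for placement in a game-engine scene graph.
x = dist_pc * np.cos(dec) * np.cos(ra)
y = dist_pc * np.cos(dec) * np.sin(ra)
z = dist_pc * np.sin(dec)
print(list(zip(stars["source_id"][:5], x[:5], y[:5], z[:5])))
```

A VR front end would then rescale these parsec-scale coordinates into engine units and attach the spectral or photometric attributes users can highlight.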

See the Solar Systems in VR page for more project details.

 

Mikhail Nikolaenko presented the paper at the 26th HCI International Conference, Washington DC, USA, 29 June – 4 July 2024.