Metaverse: Gen-AI + WebXR

ARCH 7014. Fall 2025. DAAP, UC.

This semester’s VIZ-3 course takes a deep dive into Generative AI-driven Metaverse design, building on last year’s explorations in AI-generated design, digital heritage, virtual reality, and digital twins. Students are working in small teams to conceptualize and build original virtual worlds using generative AI, reality capture, computational modeling, and immersive visualization tools. Although not a studio course, VIZ-3 mirrors a studio-like workflow through a structured sequence of research, ideation, visualization, and validation. Throughout the semester, students investigate how emerging AI and WebXR tools, such as Viverse, reshape the way virtual spaces are imagined, created, and experienced. Final projects are deployed both on the browser-based Viverse platform and through VR headsets supported by the XR-Lab.

“Metaverses are typically fictional, immersive, multi-user environments shaped by social, cultural, and speculative narratives”[1]. The course encourages students to consider the future of AI and the Metaverse while developing flexible, human-centered approaches to the design of virtual environments. Teams tailor their project scopes based on their interests—ranging from digital heritage and gaming environments to architectural, interior, and narrative-driven virtual spaces. The semester begins with an intensive design sprint as part of the Viverse Spark Global University Challenge, where students propose their own Metaverse environments and define their conceptual direction. This early research phase sets the foundation for deeper inquiry into social interaction in virtual environments and the evolving role of AI in shaping next-generation spatial experiences.

Eight metaverses

• AI museum
• Treehouse
• Restaurant
• Underwater world
• The Forgotten Line: Cincinnati Subway
• ManBat Maze
• OTR district
• Four corners

Students: Lubna Shehata, Gustavo Reyes, Sarah Adams, Katie Hammerle, Katie Nefedov, Lindsey Rasberry, Taylor Flanagan, Olivia Corp, Dami Fajemisin, Gabrielle Ragusa, Camryn Lansdell, Liv Adkins, Bekah Selent, Morgan Pascarella, Julia Carter, Madisen Kleinmann, Austin Knupp, Sam Kazmaier, Amith Sai Valsalan, Jaymond Crayton, Sarah Coviello, Lauren Wrinkler, Vincent (Ty) Lucas, Kyra Christie, Journee Adams, Dylan Griffis, Zach Brunotts, Bailey Vianello, Emmanuel Dube, Eian Bennett, David David, Corban McIntosh, Dharma Patel, Erik Mathys, Darren Dong, Christian Drummond, Rian Klein, Sten Shuman.
TA: Semere Araha.
Thanks to Sangyong Cho, Heekyoung Jung, Hank McLendon, Joori Suh, and Henry Levesque for their excellent guest lectures and the inspiring research and project demonstrations!

[1] Tang, Ming, Mikhail Nikolaenko, Ahmad Alrefai, and Aayush Kumar. 2025. “Metaverse and Digital Twins in the Age of AI and Extended Reality.” Architecture 5, no. 2: 36. https://doi.org/10.3390/architecture5020036

ARCH Seminar. Memory of the World

Digital Heritage through VR and Generative AI

ARCH 7036-04 / ARCH 5051-04. Elective Theory Seminar, SAID, DAAP, Spring 2025

Class time: Monday, 10:00 am–12:50 pm. Classroom: 4425-E, CGC Lab, DAAP building.

Faculty: Ming Tang, Professor in Architecture and Interior Design, DAAP. Director of the Extended Reality Lab (XR-Lab).

 

Seminar Description
This seminar invites students to explore the intersection of architecture, history, artificial intelligence, and immersive technology to reimagine the past through digital heritage. Using historic documents and archives from Cincinnati as reference, students will collaborate with historians to construct one Cincinnati street—enabling users to travel back in time and experience its stories through real-time visualization and extended reality (XR).

Throughout the semester, students will investigate how generative AI, virtual reality (VR), and interactive visualization can preserve and reinterpret cultural memory. Using Unreal Engine as the primary platform, participants will design interactive VR environments for mobile headsets, creating spaces where history, atmosphere, and narrative merge into immersive experiences. The seminar aligns with UNESCO’s Memory of the World initiative, emphasizing the preservation of documentary and architectural heritage for future generations.

In addition to a large collaborative group project, each student will conduct a small individual research-based design investigation focused on a “lost” historic artifact—such as a forgotten artwork or piece of street furniture. Through AI-assisted modeling, reality capture, and digital prototyping, students will gain hands-on experience reconstructing the intangible layers of history while developing advanced technical and conceptual skills in digital heritage creation.

Skills covered: Unreal Engine 5, Generative AI tools, Immersive VR development for Meta Quest

Learning Outcomes
By the end of this course, students will be able to:

  • Explain how digital media shapes collective memory, fosters cultural understanding, and expands access to heritage across time and space.
  • Employ generative AI tools to create and refine architectural designs informed by reference data and contextual analysis.
  • Design and integrate high-fidelity virtual reality experiences and audio narratives to enhance spatial storytelling, deployed in the Meta Quest VR headset.
  • Critically analyze and articulate the workflows, challenges, and methodologies involved in digitally reconstructing historic architecture.
  • Develop a generative AI-assisted design project that augments and streamlines the 3D modeling process.

 

By Caroline McCarthy, DAAP, UC. 2024. Unreal Engine. 

Week 1 – Introduction: Digital Heritage + Course Overview

  • Lecture: Reimagining History through XR and AI
  • Discussion: Historic Cincinnati streets
  • Workshop: Overview of XR technologies, digital twins, and VR hardware (Quest 3 setup)
  • Assignment: Group organization and individual tasks

Week 2 – Unreal Engine Foundations

  • Demo: Unreal Engine 5 interface, navigation, and project setup
  • Workshop: Basic environment creation, importing geometry, lighting & material setup
  • Lab: Create a small “street scene” with period references

Week 3 – 3D Assets & Historical References

  • Lecture: Historic Cincinnati
  • Workshop: Importing and optimizing 3D models (AI-generated meshes)
  • Lab: Build the base environment of a Cincinnati street block
  • Assignment: Collect historical images and create a 3D reference board

Week 4 – Generative AI for Architecture

  • Lecture: AI as a Co-Designer in Heritage Reconstruction
  • Demo: Nonbanana + Image/3D AI workflows for architectural texture and concept generation
  • Workshop: AI-to-Unreal pipeline — converting generated images into materials and assets
  • Lab: Generate building facades and signage using AI tools

Week 5 – Advanced Unreal Workflows

  • Lecture: Lighting, Atmosphere, and Period Reconstruction
  • Workshop: Mastering Lighting, Decal, and Quixel Megascans
  • Lab: Create environmental lighting that matches historical mood (day/night, fog, etc.)

Week 6 – Interactivity and Blueprints

  • Lecture: From Scene to Experience – Building Interaction in Unreal
  • Workshop: Unreal Blueprints for triggers, movement, and simple interactions
  • Lab: Create an interactive object or trigger zone (e.g., play sound)

Week 7 – Midterm Critique

  • In-class: Presentation of progress on group and individual projects
  • Peer + Instructor Feedback on historical accuracy, visual quality, and interactivity
  • Workshop: Optimize scenes for Quest VR performance

Week 8 – Audio, Narrative & Spatial Storytelling

  • Lecture: Sound as Memory – Integrating Audio Narratives
  • Workshop: Implement 3D spatial audio and voice narration in Unreal
  • Lab: Add sound cues tied to specific historical events or locations

Week 9 – VR Deployment for Quest

  • Workshop: Unreal project packaging and optimization for Meta Quest 3
  • Lab: Build and test interactive VR scenes on headsets
  • Debug session: Frame rate, lighting, and control issues
  • Milestone: Playable VR prototype
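
As a reference point for the packaging workshop, Quest builds are driven by the project’s Android settings. The sketch below shows the kinds of entries involved in Config/DefaultEngine.ini; the exact key names and supported values vary by Unreal Engine version and by whether the Meta XR plugin is installed, and the package name here is a hypothetical placeholder, so verify everything against Project Settings > Platforms > Android rather than copying this verbatim.

```ini
; Config/DefaultEngine.ini — illustrative Android/Quest packaging settings
; (key names differ across UE versions; confirm in Project Settings > Android)
[/Script/AndroidRuntimeSettings.AndroidRuntimeSettings]
; hypothetical package identifier for the class project
PackageName=com.xrlab.historiccincinnati
MinSDKVersion=29
TargetSDKVersion=32
; Vulkan is the preferred renderer on Quest-class mobile hardware
bSupportsVulkan=True
bPackageDataInsideApk=True
```

In practice, most frame-rate and lighting issues flagged in the debug session trace back to these settings plus scene-side choices (baked versus dynamic lighting, draw-call counts), so it helps to lock the packaging configuration down before the optimization pass.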

Week 10–14 – Working Time

  • Workshop: Optimize lighting and build levels for immersive storytelling
  • Check-in: Instructor reviews individual and group progress
  • Lab: Full-scale integration – environment, audio, and narrative flow

Week 15 – Final Exhibition & Reflection

  • Public or in-class VR showcase: Historic Cincinnati in Virtual Reality
  • Final critique and documentation

Reference

Previous courses taught using Unreal Engine for VR

Publications

Paper: VR Training to De-escalate Patient Aggressive Behavior

Journal Paper: Virtual Reality Training to De-escalate Patient Aggressive Behavior: A Pilot Study

Daraiseh, N. M., Tang, M., Macaluso, M., Aeschbury, M., Bachtel, A., Nikolaenko, M., … Vaughn, A. (2025). Virtual Reality Training to De-escalate Patient Aggressive Behavior: A Pilot Study. International Journal of Human–Computer Interaction, 1–16. https://doi.org/10.1080/10447318.2025.2576635

Abstract
Despite intensive crisis de-escalation training, psychiatric staff continue to face high injury rates from aggressive patient interactions (APIs). New approaches are needed to enhance the application of effective strategies in managing APIs. This study explored the efficacy and feasibility of VR training for psychiatric staff in recognizing and selecting appropriate de-escalation interventions. A quasi-experimental design with psychiatric staff (N = 33) tested the effectiveness and feasibility of VR training depicting a common API interaction. Effectiveness was assessed through pre-post comparisons of the Confidence in Coping with Patient Aggression (CCPA) survey, correct answer percentages, response times, and attempt success rates. Feasibility was indicated by mean scores above ‘neutral’ on usability, presence, and learner satisfaction surveys. Results showed significant improvements in response times and confidence (p<.0001), with over 75% of participants rating the training positively. VR training is effective and feasible for enhancing de-escalation skills, offering a promising approach for psychiatric facilities.

More information on the project: Therapeutic Crisis Intervention Simulation. P1, P2

XR-Lab moved


As the XR-Lab continues to grow and welcome more talented students, I’m thrilled to announce that we’ve officially moved into our new home — Suite 320 in the Digital Futures Building!

We’re deeply grateful for the incredible support from the University of Cincinnati in making this transition possible. This new space will allow us to expand our research, collaboration, and innovation in immersive technologies.

We warmly invite you to stop by, explore our new lab, and experience one of our mixed reality demos in action!



Workshop: MidwestCon

Workshop: Artistic Intelligence Mixer: Embedding and Embodying AI in Everyday Life
MidwestCon 2025. Cincinnati, OH. 9.10.2025

Sponsored by: UC DAAP and FotoFocus

This dynamic, hands-on workshop invited participants to step beyond theory and into living with AI. Through interactive exercises and scenario-based challenges, attendees explored how artificial intelligence could be seamlessly woven into the fabric of daily life—not just as a tool, but as an embedded and embodied presence.

Guided by design thinking methods, we collaboratively imagined responsible, human-centered, and artistically inspired AI-driven products, systems, and experiences. Together, we examined the opportunities and tensions that arise when AI moves off the screen and into the spaces, objects, and interactions that shape our everyday world.

Participants left with new perspectives on what it means to design AI that is creative, ethical, and deeply connected to the human experience.

Speakers
Claudia B. Rebola, PhD, DAAP (College of Design, Architecture, Art, and Planning), Associate Dean for Research and Graduate Programs
Caroline Anderson, DAAP, Assistant Professor of Fine Arts
Isabel Potworowski, DAAP, Assistant Professor of Architecture
Ming Tang, University of Cincinnati, Professor, Director of Extended Reality Lab
Sangyong Cho, Assistant Professor
Heekyoung Jung, Associate Professor