UHP Spring 2024

ARCH3051. UHP Spring 2024 course: 

Human-Computer Interaction in the Age of Extended Reality & Metaverse

Ming Tang, Professor, Director of the Extended Reality Lab (XR-Lab), UC Digital Futures.

Institute for Research in Sensing (IRiS); Industry 4.0 & 5.0 Institute (I45I)
School of Architecture and Interior Design, College of Design Architecture Art and Planning
Class Time: T/Th 12:30–1:50 pm, Spring Semester 2024.

Location: 4425 E CGC Lab @ DAAP

Note: This course is open to undergraduate students in the UC Honors Program (UHP).

Prerequisite: None. No prior skills are required for this introductory-level course on XR.

Course Description:

This seminar course focuses on real-world problem solving using extended reality applications, with an in-depth investigation of the intersection of human-computer interaction (HCI) and immersive visualization technologies, including Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), the Metaverse, and Digital Twins. The class will explore new immersive experiences in the virtual realm and analyze human perception through various sensory inputs. Students will learn both the theoretical frameworks and the hands-on skills of XR development. The course will give students exposure to cutting-edge AR/VR technologies at the Digital Futures building. Students are encouraged to propose individual or group research on the future of HCI with XR.

Course Goals:

Students completing this course will be able to:

  • Identify real-world problems that might be solved with XR technology.
  • Identify the opportunities and challenges of building an inclusive metaverse accessible to everyone.
  • Identify theories and research methods related to XR and the Metaverse.

Students will join a series of lectures hosted at the XR-Lab in the UC Digital Futures building:

  • History and theory related to XR (VR+AR+MR)
  • Emerging trends related to the Metaverse and its impact on society
  • Theories related to Simulation & Simulacrum
  • Digital Human: perception and spatial experience
  • Biometric data and sensory technologies

In addition to theory, students will gain hands-on skills and demonstrate XR development techniques in one of two tracks:

  • VR development or
  • AR development

Required Text(s): none

Recommended Text(s): various online lectures and readings

Students are strongly encouraged to look at Metaverse-related products and publications. As students develop their own research, these resources become invaluable sources of inspiration and motivation. The instructor supports all attempts by students to experiment with techniques that interest and challenge them, above and beyond what is taught in the classroom.

Required Materials: 

  • Access to a computer at CGC Lab at DAAP.
  • VR headset (Optional, provided by XR-Lab)

Software: Unreal Engine 5, available at the DAAP CGC Lab.

Weekly schedule:

  1. History and the workflow of metaverse development 
  2. Project management with GitHub (inspiration from the film due) 
  3. Modeling (XR vignette proposal due)
  4. Material & Lighting (project development starts) 
  5. Blueprint, node-based programming 
  6. Character and Animation
  7. Interaction and UX; deployment to XR
  8. Project development time
  9. Project development time
  10. Project development time
  11. Project development time
  12. Project development time
  13. Project development time
  14. Project development time
  15. Project development time
  16. End-of-semester gallery show at DF building
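Week 2 introduces project management with GitHub. The local git workflow behind it can be previewed with a minimal sketch; the repository, branch, and file names below are hypothetical, not part of the course materials:

```shell
cd "$(mktemp -d)"                          # scratch directory for the demo
git init                                   # create an empty repository
git config user.name "Student"             # local identity for this sketch
git config user.email "student@example.com"

echo "# XR Vignette" > README.md           # first project file
git add README.md
git commit -m "Initial commit"

git checkout -b feature/lighting           # develop on a feature branch
echo "lighting notes" > lighting.md
git add lighting.md
git commit -m "Add lighting notes"

git checkout -                             # return to the default branch
git merge feature/lighting                 # fast-forward merge of the feature
```

Publishing to GitHub from there only requires adding a remote (`git remote add origin <URL>`) and pushing (`git push -u origin HEAD`).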

Examples of student projects on XR development

Digital Human

Facial motion animation experiments at the XR-Lab. The digital human project explores high-fidelity digital human modeling and animation powered by ChatGPT and OpenAI. 



XR-Lab is working with the Cincinnati/Northern Kentucky International Airport (CVG), in collaboration with the UC Center for Simulations & Virtual Environments Research, to

  1. Develop and demonstrate a wayfinding hologram.
  2. Evaluate the hologram signage’s performance to augment passengers’ wayfinding experience.
  3. Develop concepts for the Concourse-B store renovation, integrating emerging digital technologies related to Extended Reality.
  4. Develop a digital twin model of the CVG Concourse-B store area.

The project will apply various methods, including eye-tracking, motion capture, motion tracking, and computer vision.

Project Client: Josh Edwards, Sr. Manager, Innovation, Cincinnati/Northern Kentucky International Airport

UC Team:

  • eXtended Reality Lab: Ming Tang, Director, eXtended Reality Lab, Digital Futures, tangmg@ucmail.uc.edu
  • UCSIM Project Lead: Chris M. Collins, Director, Center for Simulations & Virtual Environments Research
  • ARCH 7014 students, Fall 2023

Thanks for the support from the UHP Discovery Summer program. 

Check out more wayfinding research projects and publications at the XR-Lab. 

Wayfinding through VR

A VR walkthrough is used for wayfinding research. Players’ routes and walking behaviors, such as head movement, are captured and evaluated.

Credit: restaurant designed by Eian Bennett.
More info on wayfinding and egress at the simulated DAAP building can be found here.


Cloud-based Digital Twin

Clients are one click away from interacting with a Digital Twin model on their personal devices. No installation is required.

Demonstration of XR-Lab’s project on a cloud-based Digital Twin (DT), accessible through mobile devices. Multiple users can interact with a complicated DT model through touch screens. Clients do not need to install apps and can access the content with a simple URL and a web browser on personal iOS or Android mobile devices and tablets. Photorealistic renderings are streamed to clients at a high frame rate. The DT model also integrates Building Information Modeling (BIM) data, metadata, IoT sensor data, 360° images/video, and Web3D content.

More information on Future of Work: Understanding the interrelationships between humans and technology to improve the quality of work-life in smart buildings.