Real-time Visualization & Virtual Reality & Augmented Reality
Explores interactive virtual reality (VR) and augmented reality (AR) systems and real-time rendering for architectural visualization, human-computer interaction, spatial behavior, and wayfinding studies.

UHP Spring 2024

ARCH3051. UHP Spring 2024 course: 

Human-Computer Interaction in the Age of Extended Reality & Metaverse

Ming Tang. Professor, Director of Extended Reality Lab (XR-Lab). UC Digital Future.

Institute for Research in Sensing (IRiS); Industry 4.0 & 5.0 Institute (I45I)
School of Architecture and Interior Design, College of Design, Architecture, Art, and Planning
Class Time: T, TH 12:30pm – 1:50pm. Spring Semester 2024. 

Location: 4425 E CGC Lab @ DAAP

Note: This course is open to UG students in the UC Honors Program

Prerequisite: None. No prior skills are required for this introductory-level course on XR.

Course Description:

This seminar course focuses on real-world problem solving using extended reality applications, with an in-depth investigation of the intersection of human-computer interaction (HCI) with immersive visualization technologies, including Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), the Metaverse, and Digital Twins. The class will explore new immersive experiences in the virtual realm and analyze human perception through various sensory inputs. Students will learn both the theoretical framework and hands-on skills of XR development. The course will give students exposure to cutting-edge AR/VR technologies at the Digital Future building. Students are encouraged to propose individual or group research on the subject of future HCI with XR.

Course Goals:

Students completing this course will be able to

  • Identify real-world problems that might be solved through XR technology.
  • Identify the opportunities and challenges of building an inclusive metaverse accessible to everyone.
  • Identify theories and research methods related to XR and Metaverse.

Students will attend a series of lectures hosted at the XR-Lab in the UC Digital Future building:

  • History and theory related to XR (VR+AR+MR)
  • Emerging trends related to the Metaverse and its impact on society
  • Theories related to Simulation & Simulacrum
  • Digital Human: perception and spatial experience
  • Biometric data and sensory technologies

Beyond the theory, students will gain hands-on skills and demonstrate XR development techniques in one of two tracks:

  • VR development or
  • AR development

Required Text(s): none

Recommended Text(s): various online lectures and readings

Students are strongly encouraged to look at Metaverse-related products and publications. As students develop their own research, these resources become invaluable in terms of inspiration and motivation. The instructor supports all attempts by students to experiment with techniques that interest and challenge them above and beyond what is taught in the classroom.

Required Materials: 

  • Access to a computer at CGC Lab at DAAP.
  • VR headset (Optional, provided by XR-Lab)

Software: Unreal Engine 5, available at the DAAP CGC Lab.

Weekly schedule:

  1. History and the workflow of metaverse development 
  2. Project management with GitHub (inspiration-from-film assignment due) 
  3. Modeling (XR vignette proposal due)
  4. Material & Lighting (project development starts) 
  5. Blueprint, node-based programming 
  6. Character and Animation
  7. Interaction and UX; deployment to XR
  8. Project development time
  9. Project development time
  10. Project development time
  11. Project development time
  12. Project development time
  13. Project development time
  14. Project development time
  15. Project development time
  16. End-of-semester gallery show at DF building
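Week 5 introduces Blueprint, Unreal's node-based visual scripting system. As a conceptual illustration only (this is plain Python, not Unreal code), node-based programming can be sketched as a graph of nodes whose inputs are wired to other nodes' outputs, with evaluation walking the graph:

```python
# Conceptual sketch of node-based programming (as in Unreal's Blueprint):
# each node computes a value from the outputs of its upstream input nodes.
class Node:
    def __init__(self, op, *inputs):
        self.op = op          # function applied to the resolved input values
        self.inputs = inputs  # upstream Node objects wired into this node

    def evaluate(self):
        # Resolve upstream nodes first, then apply this node's operation.
        return self.op(*(n.evaluate() for n in self.inputs))

# Wire a tiny graph computing (2 + 3) * 4.
const = lambda v: Node(lambda: v)
add = Node(lambda a, b: a + b, const(2), const(3))
mul = Node(lambda a, b: a * b, add, const(4))
print(mul.evaluate())  # 20
```

In Blueprint the same idea is expressed graphically: nodes are dragged onto a canvas and wires replace the constructor arguments shown here.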

Examples of student projects on XR development

“Reality is just the beginning.”

Cloud-based Digital Twin

Clients are one click away from interacting with a Digital Twin model on their personal devices. No installation is required.

The XR-Lab’s project showcases a cloud-based Digital Twin (DT) model, designed for accessibility and interaction via mobile devices. This advanced DT model allows multiple users to engage with its complex features directly through touch screens, eliminating the need for app installations. Clients can effortlessly access the content using a simple URL in a web browser on their personal iOS or Android mobile devices and tablets. The project is distinguished by its photorealistic renderings, which are streamed to clients at high frame rates, ensuring a visually rich and seamless experience. Furthermore, our DT model is an integration of various cutting-edge technologies, including Building Information Modeling (BIM), Metadata, IoT sensor data, 360-degree images/videos, and web3D content, creating a comprehensive and interactive digital environment.
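The integration described above combines static building data (BIM, metadata) with live IoT sensor readings into a single interactive view. As a minimal, hypothetical sketch (all room IDs, fields, and values are invented for illustration; the real model streams photorealistic renderings from the cloud), the merge step per room might look like:

```python
# Hypothetical sketch: merging static BIM metadata with live IoT readings
# into one record that a Digital Twin viewer could display per room.
bim_metadata = {  # static building data (would come from a BIM export)
    "room_401": {"area_m2": 42.0, "use": "lab"},
    "room_402": {"area_m2": 18.5, "use": "office"},
}
iot_readings = {  # latest sensor values (would come from an IoT feed)
    "room_401": {"temp_c": 21.4, "co2_ppm": 560},
    "room_402": {"temp_c": 23.1, "co2_ppm": 810},
}

def twin_state(room_id):
    """Combine BIM and IoT data for one room into a single record."""
    return {**bim_metadata[room_id], **iot_readings[room_id]}

print(twin_state("room_401"))
```

In the deployed system this merged state would be rendered in the browser-based viewer rather than printed, with the IoT side refreshed continuously.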

 

More information on Future of Work: understanding the interrelationships between humans and technology to improve the quality of work-life in smart buildings.

Kao Metaverse

The University of Cincinnati, through its Digital Futures complex, will work collaboratively with the UC Center for Simulations & Virtual Environments Research, Lindner College of Business, UC DAAP XR-Lab, and Kao to develop concepts and a minimum viable product for a Jergens virtual tanning experience called the ‘Glowverse.’

KAO-STP-Glowverse VR retail project. GLOWVERSE VIRTUAL SPA (VR) development. UC-SIM + XR-Lab + College of Business. Funded by KAO. PI: Chris Collins. Co-PIs: Ming Tang, Noah Van Bergen. $52,122. Period: 2.2023-11.2023.

  • UCSIM Project Lead: Chris M. Collins, Center for Simulations & Virtual Environments Research
  • UCSIM Technical Lead: Ryan Gorsuch, Center for Simulations & Virtual Environments Research
  • UC DAAP Design Lead: Ming Tang, Director of XR-Lab, Registered Architect (RA), NCARB, LEED AP (BD+C), and Associate Professor at the School of Architecture and Interior Design, College of Design, Architecture, Art, and Planning
  • UC LCOB Marketing Lead: Noah Van Bergen, Asst. Professor, Marketing, Lindner Business Honors Faculty, Carl H. Lindner College of Business

Solar System in VR

Extended Reality 3D Model Application in Space Exploration and Planetary Habitation

 

Developed by Mikhail Nikolaenko, Student and XR-Lab Fellow, University of Cincinnati.

Summary:

This research project combines data science, astronomy, and VR to create a visually interactive learning tool for students, academics, enthusiasts, and professionals to learn about areas of space exploration, easily accessible to anyone with a VR device such as an Oculus Quest 2. The application includes an accurate mapping of different celestial bodies, such as planets and stars, and the model is fully interactable through functions such as scaling, time manipulation, and highlighting. The uses of this proposed application range from basic elementary applications (e.g., learning about our solar system in astronomy courses) to astronomical data research (e.g., viewing spectra of celestial objects found by Gaia).
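One practical challenge behind the "scaling" interaction mentioned above is that real solar-system distances span orders of magnitude. A hedged sketch of one common approach (the function name, scene radius, and logarithmic mapping are illustrative assumptions, not the project's actual implementation) compresses heliocentric distances so that inner and outer planets both fit inside a VR scene:

```python
import math

# Illustrative sketch: map heliocentric distances in astronomical units (AU)
# onto a bounded VR scene radius using a logarithmic compression, so Mercury
# (0.39 AU) and Neptune (30.07 AU) are both visible at once.
def vr_distance(dist_au, scene_radius=10.0, max_au=31.0):
    """Map a distance in AU to VR scene units in the range (0, scene_radius]."""
    return scene_radius * math.log1p(dist_au) / math.log1p(max_au)

for name, au in [("Mercury", 0.39), ("Earth", 1.0), ("Neptune", 30.07)]:
    print(f"{name}: {vr_distance(au):.2f} scene units")
```

A linear mapping would place Mercury at about 0.13 scene units from the Sun, too close to interact with; the log mapping keeps relative ordering while making the inner planets selectable.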

PDFs:  Final Report; Poster

Acknowledgments: 

Project Advisor: Ming Tang

Research Funding Source: University of Cincinnati Space Research Institute for Discovery and Exploration 

External Celestial Data: This work has made use of data from the European Space Agency (ESA) mission Gaia (https://www.cosmos.esa.int/gaia), processed by the Gaia Data Processing and Analysis Consortium (DPAC, https://www.cosmos.esa.int/web/gaia/dpac/consortium). Funding for the DPAC has been provided by national institutions, in particular the institutions participating in the Gaia Multilateral Agreement. 

Internal Solar System Data: Acton, C.H.; “Ancillary Data Services of NASA’s Navigation and Ancillary Information Facility;” Planetary and Space Science, Vol. 44, No. 1, pp. 65-70, 1996. DOI 10.1016/0032-0633(95)00107-7 

Charles Acton, Nathaniel Bachman, Boris Semenov, Edward Wright; A look toward the future in the handling of space science mission geometry; Planetary and Space Science (2017); DOI 10.1016/j.pss.2017.02.013 

References & Sources: 

1. NASA. (2018, December 14). NASA’s eyes: Eyes on exoplanets. NASA. Retrieved October 11, 2022, from https://eyes.nasa.gov/eyes-on-exoplanets.html


Demo at the 2023 Undergraduate Scholarly Showcase. 04.20.2023

EVRTalk on iHeart

Our EVRTalk project was featured on Simply Medicine | iHeart (time mark 9:55).

April 29, 2023. By Angenette Levy with 55KRC

  

Jai’La Nored (COA), Linda Dunseath (LWC), Ming Tang (UC, XR-Lab)

SIMPLY MEDICINE is a continuing program designed to simplify the health care system and provide medical information.

Thanks for the support from COA, Live Well Collaborative, and the University of Cincinnati Urban Health Pathway grant.

   

More information is available on the EVRTalk program page.

Thanks to Suzanne Burke, Ken Wilson, Jai’La Nored, Anna Goubeaux, and many others from COA. Thanks to the Live Well EVRTalk development team (Faculty: Ming Tang, Matt Anthony; advisor: Craig Vogel, Linda Dunseath; Students and Live Well fellows: Tosha Bapat, Karly Camerer, Jay Heyne, Harper Lamb, Jordan Owens, Ruby Qji, Alejandro Robledo, Matthew Spoleti, Lauren Southwood, Ryan Tinney, Keeton Yost, Dongrui Zhu.)