Real-time Visualization, Virtual Reality & Augmented Reality
Explores interactive virtual reality (VR) and augmented reality (AR) systems and real-time rendering for architectural visualization, human-computer interaction, spatial behavior, and wayfinding studies.

Paper: SpaceXR at HCI 2024

Our paper “SpaceXR: Virtual Reality and Data Mining for Astronomical Visualization” has been accepted at the 26th HCI International Conference, Washington, DC, USA, 29 June – 4 July 2024.
Authors: Mikhail Nikolaenko, Ming Tang

Abstract

This paper presents the “SpaceXR” project, which integrates data science, astronomy, and Virtual Reality (VR) technology to deliver an immersive and interactive educational tool. It is designed to cater to a diverse audience, including students, academics, space enthusiasts, and professionals, offering an easily accessible platform through VR headsets. The VR application offers a data-driven representation of celestial bodies, including the planets and the sun in our solar system, guided by data from the NASA and Gaia databases. It empowers users with interactive capabilities encompassing scaling, time manipulation, and object highlighting. Potential applications span from elementary educational contexts, such as teaching the solar system in astronomy courses, to advanced astronomical research scenarios, like analyzing spectral data of celestial objects identified by Gaia and NASA. By adhering to emerging software development practices and employing a variety of conceptual frameworks, this project yields a fully immersive, precise, and user-friendly 3D VR application that relies on a real, publicly available database to map celestial objects.

See more project details on Solar Systems in VR.
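The paper does not publish its mapping code, but the core idea of a data-driven celestial representation can be illustrated. As a minimal sketch (not the authors' implementation), Gaia-style astrometry (right ascension, declination, and parallax in milliarcseconds) can be converted into Cartesian scene coordinates like this:

```python
import math

def parallax_to_distance_pc(parallax_mas: float) -> float:
    """Approximate distance in parsecs from a Gaia parallax in milliarcseconds."""
    if parallax_mas <= 0:
        # Negative or zero parallaxes occur in real Gaia data and
        # cannot be inverted naively into a distance.
        raise ValueError("non-positive parallax cannot be inverted naively")
    return 1000.0 / parallax_mas

def to_cartesian(ra_deg: float, dec_deg: float, parallax_mas: float):
    """Convert (RA, Dec, parallax) into Cartesian coordinates in parsecs,
    suitable for placing a star object in a 3D scene."""
    d = parallax_to_distance_pc(parallax_mas)
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    return (d * math.cos(dec) * math.cos(ra),
            d * math.cos(dec) * math.sin(ra),
            d * math.sin(dec))

# A star with a 10 mas parallax lies at roughly 100 pc.
x, y, z = to_cartesian(0.0, 0.0, 10.0)
```

In a VR engine, the resulting coordinates would typically be rescaled by a user-controlled factor to support the interactive scaling the abstract describes.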

CVG Airport renovation

Cincinnati/Northern Kentucky International Airport (CVG) renovation project.

This dynamic course delves into designing human-centric, technologically advanced retail spaces at CVG, addressing contemporary challenges. Collaborating directly with CVG, we focus on conceptualizing the “Future CVG Experience,” exploring pivotal questions: envisioning the future look of CVG, the transformative impact of AR and VR on airport experiences, integrating the metaverse and immersive technologies into retail, and the potential for public art and recreational programs to enrich the traveler’s journey.

Faculty: Ming Tang, Director of XR-Lab, DAAP, UC. Thanks to Josh Edwards at CVG, and Chris Collins and Eric Camper at UC SIM, for their support.

Twelve proposed scenarios of future CVG. 

Students: ARCH 7014, Fall 2023.

Stephanie Ahmed, Ben Aidt, Thimesha Amarasena, Heather Cheng, Stephanie Circelli, Catherine D’Amico, Gabby Dashiell, Nikunj Deshpande, Carson Edwards, Olufemi Faminigba, Christopher Fultz, Emma Hausz, Jinfan He, Haley Heitkamp, Robin Jarrell, Emily Jaster, Bhaskar Jyoti Kalita, Analise Kandra, Sreya Killamshetty, Japneet Kour, Thomas Magee, Mea McCormack, Sepideh Miraba, Dan O’Neill, Shailesh Padalkar, Gaurang Pawar, Urvi Prabhu, Michael Rinaldi-Eichenberg, Kelby Rippy, Will Roberts, Chris Schalk, Miles Sletto, Lizzy Sturgeon, Shruthi Sundararajan, Erika VanSlyke, Clayton Virzi, Yue Wu

The heatmap represents fixation and gaze data.

See more research on eye-tracking conducted by Prof. Tang at XR-Lab.

Paper in JEC

Paper accepted in the Journal of Experimental Criminology.

Cory P. Haberman, Ming Tang, JC Barnes, Clay Driscoll, Bradley J. O’Guinn, Calvin Proffit, “The Effect of Checklists on Evidence Collection During Initial Investigations: A Randomized Controlled Trial in Virtual Reality,” Journal of Experimental Criminology.

Objective: To examine the impact of an investigative checklist on evidence collection by police officers responding to a routine burglary investigation.

Methods: A randomized controlled trial was conducted in virtual reality to test the effectiveness of an investigative checklist. Officers in the randomly assigned treatment group (n = 25) were provided with a checklist during the simulated investigation. Officers in the control group (n = 26) did not have access to the checklist at any time. The checklist included five evidence items commonly associated with burglary investigations.

Results: Officers who were randomly provided with an investigative checklist were significantly more likely to collect the two evidence items located outside of the virtual victim’s home. Treatment and control officers were about equally likely to collect the three evidence items located inside the residence.

Conclusions: Investigative checklists represent a promising new tool officers can use to improve evidence collection during routine investigations. More research is needed, however, to determine whether checklists improve evidence collection or case clearances in real-life settings. Virtual reality simulations provide a promising tool for collecting data in situations that are otherwise difficult or complex to simulate.

Keywords: Investigations, Burglary, Checklists, Policing, Experiment, Randomized controlled trial

More information on this VR police training project is available here.

UHP Spring 2024

ARCH3051. UHP Spring 2024 course: 

Human-Computer Interaction in the Age of Extended Reality & Metaverse

Ming Tang. Professor, Director of Extended Reality Lab (XR-Lab). UC Digital Future.

Institute for Research in Sensing (IRiS); Industry 4.0 & 5.0 Institute (I45I)
School of Architecture and Interior Design, College of Design, Architecture, Art, and Planning
Class Time: Tue/Thu 12:30–1:50 pm, Spring Semester 2024.

Location: 4425 E CGC Lab @ DAAP

Note: This course is open to UG students in the UC Honors Program.

Prerequisite: No prior skills are needed for this introductory-level course on XR.

Course Description:

This seminar course focuses on real-world problem solving using extended reality applications, with an in-depth investigation of the intersection of human-computer interaction (HCI) with immersive visualization technologies, including Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), the Metaverse, and Digital Twins. The class will explore new immersive experiences in the virtual realm and analyze human perception through various sensory inputs. Students will learn both theoretical frameworks and hands-on skills in XR development. The course will provide students with exposure to cutting-edge AR/VR technologies at the Digital Future building. Students are encouraged to propose individual or group research on the subject of future HCI with XR.

Course Goals:

Students completing this course will be able to:

  • Identify real-world problems that might be solved through XR technology.
  • Identify the opportunities and challenges of building an inclusive metaverse accessible to everyone.
  • Identify theories and research methods related to XR and Metaverse.

Students will join a series of lectures hosted at the XR-Lab at the UC Digital Future building:

  • History and theory related to XR (VR+AR+MR)
  • Emerging trends related to the Metaverse and its impact on society
  • Theories related to Simulation & Simulacrum
  • Digital Human: perception and spatial experience
  • Biometric data and sensory technologies

Beyond theory, students will gain hands-on skills and demonstrate XR development techniques in:

  • VR development or
  • AR development

Required Text(s): none

Recommended Text(s): various online lectures and readings

Students are strongly encouraged to look at Metaverse-related products and publications. As students develop their own research, these resources become invaluable in terms of inspiration and motivation. The instructor supports all attempts by students to experiment with techniques that interest and challenge them above and beyond what is taught in the classroom.

Required Materials: 

  • Access to a computer at CGC Lab at DAAP.
  • VR headset (Optional, provided by XR-Lab)

Software: Unreal Engine 5, available at the DAAP CGC Lab.

Weekly schedule:

  1. History and the workflow of metaverse development 
  2. Project management at GitHub (Inspiration from the film due) 
  3. Modeling ( XR vignette proposal due)
  4. Material & Lighting (Project development start) 
  5. Blue Print, node-based programming 
  6. Character and Animation
  7. Interaction and UX; deploy to XR
  8. Project development time
  9. Project development time
  10. Project development time
  11. Project development time
  12. Project development time
  13. Project development time
  14. Project development time
  15. Project development time
  16. End-of-semester gallery show at DF building

Examples of student projects on XR development

“Reality is just the beginning.”

Cloud-based Digital Twin

Clients are one click away from interacting with a Digital Twin model on their personal devices. No installation is required.

The XR-Lab’s project showcases a cloud-based Digital Twin (DT) model, designed for accessibility and interaction via mobile devices. This advanced DT model allows multiple users to engage with its complex features directly through touch screens, eliminating the need for app installations. Clients can effortlessly access the content using a simple URL in a web browser on their personal iOS or Android mobile devices and tablets. The project is distinguished by its photorealistic renderings, which are streamed to clients at high frame rates, ensuring a visually rich and seamless experience. Furthermore, our DT model is an integration of various cutting-edge technologies, including Building Information Modeling (BIM), Metadata, IoT sensor data, 360-degree images/videos, and web3D content, creating a comprehensive and interactive digital environment.
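The project description above does not include implementation details. As a minimal illustrative sketch, with hypothetical names and fields (not the XR-Lab's actual code), the pairing of static BIM metadata with streaming IoT sensor readings in one twin record might look like this:

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class RoomTwin:
    """Toy digital-twin record joining static BIM metadata with live IoT readings."""
    room_id: str
    bim_metadata: Dict[str, str]                    # static facts, e.g. floor, usage type
    sensor_state: Dict[str, float] = field(default_factory=dict)  # latest readings

    def ingest(self, reading: Dict[str, float]) -> None:
        """Merge the latest sensor reading into the twin's state;
        newer values overwrite older ones, key by key."""
        self.sensor_state.update(reading)

# Hypothetical room identifier and sensor channels, for illustration only.
twin = RoomTwin("DAAP-4425", {"floor": "4", "use": "lab"})
twin.ingest({"temperature_c": 21.5, "co2_ppm": 640.0})
twin.ingest({"temperature_c": 22.1})  # a newer temperature reading replaces the old one
```

In a cloud-streamed deployment like the one described, a record of this kind would live server-side, with the rendered view streamed to clients' browsers so no local installation is needed.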


More information on Future of Work: Understanding the interrelationships between humans and technology to improve the quality of work-life in smart buildings.