Real-time Visualization, Virtual Reality & Augmented Reality
Explores interactive virtual reality (VR) and augmented reality (AR) systems and real-time rendering for architectural visualization, human-computer interaction, spatial behavior, and wayfinding studies.

Paper: SpaceXR at HCI 2024

“SpaceXR: Virtual Reality and Data Mining for Astronomical Visualization” was published in the proceedings of the 26th HCI International Conference, Washington, DC, USA, 29 June – 4 July 2024.
Authors: Mikhail Nikolaenko, Ming Tang

 

Abstract

This paper presents the “SpaceXR” project, which integrates data science, astronomy, and virtual reality (VR) technology to deliver an immersive and interactive educational tool. It is designed to cater to a diverse audience, including students, academics, space enthusiasts, and professionals, offering an easily accessible platform through VR headsets. The VR application provides a data-driven representation of celestial bodies, including the planets and the sun in our solar system, guided by data from the NASA and Gaia databases. It empowers users with interactive capabilities encompassing scaling, time manipulation, and object highlighting. The potential applications span from elementary educational contexts, such as teaching the star system in astronomy courses, to advanced astronomical research scenarios, such as analyzing spectral data of celestial objects identified by Gaia and NASA. By adhering to emerging software development practices and employing a variety of conceptual frameworks, this project yields a fully immersive, precise, and user-friendly 3D VR application that relies on real, publicly available databases to map celestial objects.
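The paper's code is not reproduced here, but as a rough sketch of the kind of Gaia-driven data pipeline described above, the Python snippet below queries the public Gaia DR3 archive with astroquery and converts parallaxes to distances, the sort of data that could drive a VR star map. The specific query, columns, and thresholds are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the SpaceXR implementation): pull a small sample of
# nearby stars from the public Gaia DR3 archive and convert parallax to
# distance, data that could position stars in a VR scene.
from astroquery.gaia import Gaia  # pip install astroquery

# Stars with parallax > 50 mas are closer than roughly 20 parsecs.
ADQL = """
SELECT TOP 50 source_id, ra, dec, parallax, phot_g_mean_mag
FROM gaiadr3.gaia_source
WHERE parallax > 50
ORDER BY parallax DESC
"""

job = Gaia.launch_job(ADQL)   # synchronous query, no login required
stars = job.get_results()     # astropy Table

for row in stars:
    dist_pc = 1000.0 / row["parallax"]  # Gaia parallax is in milliarcseconds
    print(f"{row['source_id']}: RA={row['ra']:.3f} deg, "
          f"Dec={row['dec']:.3f} deg, distance ~{dist_pc:.1f} pc, "
          f"G mag={row['phot_g_mean_mag']:.2f}")
```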

More project details are available on Solar Systems in VR.

 

Mikhail Nikolaenko presented the paper at the 26th HCI International Conference, Washington, DC, USA, 29 June – 4 July 2024.

CVG Airport Renovation

Cincinnati/Northern Kentucky International Airport (CVG) renovation project.

This dynamic course delves into designing human-centric, technologically advanced retail spaces at CVG, addressing contemporary challenges. Collaborating directly with CVG, we focus on conceptualizing the “Future CVG Experience” and explore pivotal questions: what the future CVG will look like, how AR and VR can transform the airport experience, how the metaverse and immersive technologies can be integrated into retail, and how public art and recreational programs can enrich the traveler’s journey.

Faculty: Ming Tang, Director of XR-Lab, DAAP, UC. Thanks to Josh Edwards from CVG and to Chris Collins and Eric Camper from UC SIM for their support.

Twelve proposed scenarios for the future CVG.

Students: ARCH 7014, Fall 2023.

Stephanie Ahmed, Ben Aidt, Thimesha Amarasena, Heather Cheng, Stephanie Circelli, Catherine D’Amico, Gabby Dashiell, Nikunj Deshpande, Carson Edwards, Olufemi Faminigba, Christopher Fultz, Emma Hausz, Jinfan He, Haley Heitkamp, Robin Jarrell, Emily Jaster, Bhaskar Jyoti Kalita, Analise Kandra, Sreya Killamshetty, Japneet Kour, Thomas Magee, Mea McCormack, Sepideh Miraba, Dan O’Neill, Shailesh Padalkar, Gaurang Pawar, Urvi Prabhu, Michael Rinaldi-Eichenberg, Kelby Rippy, Will Roberts, Chris Schalk, Miles Sletto, Lizzy Sturgeon, Shruthi Sundararajan, Erika VanSlyke, Clayton Virzi, Yue Wu

The heatmap represents fixation and gaze data.

See more eye-tracking research conducted by Prof. Tang at the XR-Lab. >>
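As a rough illustration (not the lab's actual pipeline), the sketch below shows one common way to turn raw gaze samples into a fixation heatmap like the one above: bin the (x, y) points into a 2D histogram and blur it with a Gaussian kernel. The screen resolution, smoothing radius, and synthetic samples are assumed values for illustration.

```python
# Hypothetical sketch of building a gaze/fixation heatmap; not XR-Lab code.
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_heatmap(gaze_xy, screen_w=1920, screen_h=1080, sigma=40):
    """Bin gaze samples (pixel coordinates) into a smoothed heatmap."""
    xs, ys = gaze_xy[:, 0], gaze_xy[:, 1]
    hist, _, _ = np.histogram2d(
        ys, xs,
        bins=[screen_h, screen_w],
        range=[[0, screen_h], [0, screen_w]],
    )
    heat = gaussian_filter(hist, sigma=sigma)  # spread each gaze point
    return heat / heat.max() if heat.max() > 0 else heat

# Example with synthetic gaze samples clustered near the screen center.
rng = np.random.default_rng(0)
samples = rng.normal(loc=[960, 540], scale=[120, 80], size=(500, 2))
heat = fixation_heatmap(samples)
print(heat.shape, heat.max())
```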

Paper in JEC

Paper published in the Journal of Experimental Criminology.

Cory P. Haberman, Ming Tang, JC Barnes, Clay Driscoll, Bradley J. O’Guinn, Calvin Proffit, “The Effect of Checklists on Evidence Collection During Initial Investigations: A Randomized Controlled Trial in Virtual Reality.” Journal of Experimental Criminology.

Objective: To examine the impact of an investigative checklist on evidence collection by police officers responding to a routine burglary investigation.

Methods: A randomized controlled trial was conducted in virtual reality to test the effectiveness of an investigative checklist. Officers in the randomly assigned treatment group (n = 25) were provided with a checklist during the simulated investigation. Officers in the control group (n = 26) did not have access to the checklist at any time. The checklist included five evidence items commonly associated with burglary investigations.

Results: Officers who were randomly provided with an investigative checklist were significantly more likely to collect two evidence items located outside the virtual victim’s home. Treatment and control officers were about equally likely to collect the three evidence items located inside the residence.

Conclusions: Investigative checklists represent a promising new tool officers can use to improve evidence collection during routine investigations. More research is needed, however, to determine whether checklists improve evidence collection or case clearances in real-life settings. Virtual reality simulations provide a promising tool for collecting data in situations that are otherwise difficult or complex to simulate.

Keywords: Investigations, Burglary, Checklists, Policing, Experiment, Randomized controlled trial

More information on this VR police training project is available here.
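For readers curious about how such a between-group comparison can be analyzed, the sketch below runs a Fisher's exact test on a 2×2 table of officers who did or did not collect a given evidence item. The group sizes (25 treatment, 26 control) come from the abstract above; the collection counts are made-up placeholders for illustration only, not the study's results.

```python
# Illustrative only: the counts below are hypothetical placeholders, not the
# study's data. Group sizes (25 treatment, 26 control) are from the paper.
from scipy.stats import fisher_exact

n_treatment, n_control = 25, 26
collected_treatment = 18  # hypothetical: checklist officers who collected the item
collected_control = 9     # hypothetical: control officers who collected the item

table = [
    [collected_treatment, n_treatment - collected_treatment],
    [collected_control,   n_control - collected_control],
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```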

UHP Spring 2024

ARCH3051. UHP Spring 2024 course: 

Human-Computer Interaction in the Age of Extended Reality & Metaverse

Ming Tang, Professor, Director of the Extended Reality Lab (XR-Lab), UC Digital Futures.

Institute for Research in Sensing (IRiS); Industry 4.0 & 5.0 Institute (I45I)
School of Architecture and Interior Design, College of Design, Architecture, Art, and Planning
Class Time: T/Th 12:30–1:50 pm, Spring Semester 2024.

Location: 4425 E CGC Lab @ DAAP

Note: This course is open to undergraduate students in the UC Honors Program.

Prerequisite: None. No prior skills are required for this introductory-level course on XR.

Read more

Cloud-based Digital Twin

Clients are one click away from interacting with a Digital Twin model on their personal devices. No installation is required.

The XR-Lab’s project showcases a cloud-based Digital Twin (DT) model, designed for accessibility and interaction via mobile devices. This advanced DT model allows multiple users to engage with its complex features directly through touch screens, eliminating the need for app installations. Clients can effortlessly access the content using a simple URL in a web browser on their personal iOS or Android mobile devices and tablets. The project is distinguished by its photorealistic renderings, which are streamed to clients at high frame rates, ensuring a visually rich and seamless experience. Furthermore, our DT model is an integration of various cutting-edge technologies, including Building Information Modeling (BIM), Metadata, IoT sensor data, 360-degree images/videos, and web3D content, creating a comprehensive and interactive digital environment.
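As a hedged sketch of one small piece of such a setup (not the lab's actual stack), the snippet below shows a minimal web endpoint that a browser-based Digital Twin viewer could poll for live IoT sensor readings. The endpoint path, sensor IDs, and payload fields are hypothetical, and the readings are placeholders standing in for a real building-sensor backend.

```python
# Hypothetical sketch: a tiny web API a browser-based DT viewer could poll
# for IoT sensor readings. Names, fields, and values are illustrative only.
from datetime import datetime, timezone
import random

from fastapi import FastAPI, HTTPException

app = FastAPI(title="Digital Twin sensor feed (sketch)")

# Stand-in for a real IoT/BMS backend; in practice this would query live data.
SENSORS = {"room-401-temp": "degC", "room-401-co2": "ppm"}

@app.get("/sensors/{sensor_id}")
def read_sensor(sensor_id: str):
    if sensor_id not in SENSORS:
        raise HTTPException(status_code=404, detail="unknown sensor")
    return {
        "sensor_id": sensor_id,
        "unit": SENSORS[sensor_id],
        "value": round(random.uniform(20, 25), 2),  # placeholder reading
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Run with: uvicorn sensors:app --reload
# A web3D/DT page could then fetch /sensors/room-401-temp from the browser
# and bind the returned value to the corresponding BIM element in the scene.
```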

 

More information is available on Future of Work: Understanding the interrelationships between humans and technology to improve the quality of work-life in smart buildings.