CVG Airport renovation

Cincinnati/Northern Kentucky International Airport (CVG) renovation project.

This dynamic course delves into designing human-centric, technologically advanced retail spaces at CVG, addressing contemporary challenges. Collaborating directly with CVG, we focus on conceptualizing the “Future CVG Experience,” exploring pivotal questions: What will CVG look like in the future? How can AR and VR transform the airport experience? How can the metaverse and immersive technologies be integrated into retail? And how can public art and recreational programs enrich the traveler’s journey?

Faculty: Ming Tang, Director of XR-Lab, DAAP, UC. Thanks to Josh Edwards from CVG, and Chris Collins and Eric Camper from UC SIM, for their support.

Twelve proposed scenarios for the future CVG.

Students: ARCH 7014, Fall 2023.

Stephanie Ahmed, Ben Aidt, Thimesha Amarasena, Heather Cheng, Stephanie Circelli, Catherine D’Amico, Gabby Dashiell, Nikunj Deshpande, Carson Edwards, Olufemi Faminigba, Christopher Fultz, Emma Hausz, Jinfan He, Haley Heitkamp, Robin Jarrell, Emily Jaster, Bhaskar Jyoti Kalita, Analise Kandra, Sreya Killamshetty, Japneet Kour, Thomas Magee, Mea McCormack, Sepideh Miraba, Dan O’Neill, Shailesh Padalkar, Gaurang Pawar, Urvi Prabhu, Michael Rinaldi-Eichenberg, Kelby Rippy, Will Roberts, Chris Schalk, Miles Sletto, Lizzy Sturgeon, Shruthi Sundararajan, Erika VanSlyke, Clayton Virzi, Yue Wu

The heatmap visualizes eye-tracking data, showing where fixations and gaze were concentrated.
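As a rough illustration of how such a heatmap can be computed, the sketch below accumulates fixation points onto an image-sized grid and smooths them with a Gaussian kernel. The data format, the weighting by dwell time, and the sigma value are assumptions for illustration, not the lab's actual eye-tracking pipeline.

  # Minimal sketch (assumed data format): build a fixation/gaze heatmap by
  # splatting fixation points onto a grid and smoothing with a Gaussian filter.
  import numpy as np
  from scipy.ndimage import gaussian_filter

  def gaze_heatmap(fixations, width, height, sigma=30.0):
      """fixations: iterable of (x, y, duration_ms) in screen pixels."""
      grid = np.zeros((height, width), dtype=float)
      for x, y, duration in fixations:
          if 0 <= int(y) < height and 0 <= int(x) < width:
              grid[int(y), int(x)] += duration              # weight by dwell time
      heat = gaussian_filter(grid, sigma=sigma)              # spread each fixation
      return heat / heat.max() if heat.max() > 0 else heat   # normalize to 0..1

  # Example: three fixations on a 1920 x 1080 stimulus image.
  demo = [(960, 540, 400), (300, 200, 150), (1500, 800, 250)]
  heatmap = gaze_heatmap(demo, width=1920, height=1080)

The normalized array can then be rendered as a color overlay on the stimulus image to show where attention was concentrated.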

See more eye-tracking research conducted by Prof. Tang at the XR-Lab.

Paper in JEC

Paper accepted in the Journal of Experimental Criminology.

Cory P. Haberman, Ming Tang, JC Barnes, Clay Driscoll, Bradley J. O’Guinn, and Calvin Proffit. The Effect of Checklists on Evidence Collection During Initial Investigations: A Randomized Controlled Trial in Virtual Reality. Journal of Experimental Criminology.

Objective: To examine the impact of an investigative checklist on evidence collection by police officers responding to a routine burglary investigation.

Methods: A randomized controlled trial was conducted in virtual reality to test the effectiveness of an investigative checklist. Officers in the randomly assigned treatment group (n = 25) were provided with a checklist during the simulated investigation. Officers in the control group (n = 26) did not have access to the checklist at any time. The checklist included five evidence items commonly associated with burglary investigations.

Results: Officers who were randomly provided with an investigative checklist were significantly more likely to collect the two evidence items located outside of the virtual victim’s home. Both treatment and control officers were about equally likely to collect the three evidence items located inside the residence.

Conclusions: Investigative checklists represent a promising new tool officers can use to improve evidence collection during routine investigations. More research is needed, however, to determine whether checklists improve evidence collection or case clearances in real-life settings. Virtual reality simulations provide a promising tool for collecting data in situations that are otherwise difficult or complex to simulate.

Keywords: Investigations, Burglary, Checklists, Policing, Experiment, Randomized controlled trial

More information on this VR police training project is available here.

UHP Spring 2024

ARCH 3051. UHP Spring 2024 course:

Human-Computer Interaction in the Age of Extended Reality & Metaverse

Ming Tang. Professor, Director of Extended Reality Lab (XR-Lab). UC Digital Future.

Institute for Research in Sensing (IRiS); Industry 4.0 & 5.0 Institute (I45I)
School of Architecture and Interior Design, College of Design Architecture Art and Planning
Class Time: T/TH 12:30 pm – 1:50 pm, Spring Semester 2024.

Location: 4425 E CGC Lab @ DAAP

Note: This course is open to UG students in the UC Honors Program.

Prerequisite: No prior skills are needed for this introductory-level course on XR.

Course Description:

This seminar course focuses on real-world problem solving using extended reality applications, with an in-depth investigation of the intersection of human-computer interaction (HCI) with immersive visualization technologies, including Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), the Metaverse, and Digital Twins. The class will explore new immersive experiences in the virtual realm and analyze human perceptions through various sensory inputs. Students will learn both the theoretical framework and hands-on skills of XR development. The course will provide students with exposure to cutting-edge AR/VR technologies at the Digital Future building. Students are encouraged to propose individual or group research on the subject of future HCI with XR.

Course Goals:

Students completing this course will be able to:

  • Identify real-world problems that might be solved through XR technology.
  • Identify the opportunities and challenges of building an inclusive metaverse accessible to everyone.
  • Identify theories and research methods related to XR and Metaverse.

Students will join a series of lectures hosted at the XR-Lab in the UC Digital Future building, covering:

  • History and theory related to XR (VR+AR+MR)
  • Emerging trends related to the Metaverse and its impact on society
  • Theories related to Simulation & Simulacrum
  • Digital Human: perception and spatial experience
  • Biometric data and sensory technologies

In addition to the theories, students will gain hands-on skills and demonstrate XR development in one of the following areas:

  • VR development or
  • AR development

Required Text(s): none

Recommended Text(s): various online lectures and readings

Students are strongly encouraged to look at Metaverse-related products and publications. As students develop their own research, these resources provide invaluable inspiration and motivation. The instructor supports all attempts by students to experiment with techniques that interest and challenge them above and beyond what is taught in the classroom.

Required Materials: 

  • Access to a computer at CGC Lab at DAAP.
  • VR headset (Optional, provided by XR-Lab)

Software: Unreal Engine 5, available at the DAAP CGC Lab.

Schedule for each week:

  1. History and the workflow of metaverse development
  2. Project management with GitHub (Inspiration from the film due)
  3. Modeling (XR vignette proposal due)
  4. Material & Lighting (project development starts)
  5. Blueprint, node-based programming
  6. Character and Animation
  7. Interaction and UX; deployment to XR
  8. Project development time
  9. Project development time
  10. Project development time
  11. Project development time
  12. Project development time
  13. Project development time
  14. Project development time
  15. Project development time
  16. End-of-semester gallery show at DF building

Examples of student projects on XR development

“Reality is just the beginning.”

Protected: MIND VR


Digital Human

Facial motion animation experiments at the XR-Lab. The digital human project explores high-fidelity digital human modeling and animation powered by ChatGPT and OpenAI.

We are also developing full-body motion capture through a VR tracking system and Unreal’s MetaHuman.
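As a rough illustration of how a language model could drive a digital human's dialogue, the sketch below uses the OpenAI Python SDK to generate a line of speech and synthesize audio for it. The model names, prompts, and the hand-off to Unreal's MetaHuman facial animation are assumptions for illustration, not the lab's documented pipeline.

  # Minimal sketch (assumptions noted): generate dialogue text with the OpenAI
  # chat API and synthesize speech; the audio clip would then be imported into
  # Unreal to drive MetaHuman facial animation (that step is not shown here).
  from openai import OpenAI

  client = OpenAI()  # reads OPENAI_API_KEY from the environment

  # 1. Ask the chat model for the digital human's next line of dialogue.
  chat = client.chat.completions.create(
      model="gpt-4o-mini",  # assumed model choice
      messages=[
          {"role": "system", "content": "You are a friendly virtual guide at the XR-Lab."},
          {"role": "user", "content": "What can I see in the lab today?"},
      ],
  )
  line = chat.choices[0].message.content

  # 2. Convert the text to speech; the resulting clip can drive lip-sync.
  speech = client.audio.speech.create(model="tts-1", voice="alloy", input=line)
  with open("digital_human_line.mp3", "wb") as f:
      f.write(speech.content)  # raw audio bytes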