XR-Lab’s project was selected to join the Undergraduates Pursuing Research in Science and Engineering (UPRISE) Program
May 6 – July 26, 2024


Project title: Reinforcement Learning (RL) system in Game Engine

This project proposes the development of a reinforcement learning (RL) system built on the robust and versatile environment of Unreal Engine 5 (UE5). The primary objective is to create a flexible and highly realistic simulation platform that can model a multitude of real-life scenarios, ranging from object recognition and urban navigation to emergency response strategies. This platform aims to significantly advance the capabilities of RL algorithms by exposing them to complex, diverse, and dynamically changing environments. Leveraging the advanced graphical and physical simulation capabilities of UE5, the project will focus on creating detailed and varied scenarios in which RL algorithms can be trained and tested. These scenarios will include, but not be limited to, urban traffic systems, natural disaster simulations, and public safety response models. The realism and intricacy of UE5's environments will provide a challenging and rich training ground for RL models, allowing them to learn and adapt to unpredictable variables akin to those in the real world.
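The training side of such a platform typically exposes the game-engine world to an RL algorithm through a reset/step interface. The sketch below is a minimal, hypothetical illustration of that loop: the `NavigationEnv` class stands in for a UE5-backed environment (a real system would stream observations from the engine over a bridge such as a TCP connection or a plugin API), and `rollout` shows the agent-environment control cycle the project would train against. All names and the 1-D "corridor" world are illustrative assumptions, not the project's actual code.

```python
class NavigationEnv:
    """Toy stand-in for a UE5-backed RL environment (hypothetical API).

    A real deployment would replace the internal state with observations
    streamed from Unreal Engine; here the 'world' is a 1-D corridor so the
    control loop itself can be demonstrated end to end.
    """

    def __init__(self, length=10):
        self.length = length  # distance from start to goal
        self.pos = 0

    def reset(self):
        """Start a new episode and return the initial observation."""
        self.pos = 0
        return self.pos

    def step(self, action):
        """Advance one tick. action: 0 = stay, 1 = move forward."""
        self.pos = min(self.length, self.pos + action)
        done = self.pos >= self.length
        reward = 1.0 if done else -0.01  # small step cost, goal bonus
        return self.pos, reward, done


def rollout(env, policy, max_steps=100):
    """Run one episode under `policy` and return the accumulated reward."""
    obs, total = env.reset(), 0.0
    for _ in range(max_steps):
        obs, reward, done = env.step(policy(obs))
        total += reward
        if done:
            break
    return total


env = NavigationEnv()
total = rollout(env, policy=lambda obs: 1)  # trivial always-forward policy
```

In the envisioned system, the policy would be a learned model rather than a fixed rule, and the environment's observations would carry the rich visual and physical state that UE5 simulates.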

Honors Seminar student projects

“Human-Computer Interaction in the Age of Extended Reality & Metaverse” student projects

Spring 2024, University of Cincinnati (UC)

Under the guidance of Ming Tang, Director of the XR-Lab at Digital Futures and DAAP, UC, this honors seminar course has propelled students through an immersive journey into the realm of XR. The course encompasses Extended Reality, Metaverse, and Digital Twin technologies, providing a comprehensive platform for theoretical exploration and practical application in XR development.

The coursework showcases an array of student-led research projects that investigate the role of XR in various domains, including medical training, flight simulation, entertainment, tourism, cultural awareness, fitness, and music. Through these projects, students have had the opportunity to not only grasp the intricate theories underpinning future HCI developments but also to apply their skills in creating immersive experiences that hint at the future of human-technology interaction.


“Human-Computer Interaction in the Age of Extended Reality & Metaverse” is a UC Honors course that delves into the burgeoning field of extended reality (XR) and its confluence with human-computer interaction (HCI), embodying a fusion of scholarly inquiry and innovative practice.

Ming Tang, Professor, Director of XR-Lab, DAAP, University of Cincinnati

Students: Nishanth Chidambaram, Bao Huynh, Caroline McCarthy, Cameron Moreland, Frank Mularcik, Cooper Pflaum, Triet Pham, Brooke Stephenson, Pranav Venkataraman

Thanks for the support from the UC Honors Program and UC Digital Futures.

Digital Twin of Cincinnati

A real-time flythrough demo of the Digital Twin of the City of Cincinnati.

Digital Futures Building at the University of Cincinnati

A game demo in which players destroy alien buildings near the UC campus. Developed by students Cooper Pflaum and Nishanth Chidambaram.

P&G Metaverse

Title: Leveraging Metaverse Platforms for Enhanced Global Professional Networking – Phase 1

Ming Tang, Principal Investigator

Amount: $32,416

Funding: P&G Digital Accelerator.

Goal: Metaverse technologies for global engagement, human-computer interaction, digital humans, and immersive visualization.

  • P&G Team: Elsie Urdaneta, Sam Azeba, Paula Saldarriaga
  • UC Team: Ming Tang, Students: Nathaniel Brunner, Ahmad Alrefai, Sid Urankar



paper SpaceXR in HCI 2024

Our paper “SpaceXR: Virtual Reality and Data Mining for Astronomical Visualization” has been accepted at the 26th HCI International Conference, Washington DC, USA, 29 June – 4 July 2024.
Authors: Mikhail Nikolaenko, Ming Tang


This paper presents the “SpaceXR” project, which integrates data science, astronomy, and Virtual Reality (VR) technology to deliver an immersive and interactive educational tool. It is designed to cater to a diverse audience, including students, academics, space enthusiasts, and professionals, offering an easily accessible platform through VR headsets. The VR application offers a data-driven representation of celestial bodies, including the planets and the sun within our solar system, guided by data from the NASA and Gaia databases. It empowers users with interactive capabilities encompassing scaling, time manipulation, and object highlighting. The potential applications span from elementary educational contexts, such as teaching the solar system in astronomy courses, to advanced astronomical research scenarios, like analyzing spectral data of celestial objects identified by Gaia and NASA. By adhering to emerging software development practices and employing a variety of conceptual frameworks, this project yields a fully immersive, precise, and user-friendly 3D VR application that relies on a real, publicly available database to map celestial objects.
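Two of the features described above, scaling and time manipulation, hinge on mapping real astronomical quantities into a VR scene. The sketch below illustrates one common approach under stated assumptions: orbital radii are compressed logarithmically so that inner and outer planets both fit in view, and a circular-orbit approximation advances each body's angle with simulated time. The constants are approximate values from public NASA fact sheets, and the function names are illustrative; the actual SpaceXR implementation and its scaling scheme may differ.

```python
import math

# Approximate mean orbital radii in astronomical units (illustrative values;
# the actual app is driven by the NASA and Gaia databases).
ORBIT_AU = {"Mercury": 0.39, "Earth": 1.0, "Jupiter": 5.2, "Neptune": 30.1}


def scene_radius(au, scale=2.0):
    """Compress astronomical distances logarithmically for display.

    log1p keeps inner planets distinguishable while pulling the outer
    planets into the same VR scene -- a common visualization trick, not
    necessarily the project's exact scheme.
    """
    return scale * math.log1p(au)


def position_at(au, period_years, t_years):
    """Circular-orbit approximation of a body's scene position at time t.

    Scrubbing `t_years` forward or backward realizes the 'time
    manipulation' interaction described above.
    """
    theta = 2 * math.pi * (t_years / period_years)
    r = scene_radius(au)
    return (r * math.cos(theta), r * math.sin(theta))


# Earth at t = 0 sits on the positive x-axis at its compressed radius.
x, y = position_at(ORBIT_AU["Earth"], period_years=1.0, t_years=0.0)
```

A real engine integration would feed these positions to the renderer each frame, replacing the circular approximation with ephemeris data where precision matters.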

Check more project details on Solar Systems in VR.