Posts

P&G Metaverse

Title: Leveraging Metaverse Platforms for Enhanced Global Professional Networking – Phase 1

Ming Tang, Principal Investigator

Amount: $32,416

Funding: P&G Digital Accelerator.

Goal: metaverse technologies for global engagement

 

Paper: SpaceXR at HCI International 2024

Our paper “SpaceXR: Virtual Reality and Data Mining for Astronomical Visualization” has been accepted at the 26th HCI International Conference, Washington, DC, USA, 29 June – 4 July 2024.
Authors: Mikhail Nikolaenko, Ming Tang

Abstract

This paper presents the “SpaceXR” project, which integrates data science, astronomy, and Virtual Reality (VR) technology to deliver an immersive and interactive educational tool. It is designed to serve a diverse audience, including students, academics, space enthusiasts, and professionals, through an easily accessible platform on VR headsets. The VR application offers a data-driven representation of celestial bodies, including the planets and the Sun in our solar system, guided by data from the NASA and Gaia databases. It empowers users with interactive capabilities including scaling, time manipulation, and object highlighting. Potential applications span from elementary educational contexts, such as teaching star systems in astronomy courses, to advanced astronomical research scenarios, such as analyzing spectral data of celestial objects identified by Gaia and NASA. By adhering to emerging software development practices and employing a variety of conceptual frameworks, this project yields a fully immersive, precise, and user-friendly 3D VR application that relies on a real, publicly available database to map celestial objects.
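The abstract describes mapping catalog data onto 3D celestial positions. As an illustrative sketch only (the function name and conventions below are assumptions, not taken from the paper), a Gaia source's right ascension, declination, and parallax can be converted to heliocentric Cartesian coordinates for placement in a VR scene:

```python
import math

def gaia_to_cartesian(ra_deg, dec_deg, parallax_mas):
    """Convert a Gaia source's RA/Dec (degrees) and parallax (milliarcseconds)
    to heliocentric Cartesian coordinates in parsecs."""
    if parallax_mas <= 0:
        raise ValueError("parallax must be positive")
    d = 1000.0 / parallax_mas               # distance in parsecs
    ra = math.radians(ra_deg)
    dec = math.radians(dec_deg)
    x = d * math.cos(dec) * math.cos(ra)    # toward RA=0, Dec=0
    y = d * math.cos(dec) * math.sin(ra)
    z = d * math.sin(dec)                   # toward the celestial north pole
    return x, y, z
```

In a game engine, the resulting parsec-scale coordinates would then be rescaled (and re-centered on the user) to fit the scene, which is where interactive scaling comes in.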

See more project details at Solar Systems in VR.

Digital Human

Facial motion animation experiments at XR-Lab. The digital human project explores high-fidelity digital human modeling and animation powered by ChatGPT and OpenAI.

We are also developing full-body motion capture through a VR tracking system and Unreal's MetaHuman.

 

 

Paper at ASCE ICTD conference

Raman, M., Tang, M. 3D Visualization Development of Urban Environments for Simulated Driving Training and VR Development in Transportation Systems. ASCE ICTD 2023 Conference, Austin, TX. June 2023.

 

This work is based on a project to develop a physics-based, 3D digital visual environment replicating actual field conditions for over seventy miles of Ohio highways and city roads, for use in a driving simulator for the Ohio Department of Transportation. While transportation engineering design traditionally involves 3D design in a 2D workspace to create the built environment in the context of a natural environment, this project required replicating existing natural and built environments in a 3D digital space, presenting a unique challenge: developing a new, repeatable process to create a specific digital end product.

Using industry-specific software comprising InfraWorks (urban infrastructure design), Civil 3D (terrain modeling), Rhino (3D product modeling), 3ds Max (rendering/animation), Maya (3D animation/simulation), and Python (scripting), each traditionally dedicated to its own field, the team developed a process to integrate them beyond their intended purposes, connecting industry-specific functionalities to deliver a novel product that can now be utilized by multiple markets.

This process utilizes the functionalities of each software to resolve a portion of the puzzle and delivers it as a solution for the next step of development using another software. Using an iterative development cycle approach, the process bridges the gaps between the industries of Transportation Engineering, Visualization, Architecture, and Gaming to deliver the end product.
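The paper does not publish its scripts; purely as an illustration (tool names, file extensions, and stage order here are assumptions), the hand-off described above can be modeled as a staged pipeline in which each tool consumes the previous tool's exported artifact:

```python
from pathlib import Path

def run_stage(name, in_path, out_suffix):
    """Stand-in for one tool's export/import hand-off.
    A real stage would invoke the tool's batch or scripting interface."""
    out_path = Path(in_path).with_suffix(out_suffix)
    print(f"{name}: {in_path} -> {out_path}")
    return str(out_path)

def build_environment(survey_data):
    """Chain the (assumed) stages so each one feeds the next."""
    stages = [
        ("InfraWorks: corridor model", ".imx"),
        ("Civil 3D: terrain surface", ".dwg"),
        ("Rhino: mesh cleanup", ".3dm"),
        ("3ds Max: render-ready assets", ".fbx"),
    ]
    artifact = survey_data
    for name, suffix in stages:
        artifact = run_stage(name, artifact, suffix)
    return artifact
```

Structuring the workflow this way makes the iterative development cycle explicit: any stage can be rerun from its predecessor's output without repeating the whole chain.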

The resulting 3D digital model of the existing urban environment can now be readily used as a baseline product for any industry that would benefit from such a digital model. In transportation engineering, it can be used in Transportation Systems Planning, Surface Operations, and/or Workforce Development. In outside/connected markets, it can be used in UI-based development, interactive game-based multiplayer virtual meetings, and photo-realistic immersive models for use in VR/multiplayer exploratory environments. This process has been standardized for the digital development of existing site conditions and context for the architectural conceptualization of buildings and public spaces in the Architecture program at the University of Cincinnati. The same process has been carried into the next development phase for the Ohio Department of Transportation.

 

Project link: Training Simulation for Snowplow

XR-Lab moved into Digital Futures

As the director of the Extended Reality lab (XR-Lab), I am thrilled to report that our XR-Lab has moved into the new Digital Futures Building at UC.

Our lab will continue to broaden its scope of collaboration, drawing on our academic and professional expertise in Virtual Reality, Augmented Reality, and Mixed Reality. We look forward to continuing long-standing collaborative relationships with faculty at UC Digital Futures, the Criminology and Justice program at CECH, the Civil Engineering and Transportation programs at CEAS, the Live Well Collaborative, the Council on Aging, Cincinnati Insurance Company, and Cincinnati Children's Hospital Medical Center.

Please visit our lab after August 2022 to check out the exciting lab space and facilities at the new UC Digital Futures Building.

Location:

Rooms: 200 (Smart Connected Wing) and 207 (VR/AR Center), Digital Futures Building

3044 Reading Road, Cincinnati, OH 45206

Request a tour

XR-Lab projects

Contact:

Ming Tang, tangmg@ucmail.uc.edu

Director of Extended Reality Lab, University of Cincinnati.