Solar System in VR

Extended Reality 3D Model Application in Space Exploration and Planetary Habitation

 

Developed by Mikhail Nikolaenko, student and XR-Lab fellow, University of Cincinnati.

Summary:

This research project combines data science, astronomy, and VR to create a visually interactive learning tool for students, academics, enthusiasts, and professionals alike, making areas of space exploration easily accessible to anyone with a VR device such as an Oculus Quest 2. The application includes an accurate mapping of celestial bodies such as planets and stars, and the model is fully interactable through functions such as scaling, time manipulation, and highlighting. The uses of the proposed application range from basic elementary settings (e.g., learning about our solar system in astronomy courses) to astronomical data research (e.g., viewing spectra of celestial objects found by Gaia).
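The time-manipulation and scaling interactions described above can be sketched with a simplified model. The application itself draws on SPICE ephemeris data (see the NAIF acknowledgment below); the circular, coplanar orbits, the periods table, and the function names here are illustrative assumptions, not the project's actual code:

```python
import math

# Approximate orbital periods (years) and semi-major axes (AU); illustrative values.
PLANETS = {
    "Mercury": (0.241, 0.387),
    "Venus":   (0.615, 0.723),
    "Earth":   (1.000, 1.000),
    "Mars":    (1.881, 1.524),
}

def position_at(planet: str, t_years: float, scale: float = 1.0):
    """Heliocentric (x, y) in scaled AU, assuming a circular coplanar orbit."""
    period, a = PLANETS[planet]
    angle = 2.0 * math.pi * (t_years / period)  # phase angle at time t
    return (scale * a * math.cos(angle), scale * a * math.sin(angle))

# "Time manipulation" amounts to stepping the clock and re-querying positions;
# "scaling" is the scale factor applied before placing bodies in the VR scene.
x, y = position_at("Earth", 0.25)  # a quarter of an orbit
```

In the real application a SPICE toolkit would replace `position_at`, returning full 3D ephemerides instead of this two-body approximation.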

PDFs:  Final Report; Poster

Acknowledgments: 

Project Advisor: Ming Tang

Research Funding Source: University of Cincinnati Space Research Institute for Discovery and Exploration 

External Celestial Data: This work has made use of data from the European Space Agency (ESA) mission Gaia (https://www.cosmos.esa.int/gaia), processed by the Gaia Data Processing and Analysis Consortium (DPAC, https://www.cosmos.esa.int/web/gaia/dpac/consortium). Funding for the DPAC has been provided by national institutions, in particular the institutions participating in the Gaia Multilateral Agreement.

Internal Solar System Data: Acton, C.H.; “Ancillary Data Services of NASA’s Navigation and Ancillary Information Facility;” Planetary and Space Science, Vol. 44, No. 1, pp. 65-70, 1996. DOI 10.1016/0032-0633(95)00107-7 

Charles Acton, Nathaniel Bachman, Boris Semenov, Edward Wright; A look toward the future in the handling of space science mission geometry; Planetary and Space Science (2017); DOI 10.1016/j.pss.2017.02.013 

References & Sources: 

1. NASA. (2018, December 14). NASA’s eyes: Eyes on exoplanets. NASA. Retrieved October 11, 2022, from https://eyes.nasa.gov/eyes-on-exoplanets.html


Demo at the 2023 Undergraduate Scholarly Showcase. 04.20.2023

EVRTalk on iHeart

Our EVRTalk project was featured on Simply Medicine | iHeart (time mark 9:55).

April 29, 2023. By Angenette Levy with 55KRC

  

Jai’La Nored (COA), Linda Dunseath (LWC), Ming Tang (UC, XR-Lab)

SIMPLY MEDICINE is a continuing program designed to simplify the health care system and provide medical information.

Thanks for the support from COA, the Live Well Collaborative, and the University of Cincinnati Urban Health Pathway grant.

Find more information on the EVRTalk program.

Thanks to Suzanne Burke, Ken Wilson, Jai’La Nored, Anna Goubeaux, and many others from COA. Thanks to the Live Well EVRTalk development team (Faculty: Ming Tang, Matt Anthony; advisor: Craig Vogel, Linda Dunseath; Students and Live Well fellows: Tosha Bapat, Karly Camerer, Jay Heyne, Harper Lamb, Jordan Owens, Ruby Qji, Alejandro Robledo, Matthew Spoleti, Lauren Southwood, Ryan Tinney, Keeton Yost, Dongrui Zhu.)

 

 

 

XR-Lab student work exhibition

Design in the Age of Metaverse and Extended Reality

End-of-semester student work exhibition @ UC Digital Future Building, April 25th, 2023

Faculty: Ming Tang. Course: ARCH 7036-04 / ARCH 5051-04, Elective Theory Seminar, DAAP

Students: Alkadam Almoiad, Daiani Shima, He Jinfan, Jung Chanhyo, Junglas Hannah, Sadki Rashid, Wahl Ethan, Yeganeh Sam, Gu Ming, Liming Brett, Schadler Simon

Projects:

XR-Lab will exhibit student projects of XR visualization designed by industrial design, architecture, computer science, and engineering students.

Sisyphus

BlackholeVR

Solar System

DAAP 2032

Contextual Reminder

Metahuman

Hemplime + Free-Form Hybrids

Future of Farming

Magic Shopping

Apartment Setup

Heterotopia

The New Morabaa
 

paper at ASEBP

Cory P. Haberman, Ming Tang, JC Barnes, Clay Driscoll, Bradley J. O’Guinn, Calvin Proffit. Using Virtual Reality Simulations to Study Initial Burglary Investigations. American Society of Evidence-Based Policing 2023 Conference. Las Vegas, Nevada. 2023. (accepted)

Thanks for the support from the Cincinnati Police Department and the University of Cincinnati Research Grant.

Using Virtual Reality Simulations to Study Initial Burglary Investigations

Cory P. Haberman, Ming Tang, JC Barnes, Clay Driscoll, Bradley J. O’Guinn, Calvin Proffit, University of Cincinnati

In this presentation, we discuss using virtual reality to study police investigations. First, we present the results of an experiment assessing the impact of providing investigative checklists to patrol officers responding to a burglary call for service in a large midwestern police agency. Second, we discuss the lessons learned from developing virtual reality simulations with limited budgets and student-based development teams. Third, we discuss the lessons learned from using virtual reality as a data collection technique for policing research.

More information is available at VR for Police Training.

 

paper at ASCE ICTD conference

Raman, M., Tang, M. 3D Visualization Development of Urban Environments for Simulated Driving Training and VR Development in Transportation Systems. ASCE ICTD 2023 Conference. Austin, TX. June 2023.

 

This work is based on a project to develop a physics-based, 3D digital visual environment that replicates actual field conditions along more than seventy miles of Ohio highways and city roads, for use in a driving simulator for the Ohio Department of Transportation. Transportation engineering design traditionally involves 3D design in a 2D workspace to create the built environment in the context of a natural environment; this project instead required replicating existing natural and built environments in a 3D digital space, presenting a unique challenge: developing a new, repeatable process to create a specific digital end product.

Using industry-specific software comprising InfraWorks (urban infrastructure design), Civil 3D (terrain modeling), Rhino (3D product modeling), 3ds Max (rendering/animation), Maya (3D animation/simulation), and Python (scripting), each traditionally dedicated to its own field, the team developed a process that integrates these tools beyond their intended purposes, connecting industry-specific functionalities to deliver a novel product that can now be utilized by multiple markets.

This process uses the functionality of each software package to resolve one portion of the puzzle, then delivers the result as input for the next step of development in another package. Following an iterative development cycle, the process bridges the gaps between the industries of transportation engineering, visualization, architecture, and gaming to deliver the end product.

The resulting 3D digital model of the existing urban environment can now be readily used as a baseline product for any industry that would benefit from such a digital model. In transportation engineering, it can be used in Transportation Systems Planning, Surface Operations, and/or Workforce Development. In outside/connected markets, it can be used in UI-based development, interactive game-based multiplayer virtual meetings, and photo-realistic immersive models for use in VR/multiplayer exploratory environments. This process has been standardized for the digital development of existing site conditions and context for the architectural conceptualization of buildings and public spaces in the Architecture program at the University of Cincinnati. The same process has been carried into the next development phase for the Ohio Department of Transportation.
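The stage-by-stage handoff described above can be sketched as a simple pipeline where each stage consumes the previous stage's artifact and emits the next. The stage names, artifact strings, and `run_pipeline` function are hypothetical stand-ins for the actual software chain (Civil 3D, InfraWorks, Rhino, 3ds Max/Maya), not the project's tooling:

```python
def run_pipeline(stages, source):
    """Run named stages in order, threading each artifact into the next stage."""
    artifact = source
    log = []
    for name, stage in stages:
        artifact = stage(artifact)          # each stage's output feeds the next
        log.append(f"{name} -> {artifact}")
    return artifact, log

# Hypothetical stages mirroring the software chain in the text.
stages = [
    ("terrain_model", lambda src: f"{src}.terrain"),  # e.g. Civil 3D export
    ("infra_layout",  lambda src: f"{src}.infra"),    # e.g. InfraWorks
    ("mesh_cleanup",  lambda src: f"{src}.mesh"),     # e.g. Rhino
    ("render_assets", lambda src: f"{src}.fbx"),      # e.g. 3ds Max / Maya
]
final, log = run_pipeline(stages, "survey_data")
# final == "survey_data.terrain.infra.mesh.fbx"
```

The iterative development cycle in the text corresponds to re-running the pipeline from whichever stage's inputs changed, rather than rebuilding the entire model.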

 

Project link:

Training Simulation for snowplow