Honors Seminar Gallery Show

Invitation to the “Human-Computer Interaction in the Age of Extended Reality & Metaverse” student project showcase.

The Extended Reality Lab at UC Digital Futures is delighted to invite you to the forthcoming gallery show, “Human-Computer Interaction in the Age of Extended Reality & Metaverse,” a showcase presented by the UC honors students of ARCH3051. This exhibition delves into the burgeoning field of extended reality (XR) and its confluence with human-computer interaction (HCI), embodying a fusion of scholarly inquiry and innovative practice.

  • Date: Tuesday, April 23
  • Time: 1:00 PM to 3:00 PM
  • Location: UC Digital Futures Building, 2nd Floor Lobby, 3044 Reading Road, Cincinnati, OH 45206

Under the guidance of Ming Tang, Director of the XR-Lab at UC Digital Futures and DAAP, this honors seminar has propelled students through an immersive journey into the realm of XR. The course encompasses Extended Reality, Metaverse, and Digital Twin technologies, providing a comprehensive platform for theoretical exploration and practical application in XR development.

The exhibition showcases an array of student-led research projects that investigate the role of XR in various domains, including medical training, flight simulation, entertainment, tourism, cultural awareness, fitness, and music. Through these projects, students have had the opportunity not only to grasp the intricate theories underpinning future HCI developments but also to apply their skills in creating immersive experiences that hint at the future of human-technology interaction.

We cordially invite you to immerse yourself in the innovative world of extended reality and explore the visionary work of our students, who are at the forefront of shaping the future of HCI. Join us for an afternoon of discovery and inspiration at the intersection of technology, creativity, and human experience.

We look forward to welcoming you to an engaging exploration of how extended reality redefines the landscape of human-computer interaction.


Ming Tang, Professor, Director of XR-Lab, DAAP, University of Cincinnati

Students: Nishanth Chidambaram, Bao Huynh, Caroline McCarthy, Cameron Moreland, Frank Mularcik, Cooper Pflaum, Triet Pham, Brooke Stephenson, Pranav Venkataraman

Thanks to the UC Honors Program for its support.

Digital Twin of Cincinnati

A real-time flythrough demo of the digital twin of the City of Cincinnati.

Digital Futures Building at the University of Cincinnati

Destroy alien buildings near the UC campus. Project developed by students Cooper Pflaum and Nishanth Chidambaram.

Fluid Sim in VR

Fluid simulation in virtual reality, built in Unreal Engine, with hand-collision testing.

Digital Twin, LLM & IIoT

IIoT for legacy and intelligent factory machines with AR and LLM feedback, plus a digital twin demonstration of real-time IoT for architecture/building applications using Omniverse.

  • PI: Prof. Sam Anand (Director of Smart-Manufacturing Lab, Dept. of Mechanical Engineering, CEAS)
  • Co-PI: Prof. Ming Tang (Director of XR-Lab, School of Architecture & Interior Design, DAAP)

$40,000. UC Industry 4.0/5.0 Institute Consortium Research Project: 01.2024-01.2025

Primary Objective: To develop a conversational large language model system that acquires data from legacy and digital machines, along with environmental, real-time, and historical data, within an IIoT environment to create a digital twin that assists with real-time maintenance and support (application use case: the Digital Futures Building).
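As a rough illustration of the concept (not the project's actual pipeline), the sketch below shows one way a conversational assistant could ground a maintenance question in a live IIoT reading. The broker address, topic name, and telemetry fields are hypothetical placeholders, and the call to an LLM endpoint is left out.

# Minimal Python sketch (assumption): pull one telemetry message from an IIoT
# broker over MQTT and fold it into a prompt for a maintenance-assistant LLM.
# Broker, topic, and payload fields are illustrative, not the project's own.
import json
import paho.mqtt.subscribe as subscribe

def latest_reading(topic: str, broker: str = "broker.local") -> dict:
    # Blocks until a single message arrives on the given sensor topic.
    msg = subscribe.simple(topic, hostname=broker)
    return json.loads(msg.payload)

reading = latest_reading("factory/press01/telemetry")
prompt = (
    "You are a maintenance assistant for the building's digital twin.\n"
    f"Latest telemetry for press01: {json.dumps(reading)}\n"
    "Question: Does this machine need attention, and why?"
)
# The prompt would then be sent to whichever LLM endpoint the team selects.
print(prompt)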

Students: Sourabh Deshpande, Anuj Gautam, Manish Raj Aryal, Mikhail Nikolaenko, Aayush Kumar, Eian Bennett


Paper: SpaceXR in HCI 2024

Our paper “SpaceXR: Virtual Reality and Data Mining for Astronomical Visualization” has been accepted at the 26th HCI International Conference, Washington, DC, USA, 29 June to 4 July 2024.
Authors: Mikhail Nikolaenko, Ming Tang

Abstract

This paper presents the “SpaceXR” project, which integrates data science, astronomy, and Virtual Reality (VR) technology to deliver an immersive and interactive educational tool. It is designed to cater to a diverse audience, including students, academics, space enthusiasts, and professionals, offering an easily accessible platform through VR headsets. This VR application offers a data-driven representation of celestial bodies, including the planets and the sun within our solar system, guided by data from the NASA and Gaia databases. The VR application empowers users with interactive capabilities encompassing scaling, time manipulation, and object highlighting. The potential applications span from elementary educational contexts, such as teaching the star system in astronomy courses, to advanced astronomical research scenarios, like analyzing spectral data of celestial objects identified by Gaia and NASA. By adhering to emerging software development practices and employing a variety of conceptual frameworks, this project yields a fully immersive, precise, and user-friendly 3D VR application that relies on a real, publicly available database to map celestial objects.
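For readers curious how such data-driven mapping might look in practice, here is a minimal, hypothetical sketch (not the authors' code) that queries the public Gaia DR3 archive with astroquery and converts parallax-based positions into Cartesian coordinates that a VR scene could use to place objects; the query, limits, and scaling are illustrative assumptions.

# Minimal Python sketch (assumption): fetch nearby bright stars from the public
# Gaia DR3 archive and convert (ra, dec, parallax) into Cartesian positions in
# parsecs, suitable for placing objects in a 3D/VR scene.
import numpy as np
from astroquery.gaia import Gaia

query = """
SELECT TOP 100 source_id, ra, dec, parallax, phot_g_mean_mag
FROM gaiadr3.gaia_source
WHERE parallax > 50
ORDER BY phot_g_mean_mag ASC
"""
stars = Gaia.launch_job(query).get_results()

ra = np.radians(np.asarray(stars["ra"]))
dec = np.radians(np.asarray(stars["dec"]))
dist_pc = 1000.0 / np.asarray(stars["parallax"])   # Gaia parallax is in mas

x = dist_pc * np.cos(dec) * np.cos(ra)
y = dist_pc * np.cos(dec) * np.sin(ra)
z = dist_pc * np.sin(dec)

for sid, sx, sy, sz in zip(stars["source_id"], x, y, z):
    print(int(sid), round(float(sx), 2), round(float(sy), 2), round(float(sz), 2))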

See more project details on Solar Systems in VR.