
XR and Gen-AI Technologies in Design

Left: VR welding training, by Samantha Frickel. Right: Cinematic universes, by Carson Edwards.

Extended Reality and Generative-AI in Human-Centered Design

UHP + Architecture Seminar

Student work from the University of Cincinnati's Honors Seminar and Architecture Design Seminar. This video showcases innovative projects that intersect emerging technologies such as AI-generated content (AIGC) and XR with human-centered design. The projects include a wide range of demonstrations in the following two categories:

Training
The first category centers on virtual reality training applications designed to simulate real-world tasks and enhance learning through immersive experiences. These projects include simulations for welding, firefighter robotics, and driving, as well as instructional environments such as baby car seat installation. Each scenario provides a controlled, repeatable setting for learners to build confidence and skills in safety-critical and technical domains, demonstrating the practical potential of XR technologies in professional training and education. Digital 3D content creation was augmented with AIGC tools such as Rodin, Meshy, and Tripo.
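Generative tools such as Rodin, Meshy, and Tripo typically export raw meshes that need cleanup before they perform well in a headset. As a rough illustration (not the seminar's actual pipeline), the sketch below uses the open-source trimesh library for a pre-flight check; the triangle budget, file names, and normalization step are assumptions for the example.

```python
import trimesh

# Hypothetical pre-flight check for an AI-generated mesh before VR use.
# The budget and file names are illustrative, not the course pipeline.
TRI_BUDGET = 50_000  # assumed per-asset triangle budget for mobile XR

mesh = trimesh.load("aigc_asset.glb", force="mesh")

# Repair common AIGC artifacts: holes and inconsistent face winding.
if not mesh.is_watertight:
    mesh.fill_holes()
trimesh.repair.fix_normals(mesh)
mesh.remove_unreferenced_vertices()

# Flag assets that exceed the triangle budget for manual retopology.
if len(mesh.faces) > TRI_BUDGET:
    print(f"retopo needed: {len(mesh.faces)} tris > {TRI_BUDGET}")

# Normalize to roughly 1 m on the longest side so in-engine scale is predictable.
mesh.apply_scale(1.0 / mesh.bounding_box.extents.max())
mesh.export("aigc_asset_clean.glb")
```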

Future Environments
This group of projects explores imaginative and speculative environments through immersive technologies. Students and researchers have developed experiences ranging from fictional music spaces, virtual zoos, and animal shelters to emotionally responsive architectural designs and future cityscapes. These environments often incorporate interactive elements, such as augmented reality on mobile devices and real-time simulations of natural phenomena like flooding. Advanced material simulation is also a focus, including cloth and other soft fabrics that respond dynamically to user interaction (sketched below). 2D content creation was augmented with AIGC tools such as Midjourney and Stable Diffusion.
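To make the cloth behavior concrete, here is a minimal, engine-agnostic mass-spring cloth in Python with NumPy, using Verlet integration and a moving sphere standing in for user interaction. The grid size, constants, and collision handling are illustrative assumptions; the student projects were built in real-time engines rather than a script like this.

```python
import numpy as np

# Minimal mass-spring cloth with Verlet integration; a moving sphere
# stands in for user interaction. All constants are illustrative.
W, H = 16, 16                        # particle grid resolution
REST = 0.1                           # spring rest length (m)
DT = 1.0 / 60.0                      # time step (s)
GRAVITY = np.array([0.0, -9.8, 0.0])
ITERS = 15                           # constraint-relaxation passes per frame

pos = np.zeros((H, W, 3))            # cloth hangs in the XY plane
pos[..., 0] = np.arange(W) * REST
pos[..., 1] = -np.arange(H)[:, None] * REST
prev = pos.copy()
pinned = np.zeros((H, W), dtype=bool)
pinned[0, :] = True                  # pin the top row
pin_target = pos[pinned].copy()

def step(sphere_center, sphere_radius=0.25):
    """Advance one frame, then push particles out of the sphere."""
    global pos, prev
    new = 2.0 * pos - prev + GRAVITY * DT * DT   # Verlet position update
    prev, pos = pos, new
    for _ in range(ITERS):
        for axis in (0, 1):          # vertical, then horizontal springs
            a = pos if axis == 0 else np.swapaxes(pos, 0, 1)  # view into pos
            d = a[1:] - a[:-1]
            length = np.linalg.norm(d, axis=-1, keepdims=True)
            corr = 0.5 * (1.0 - REST / np.maximum(length, 1e-9)) * d
            a[:-1] += corr           # move both endpoints toward rest length
            a[1:] -= corr
        pos[pinned] = pin_target     # keep the pinned row fixed
        d = pos - sphere_center      # simple collision: project out of sphere
        dist = np.linalg.norm(d, axis=-1, keepdims=True)
        inside = (dist < sphere_radius)[..., 0]
        pos[inside] = sphere_center + (
            d[inside] / np.maximum(dist[inside], 1e-9) * sphere_radius)

for frame in range(120):             # sweep the sphere through the cloth
    step(np.array([-0.5 + frame / 60.0, -0.8, 0.0]))
print("bottom-center particle:", pos[H - 1, W // 2])
```

Real-time engines implement the same idea with more spring types and GPU solvers; the relaxation-iteration count here trades apparent stiffness against per-frame cost.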


SMAT: Scalable Multi-Agent AI for DT

SMAT: Scalable Multi-Agent Machine Learning and Collaborative AI for Digital Twin Platform of Infrastructure and Facility Operations.

Principal Investigators:

  • Prof. Sam Anand, Department of Mechanical Engineering, CEAS
  • Prof. Ming Tang, Extended Reality Lab, Digital Futures, DAAP

Students: Anuj Gautam, Manish Aryal, Aayush Kumar, Ahmad Alrefai, Rohit Ramesh, Mikhail Nikolaenko, Bozhi Peng

Grant: $40,000. UC Industry 4.0/5.0 Institute Consortium Research Project, 03.2025–01.2026


AI and Emerging Technology Symposium

Ming Tang and Mikhail Nikolaenko presented “AI-Powered Digital Humans for Enhanced Interaction in Extended Reality” at the AI and Emerging Technology Symposium, University of Cincinnati.

The day-long event explored topics around AI and robotic process automation; smart campus innovation; and extended reality, virtual reality, and augmented reality. More on UC News.

AI-Powered Talking Avatars for Enhanced Interaction in Extended Reality

Presenters: Ming Tang, Mikhail Nikolaenko. Feb. 20, 2025, Tangeman University Center.


VR Training on Issues of Youth Firearm Possession

Virtual Reality Training on Issues of Youth Firearm Possession.

PI: Tang. 8/5/2024–8/4/2025.

Funded by the God.Restoring.Order (GRO) Community, this research project will develop two VR scenarios that simulate risky situations and educate youth on applying critical skills within them.

Team: Ming Tang (XR-Lab), Aaron Mallory (GRO).

XR-Lab students: Aayush Kumar, Mario Bermejo, Jonny Peng, Ahmad Alrefai, Rohit Ramesh, Charlotte Bodie 

The XR-Lab collaborated with the GRO community to apply extended reality (XR) technologies to a virtual reality (VR) training application that strengthens the GRO curriculum by reinforcing key competencies through immersive learning activities. Together, we evaluated the feasibility of integrating VR into the GRO training program, giving participants an engaging narrative framework while equipping them with practical knowledge applicable to real-world contexts. The immersive VR scenarios addressed high-risk situations, including firearm possession and substance use, creating a controlled environment for experiential learning and skill development.

The XR-Lab has harnessed advanced motion capture technology in this project to translate the movements of real people into lifelike digital characters. Every gesture, shift in posture, and subtle facial expression is carefully recorded and mapped to ensure authenticity and emotional depth in the virtual environment.
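Raw captured joint rotations are typically jittery and are filtered before they drive a character. As a hedged sketch of that idea (not the lab's actual tooling), the Python snippet below applies exponential smoothing to a per-joint quaternion stream via spherical interpolation; the joint, noise model, and smoothing factor are assumptions for the example.

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = float(np.dot(q0, q1))
    if dot < 0.0:                    # flip one quaternion to take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                 # nearly parallel: lerp, then renormalize
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def smooth_rotations(frames, alpha=0.3):
    """Exponentially smooth a captured quaternion stream for one joint.

    alpha near 0 = heavy smoothing (laggy); near 1 = raw capture (jittery).
    """
    smoothed = [np.asarray(frames[0], dtype=float)]
    for q in frames[1:]:
        smoothed.append(slerp(smoothed[-1], np.asarray(q, dtype=float), alpha))
    return smoothed

# Illustrative use: a slow rotation about Z for one joint, plus sensor noise.
rng = np.random.default_rng(0)
raw = []
for i in range(60):
    angle = 0.02 * i + rng.normal(0.0, 0.01)
    raw.append(np.array([np.cos(angle / 2), 0.0, 0.0, np.sin(angle / 2)]))
clean = smooth_rotations(raw)
print("last raw:", raw[-1], "last smoothed:", clean[-1])
```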

Our development team has worked closely and continuously with the GRO community, engaging in multiple motion studies, rehearsals, and testing sessions. This collaboration allows us to capture not just movement but the nuance behind each action: the personality conveyed through body language and the emotional context embedded in facial expression.

Through this process, the digital characters become more than avatars; they become authentic extensions of human experience, reflecting the stories of the people behind them. The result is an immersive, emotionally resonant experience where technology and humanity move together.