AI-Powered Digital Humans for Enhanced Interaction in Extended Reality
Please join us at the Digital Technology Solutions AI and Emerging Technology Symposium at the University of Cincinnati on Feb. 20, 2025, in Tangeman University Center.
High-fidelity digital human.
This presentation explores two AI-driven talking avatars developed at the UC Extended Reality Lab, which leverage large language models (LLMs) for realistic interaction in XR environments. The XRLab Bot acts as a virtual tour guide, providing real-time, spatially aware engagement and navigation through the lab, while the P&G Bot emulates a high-fidelity human likeness, delivering product expertise within a VR setting. These bots highlight advancements in AI, LLMs, and XR, showcasing potential applications in education, customer service, and smart campuses. The presentation will cover AI-driven navigation, multi-client architecture, and XR integration for immersive digital experiences.
Presenters
Ming Tang, Director of XR-Lab and Professor, DAAP, UC
Mikhail Nikolaenko, XR-Lab Fellow, UC
Team: Aayush Kumar, Ahmad Alrefai
P&G Bot: A high-fidelity avatar modeled on a real individual, rendered with lifelike facial animation and lip synchronization within a VR environment. The bot draws on a specialized database covering P&G's shampoo products and company history. Its development involved scanning a human model, rigging and animating the avatar, and integrating the LLM, all enhanced through XR-based visualization and UI. The result is a realistic, interactive bot that combines human likeness with expert knowledge.
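As a rough illustration of how a domain-specific bot like this can pair an LLM with a specialized database, the sketch below matches a user question against a small knowledge base and assembles the best-matching entry into an LLM prompt. This is a minimal, hypothetical sketch, not the lab's actual implementation: the knowledge-base entries, function names, and the simple word-overlap ranking are all illustrative placeholders.

```python
# Hypothetical sketch: retrieve relevant entries from a product knowledge
# base and build an LLM prompt from them. All data here is placeholder text.

KNOWLEDGE_BASE = [
    {"topic": "shampoo products", "text": "Placeholder entry on shampoo product lines."},
    {"topic": "company history", "text": "Placeholder entry on company milestones."},
    {"topic": "usage instructions", "text": "Placeholder entry on applying the product."},
]

def retrieve(question: str, top_k: int = 1) -> list[dict]:
    """Rank knowledge-base entries by word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda entry: len(q_words & set(entry["topic"].split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(question: str) -> str:
    """Combine retrieved context with the user question for the LLM."""
    context = "\n".join(e["text"] for e in retrieve(question))
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
```

In practice, production systems typically replace the word-overlap ranking with embedding-based similarity search, but the overall flow of retrieving context and prepending it to the prompt is the same.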
More information on Digital Human simulation at XR-Lab.