Posts

AI and Emerging Technology Symposium

Ming Tang and Mikhail Nikolaenko presented “AI-Powered Digital Humans for Enhanced Interaction in Extended Reality” at the AI and Emerging Technology Symposium, University of Cincinnati.

The day-long event explored topics including AI and robotic process automation; smart campus innovation; and extended reality, virtual reality, and augmented reality. More on UC News.

AI-Powered Talking Avatars for Enhanced Interaction in Extended Reality

Presenters: Ming Tang and Mikhail Nikolaenko. Feb. 20, 2025, Tangeman University Center.

This presentation explores two AI-driven talking avatars developed at the UC Extended Reality (XR) Lab, both leveraging large language models (LLMs) for realistic interaction in XR environments. The XRLab Bot acts as a virtual tour guide, providing real-time engagement and navigation through the lab with spatial awareness, while the P&G Bot emulates a high-fidelity human likeness, delivering product expertise within a VR setting. These bots highlight advances in AI, LLMs, and XR, with potential applications in education, customer service, and smart campuses.

The session will showcase AI-driven navigation and interaction, demonstrating the bot’s capabilities in transcribing speech to text with Whisper, retrieving responses from ChatGPT, and interpreting real-time visitor needs and spatial data to guide users through the XR Lab. It will explore the multi-client, real-time architecture, sharing insights on managing multiple Unreal and Python clients with a central server that coordinates bot actions, face tracking, and area-specific responses. The discussion will also highlight XR integration and smart campus applications, emphasizing the bot’s adaptability within XR platforms built on Unreal Engine and its potential for virtual and augmented reality campus tours, orientations, and educational experiences.

Additional topics include LLM-driven conversational AI, using advanced models to power sophisticated natural language interactions; high-fidelity 3D avatar creation, focusing on detailed, lifelike avatars capable of mimicking human expressions and movements; customizable AI chat avatars, enabling personalized avatars tailored to specific user preferences and needs; interactive avatars with facial animation and motion capture, showing how avatars can exhibit dynamic facial expressions and reactions during interactions; metaverse creation, showcasing immersive, interconnected virtual worlds where users interact through their avatars; and virtual reality (VR) and augmented reality (AR) environments and experiences, which blend digital content with the physical world or create entirely virtual spaces.
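As a rough illustration of the speech-to-response loop described above, the sketch below wires Whisper transcription to a ChatGPT query in Python. It is a minimal sketch assuming the hosted OpenAI API; the model names, file name, and tour-guide system prompt are illustrative stand-ins, not the XR Lab’s actual implementation.

```python
# Minimal sketch of the speech -> LLM -> guidance loop described above.
# Assumes the OpenAI Python SDK (pip install openai); the model names,
# audio file, and system prompt are hypothetical stand-ins.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def transcribe(audio_path: str) -> str:
    """Convert a visitor's recorded question to text with Whisper."""
    with open(audio_path, "rb") as audio_file:
        result = client.audio.transcriptions.create(
            model="whisper-1",
            file=audio_file,
        )
    return result.text


def answer(question: str, current_area: str) -> str:
    """Ask the LLM for a tour-guide reply, grounded in the visitor's location."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "system",
                "content": "You are a tour-guide avatar for an XR lab. "
                           f"The visitor is currently in: {current_area}.",
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    text = transcribe("visitor_question.wav")
    print(answer(text, current_area="motion-capture stage"))
```

In a full system along the lines described, a reply like this would presumably travel back through the central server to the Unreal client, which drives the avatar’s lip-sync, facial animation, and navigation toward the relevant area of the lab.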

High-fidelity digital human.


Workshop at IASDR Conference

Virtual Reality and Augmented Reality for Design Research


IASDR Conference

Nov. 2, 1:10 PM–3:00 PM

Room 6221, DAAP, University of Cincinnati

With the recent development of head-mounted displays (HMDs), both Virtual Reality (VR) and Augmented Reality (AR) are being reintroduced to the design industry as Mixed Reality (MR) instruments. It has never been easier to design, visualize, and interact with the immersive virtual world. This session will investigate the workflow for visualizing a design concept through AR and VR. We will use the virtual DAAP project as a vehicle to explore 3D simulation with mixed reality. Using the DAAP building at the University of Cincinnati as a wayfinding case study, the multi-phase approach starts with defining the immersive system, which captures participants’ movement within a digital environment as raw data in the cloud and then visualizes it with heat-maps and path networks. Combined with graphs, survey data is also used to compare various agents’ wayfinding behaviors in relation to gender, spatial recognition level, and spatial features such as light, sound, and architectural elements. The project also compares the mixed reality technique with space syntax and multi-agent systems as wayfinding modeling methods.
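To make the movement-capture and visualization step concrete, here is a minimal sketch of how logged participant positions might be aggregated into a wayfinding heat-map; the file name and column layout are hypothetical, not the study’s actual data format.

```python
# Minimal sketch: aggregate logged participant positions (x, y per sample)
# into a 2D occupancy heat-map of the floor plan. The file name and column
# layout are hypothetical; a real study log would also carry timestamps,
# participant IDs, and headset orientation.
import numpy as np
import matplotlib.pyplot as plt

positions = np.loadtxt("movement_log.csv", delimiter=",")  # columns: x, y

# Bin positions onto a grid covering the building footprint; denser bins
# give finer spatial resolution but noisier counts.
heatmap, xedges, yedges = np.histogram2d(
    positions[:, 0], positions[:, 1], bins=100
)

plt.imshow(
    heatmap.T, origin="lower", cmap="hot",
    extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]],
)
plt.colorbar(label="samples per cell")
plt.title("Participant occupancy heat-map")
plt.show()
```

A path network could be derived from the same log by connecting consecutive samples for each participant, which is one plausible way to produce the path visualizations mentioned above.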

This session combines seminar and project-demonstration formats and develops techniques for VR and AR technology as they influence the process of visualizing and shaping human-computer interaction. The session will discuss the connections among different immersive techniques for real-time visualization as a critical methodology in the design process. It will also examine the current technical, physiological, and cognitive constraints related to immersion and interaction in virtual reality and augmented reality.

Check the online schedule.