Entries by Ming Tang

Digital Human

The Digital Human project explores (1) high-fidelity digital human modeling and motion capture, including full-body and facial motion capture, and (2) Large Language Model integration with digital humans, powered by ChatGPT and the OpenAI API. We are also developing full-body motion capture through a VR tracking system and Unreal's MetaHuman.
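At a high level, the LLM integration routes a visitor's transcribed speech to a chat model and forwards the reply to the avatar's speech and facial-animation stage. A minimal sketch of that wiring is below; the class, the persona prompt, and the stubbed `generate_reply` function are illustrative assumptions, not the project's actual code (a real deployment would call the ChatGPT API here):

```python
# Hypothetical sketch: routing a visitor's transcript to an LLM and
# returning text for the digital human's TTS and lip-sync stage.

def generate_reply(messages):
    # Stand-in for a real ChatGPT API call; returns a canned reply
    # so the sketch runs without network access or API keys.
    return "Hello! Welcome to the XR-Lab."

class DigitalHumanChat:
    def __init__(self, persona):
        # A system prompt defines the avatar's persona and constraints.
        self.messages = [{"role": "system", "content": persona}]

    def say(self, user_text):
        # Keep the running conversation so the model has context.
        self.messages.append({"role": "user", "content": user_text})
        reply = generate_reply(self.messages)
        self.messages.append({"role": "assistant", "content": reply})
        return reply  # forwarded to TTS + facial animation

chat = DigitalHumanChat("You are a friendly campus guide avatar.")
print(chat.say("Hi, who are you?"))
```

Keeping the full message history in one list is what lets the avatar hold a multi-turn conversation rather than answering each utterance in isolation.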


CVG-HOLO – Wayfinding Hologram Project

XR-Lab is working with Cincinnati/Northern Kentucky International Airport (CVG), in collaboration with the UC Center for Simulations & Virtual Environments Research, to develop and demonstrate a wayfinding hologram, evaluate the hologram signage's performance in augmenting passengers' wayfinding experience, and develop concepts for the Concourse-B store renovation, integrating emerging digital technologies related to Extended […]

Wayfinding through VR

We use VR walkthroughs for wayfinding research. Players' routes and walking behavior, such as head movement, are captured and evaluated. Credit: restaurant designed by Eian Bennett. More information on wayfinding and egress in the simulated DAAP building can be found here.
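The captured data can be summarized with simple per-session metrics, for example total distance walked and accumulated head rotation. The sketch below illustrates the idea on made-up samples; the `(x, z, head_yaw)` tuple format and the function names are assumptions for illustration, not the lab's actual logging schema:

```python
import math

# Illustrative sketch: summarizing one captured VR walkthrough.
# Each sample is (x, z, head_yaw_degrees), logged once per frame.

def path_length(samples):
    # Total distance walked in the horizontal (x, z) plane.
    return sum(
        math.dist(samples[i][:2], samples[i + 1][:2])
        for i in range(len(samples) - 1)
    )

def total_head_rotation(samples):
    # Sum of absolute yaw changes: a rough proxy for how much
    # the player visually scanned the environment.
    return sum(
        abs(samples[i + 1][2] - samples[i][2])
        for i in range(len(samples) - 1)
    )

route = [(0, 0, 0), (3, 0, 15), (3, 4, -30)]
print(path_length(route))          # 3 + 4 = 7.0
print(total_head_rotation(route))  # 15 + 45 = 60
```

Comparing these metrics across participants or signage conditions is one straightforward way to quantify how a design change affects wayfinding behavior.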

Cloud-based Digital Twin

Clients are one click away from interacting with a Digital Twin model on their personal devices; no installation is required. The XR-Lab's project showcases a cloud-based Digital Twin (DT) model designed for accessibility and interaction via mobile devices. This DT model allows multiple users to engage with its complex features directly through touch screens, […]

Kao Metaverse

The University of Cincinnati, through its Digital Futures complex, will work collaboratively with the UC Center for Simulations & Virtual Environments Research, Lindner College of Business, UC DAAP XR-Lab, and Kao to develop concepts and a minimum viable product for a Jergens virtual tanning experience called the ‘Glowverse.’ KAO-STP-Glowverse VR retail project. GLOWVERSE VIRTUAL SPA […]