MIND VR
The Digital Human project explores (1) high-fidelity digital human modeling and motion capture, including full-body and facial motion capture, and (2) Large Language Model integration with the digital human, powered by ChatGPT and OpenAI.
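As a rough illustration of how such an LLM integration might be wired up (not the project's actual pipeline), the sketch below routes a visitor's question through the OpenAI Python SDK and returns a short reply that a digital human avatar could speak; the persona prompt, model name, and downstream TTS/lip-sync hook are assumptions made for illustration.

```python
# Hypothetical sketch: letting an LLM drive a digital human's dialogue.
# The persona prompt, model choice, and the downstream speech step are
# illustrative assumptions, not the XR-Lab's actual implementation.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PERSONA = (
    "You are a virtual guide rendered as a digital human. "
    "Answer visitors' questions in one or two short, friendly sentences."
)

def digital_human_reply(question: str) -> str:
    """Ask the LLM for a short reply the avatar can deliver."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": question},
        ],
        max_tokens=120,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    text = digital_human_reply("Where is the XR-Lab located?")
    print(text)  # in a full system this would feed a TTS / lip-sync stage
```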
CVG-HOLO – WAYFINDING HOLOGRAM PROJECT
XR-Lab is working with Cincinnati/Northern Kentucky International Airport (CVG), in collaboration with the UC Center for Simulations & Virtual Environments Research, to develop and demonstrate a wayfinding hologram, evaluate the hologram signage’s performance in augmenting passengers’ wayfinding experience, and develop concepts for the Concourse-B store renovation, integrating emerging digital technologies related to Extended […]
A VR walkthrough is used for wayfinding research: players’ routes and walking behavior, such as head movement, are captured and evaluated. Credit: restaurant designed by Eian Bennett. More information on wayfinding and egress at the simulated DAAP building can be found here.
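To give a sense of how captured walkthrough data might be evaluated, here is a minimal sketch that computes route length and average head-turn rate from a hypothetical log of head positions and yaw angles; the sample format and field names are assumptions, not the lab's actual capture schema.

```python
# Hypothetical sketch: evaluating a logged VR walkthrough.
# Each sample is assumed to be (time_s, x, y, z, head_yaw_deg);
# the format is illustrative, not the lab's capture schema.
import math

def route_length(samples):
    """Total horizontal distance walked, in meters."""
    total = 0.0
    for (_, x0, _, z0, _), (_, x1, _, z1, _) in zip(samples, samples[1:]):
        total += math.hypot(x1 - x0, z1 - z0)
    return total

def mean_head_turn_rate(samples):
    """Average absolute head-yaw change per second (deg/s)."""
    turns = 0.0
    duration = samples[-1][0] - samples[0][0]
    for (_, *_, yaw0), (_, *_, yaw1) in zip(samples, samples[1:]):
        # wrap the yaw difference into [-180, 180] degrees
        d = (yaw1 - yaw0 + 180.0) % 360.0 - 180.0
        turns += abs(d)
    return turns / duration if duration > 0 else 0.0

if __name__ == "__main__":
    log = [
        (0.0, 0.0, 1.7, 0.0, 10.0),
        (1.0, 0.8, 1.7, 0.2, 25.0),
        (2.0, 1.6, 1.7, 0.6, 5.0),
    ]
    print(f"route length: {route_length(log):.2f} m")
    print(f"head turn rate: {mean_head_turn_rate(log):.1f} deg/s")
```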
Clients are one click away from interacting with a Digital Twin model on their personal devices. No installation is required. The XR-Lab’s project showcases a cloud-based Digital Twin (DT) model, designed for accessibility and interaction via mobile devices. This advanced DT model allows multiple users to engage with its complex features directly through touch screens, […]
