Real-time Visualization, Virtual Reality & Augmented Reality
Explores interactive virtual reality (VR) and augmented reality (AR) systems and real-time rendering for architectural visualization, human-computer interaction, spatial behavior, and wayfinding studies.

Visual Impairment Simulation

In 2021, our research team at the Live Well Collaborative created a Visual Impairment Simulation VR prototype to simulate glaucoma vision and peripheral vision loss. Glaucoma comprises a group of disorders characterized by optic nerve damage and visual field loss. It is a significant cause of blindness in the United States and is the most common cause of blindness among Black Americans. An estimated 1 million Americans over 65 years of age have experienced the loss of vision associated with glaucoma, and approximately 75 percent of persons who are legally blind because of glaucoma are over the age of 65.[1]
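
As a rough illustration of how peripheral vision loss can be approximated in a game engine, the UE4 snippet below drives the vignette intensity on the player's camera. This is a minimal sketch for this page, not the prototype's actual implementation, which would more likely use a custom post-process material to match a clinical glaucoma field defect.

```cpp
// Minimal sketch: approximate peripheral vision loss (tunnel vision) by
// overriding the vignette intensity on the player's camera.
#include "Camera/CameraComponent.h"
#include "Math/UnrealMathUtility.h"

// Severity in [0, 1]: 0 = mild constriction, 1 = severe tunnel vision.
void ApplyPeripheralVisionLoss(UCameraComponent* Camera, float Severity)
{
    if (!Camera)
    {
        return;
    }

    Camera->PostProcessSettings.bOverride_VignetteIntensity = true;
    // Vignette darkens the frame edges; large values mimic tunnel vision.
    Camera->PostProcessSettings.VignetteIntensity =
        FMath::Lerp(0.5f, 2.5f, FMath::Clamp(Severity, 0.0f, 1.0f));
}
```

Tying the severity parameter to a UI slider makes it easy to compare mild constriction against severe tunnel vision within the same session.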

A virtual kitchen scenario was created to let users experience the challenges faced by a person with visual impairment in an immersive environment. Hand tracking on the Oculus Quest 2 was used to create interactions with virtual objects.
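
The hand-tracked interaction can be sketched in a similarly minimal way. The UE4 actor below attaches itself to a tracked hand's collider on overlap; the "HandSphere" tag and the physics toggling are assumptions made for this example, not the Live Well team's code.

```cpp
// Minimal sketch: a grabbable kitchen prop that snaps to a tracked hand when
// the hand's collider (assumed to be tagged "HandSphere") overlaps it.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "GrabbableProp.generated.h"

UCLASS()
class AGrabbableProp : public AActor
{
    GENERATED_BODY()

public:
    AGrabbableProp()
    {
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;
        Mesh->SetSimulatePhysics(true);
        Mesh->SetGenerateOverlapEvents(true);
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        Mesh->OnComponentBeginOverlap.AddDynamic(this, &AGrabbableProp::OnHandOverlap);
    }

    UFUNCTION()
    void OnHandOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                       UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                       bool bFromSweep, const FHitResult& SweepResult)
    {
        // "HandSphere" is an assumed tag on the hand-tracking collider.
        if (OtherComp && OtherComp->ComponentHasTag(TEXT("HandSphere")))
        {
            Mesh->SetSimulatePhysics(false);
            Mesh->AttachToComponent(OtherComp,
                FAttachmentTransformRules::SnapToTargetNotIncludingScale);
        }
    }

    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* Mesh;
};
```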

Team: Ming Tang, Ryan Tinney, Alejandro Robledo, Tosha Bapat, Linda Dunseath, Matt Anthony @ Live Well Collaborative

Screen recording of the VR prototype's glaucoma scenarios in a virtual kitchen, used to study cooking activities.

  

Hand Tracking in VR.

[1] Pizzarello LD. The dimensions of the problem of eye disease among the elderly. Ophthalmology. 1987;94:1191–5.

NSF: Future of Work

Ming Tang served as a co-investigator on this project, funded by an NSF grant.

Future of Work: Understanding the interrelationships between humans and technology to improve the quality of work-life in smart buildings.

Grant #SES-2026594. PI: David W. Wendell. Co-PIs: Anton Harfmann, Michael Fry, Claudia Rebola. Co-Is: Pravin Bhiwapurkar, Ann Black, Annulla Linders, Tamara Lorenz, Nabil Nassif, John Seibert, Ming Tang, Nicholas Williams, and Danny T.Y. Wu. 01/01/2021–12/31/2021. National Science Foundation, $149,720. Award level: Federal.

 

The primary goal of this planning project is to assemble a diverse, multidisciplinary team of experts dedicated to devising a robust methodology for the collection, analysis, and correlation of existing discipline-specific studies and data. The project focuses on buildings and their occupants, aiming to uncover previously undiscovered interactions. Our research specifically examines the interrelationships between four key areas: 1) the overall performance of buildings, 2) indoor and outdoor environmental conditions, 3) the physical health of the occupants, and 4) their satisfaction with the work environment. This comprehensive approach is designed to provide a holistic understanding of the dynamic between buildings and the well-being of the individuals within them.

 

Prof. Anton Harfmann developed the sensor towers.

 

Ming Tang spearheaded the development of a Digital Twin model, a project that integrates multiple historical sensor data sets into a comprehensive, interactive 3D model. The model encompasses several vital features: the capture, analysis, and visualization of historical data; cloud-based data distribution; seamless integration with Building Information Models (BIM); and an intuitive web user experience (UX). Building elements are extracted as metadata from the BIM model and then overlaid in screen-based and virtual reality (VR) interfaces, offering a multi-dimensional view of the data. For a more in-depth exploration of this work, see the Cloud-based Digital Twin project.
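
As a hedged sketch of the overlay pipeline (not the project's actual code), the UE4 snippet below fetches one historical reading from a cloud endpoint and drives a material parameter on the matching BIM element. The endpoint URL, the JSON fields `elementId` and `value`, and the `FindElementMesh()` helper are all hypothetical placeholders.

```cpp
// Hedged sketch: pull one sensor reading over HTTP and tint the matching BIM
// element. Requires the Http and Json modules in Build.cs. UE4.26+ signatures.
#include "HttpModule.h"
#include "Interfaces/IHttpRequest.h"
#include "Interfaces/IHttpResponse.h"
#include "Dom/JsonObject.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Components/StaticMeshComponent.h"

// Hypothetical lookup from a BIM element ID (stored as metadata) to the mesh
// component representing that element in the twin. Stubbed for this sketch.
UStaticMeshComponent* FindElementMesh(const FString& ElementId)
{
    return nullptr;
}

void FetchAndOverlaySensorData()
{
    TSharedRef<IHttpRequest, ESPMode::ThreadSafe> Request =
        FHttpModule::Get().CreateRequest();
    Request->SetVerb(TEXT("GET"));
    Request->SetURL(TEXT("https://example-twin-api/readings/latest")); // placeholder
    Request->OnProcessRequestComplete().BindLambda(
        [](FHttpRequestPtr, FHttpResponsePtr Response, bool bOk)
        {
            if (!bOk || !Response.IsValid())
            {
                return;
            }

            TSharedPtr<FJsonObject> Json;
            const TSharedRef<TJsonReader<TCHAR>> Reader =
                TJsonReaderFactory<>::Create(Response->GetContentAsString());
            if (!FJsonSerializer::Deserialize(Reader, Json) || !Json.IsValid())
            {
                return;
            }

            const FString ElementId = Json->GetStringField(TEXT("elementId"));
            const float Value = (float)Json->GetNumberField(TEXT("value"));

            if (UStaticMeshComponent* Mesh = FindElementMesh(ElementId))
            {
                // Drive a scalar material parameter (e.g., a heat-map tint)
                // from the sensor value.
                if (UMaterialInstanceDynamic* MID =
                        Mesh->CreateAndSetMaterialInstanceDynamic(0))
                {
                    MID->SetScalarParameterValue(TEXT("SensorValue"), Value);
                }
            }
        });
    Request->ProcessRequest();
}
```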

 

See more details on the Digital Twin workflow.

 

Exhibition at Expo4Seniors

Our VR app EvR Talk was presented at two Senior Health & Wellness Expos organized by Expo4Seniors:

  • Fairfield Community Arts Center, 411 Wessel Dr. Cincinnati, OH. 11.03.2021
  • Gray Road Church of Christ. Cincinnati, OH. 11.20.2021

Expo4Seniors provides senior citizens with access to services and products through education, collaboration, advocacy, and accessibility, in support of Aging in Place and Lifestyle, Health & Wellness.

Team member Karly Hasselfeld demonstrated to the audience how to interact with virtual characters through hand tracking. Photo by Karly Hasselfeld and Lauren Southwood.

Ming Tang led a design team at the Live Well Collaborative that developed this caregiver training for the Council on Aging (COA). More information on VR for caregiver training and the supporting UC Urban Health Pathway grant can be found here.

EvR Talk project (password protected). Please reach out to the COA for access permission.


Book Chapter: Cyber-Physical Experiences

Turan Akman and Ming Tang. “Cyber-Physical Experiences: Architecture as Interface.” In Virtual Aesthetics in Architecture: Designing in Mixed Realities. Routledge, 2021.

  

Virtual Aesthetics in Architecture: Designing in Mixed Realities presents a curated selection of projects and texts contributed by leading international architects and designers who use virtual reality technologies in their design process. It aims to trigger discussion and debate on exploring the aesthetic potential of virtual reality and establishing its language as an expressive medium in architectural design. Although virtual reality is not new and the technology has evolved rapidly, the aesthetic potential of the medium is still emerging, and there is a great deal more to explore.

 

Cyber-Physical Experiences: Architecture as Interface

Turan Akman [STG Design] and Ming Tang [University of Cincinnati]

Conventionally, architects have relied on the qualities of elements, such as materiality, light, solids, and voids, to break away from the static nature of space and enhance the way users experience and perceive architecture. Even though some of these elements and methods have helped create more dynamic spaces, architecture is still bound by the conventional constraints of the discipline. With the introduction of technologies such as augmented reality (AR), it is becoming easier to blend digital and physical realities and create new types of spatial qualities and experiences, especially when this is combined with virtual reality (VR) early in the design process. Although these emerging technologies cannot replace the primary and conventional qualitative elements in architecture, they can be used to supplement and enhance the experience and qualities architecture provides.

To explore how AR can enhance the way architecture is experienced and perceived, and how VR can be used to enhance the effects of these AR additions, the authors proposed a hybrid museum in which AR is integrated with conventional analog methods (e.g., materiality, light) to mediate spatial experiences. The authors also created a VR walkthrough and collected quantifiable data on the spatial effects of these AR additions to evaluate the proposed space.

More information is available in Chapter 9 | Cyber-Physical Experiences.

Project Stage

Project “Stage” is a stand-alone rendering application developed by Ming Tang in UE4, using runtime asset loading and HDRI backdrop methods. It is a Windows application that lets users quickly load an FBX file into a virtual environment for first-person and third-person walk-throughs.

This stand-alone program allows users to load external 3D models instantly at runtime. The Stage environment includes a UI for loading FBX models during runtime, several HDRI lighting domes, and interactive characters. Stage promotes iterative design and encourages designers to explore creative potential through real-time feedback. The Bus Stop and Burning Man projects were both developed using Stage as a pre-visualization tool.
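
A minimal sketch of this runtime-loading flow is shown below, assuming a native file dialog feeds the importer; `RuntimeFbxImport()` is a hypothetical stand-in for the FBX runtime import plugin's real entry point (see Tutorial 2).

```cpp
// Hedged sketch of Stage's runtime-load flow: pick an FBX with the native
// file dialog, then hand the path to a runtime importer.
#include "DesktopPlatformModule.h"
#include "IDesktopPlatform.h"
#include "Misc/Paths.h"
#include "Engine/World.h"

// Hypothetical stand-in for the plugin call that builds meshes from the file.
static void RuntimeFbxImport(UWorld* World, const FString& FbxPath)
{
    // Provided by the FBX runtime import plugin in the real application.
}

void LoadUserModel(UWorld* World)
{
    IDesktopPlatform* Platform = FDesktopPlatformModule::Get();
    if (!Platform)
    {
        return;
    }

    TArray<FString> PickedFiles;
    const bool bPicked = Platform->OpenFileDialog(
        nullptr,                      // parent window handle
        TEXT("Load FBX into Stage"),  // dialog title
        FPaths::ProjectDir(),         // starting directory
        TEXT(""),                     // default file name
        TEXT("FBX files|*.fbx"),      // type filter
        EFileDialogFlags::None,
        PickedFiles);

    if (bPicked && PickedFiles.Num() > 0)
    {
        RuntimeFbxImport(World, PickedFiles[0]);
    }
}
```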

The Stage allows students to:

  • Present their 3D model to reviewers without waiting for renderings. No packaging time is required; the design is ready instantly in a game environment.
  • Control an avatar to navigate the space and explore the form through first-person or third-person views.
  • Learn the importance of optimizing a model by loading design iterations into Stage and examining the frame rate and loading time.
  • Test UV mapping and the scene hierarchy.
  • Test low-poly collision objects and the navigation mesh.
  • Have fun. There is a hidden Easter egg to be discovered.

 

Download the Windows application “Stage” here (zip, 660 MB; password: “stage”).

Tutorial 1. How to use Stage

Export the model from Rhino or 3ds Max as FBX, and create collision objects with the “UCX_” prefix. Use standard materials, import into Stage, and then customize the materials. Note: you might need to clean up your mesh model in Rhino or optimize it in 3ds Max before exporting to FBX.
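
For context on the “UCX_” convention, here is an illustrative sketch of how an importer can separate collision meshes from visible geometry by name; the function and its string inputs are placeholders for this page, not the plugin's API.

```cpp
// Illustration only: sort imported node names into collision vs. render sets
// based on the "UCX_" prefix convention.
#include "CoreMinimal.h"

void PartitionImportedNodes(const TArray<FString>& NodeNames,
                            TArray<FString>& OutCollisionNodes,
                            TArray<FString>& OutRenderNodes)
{
    for (const FString& Name : NodeNames)
    {
        if (Name.StartsWith(TEXT("UCX_")))
        {
            OutCollisionNodes.Add(Name); // hidden, physics-only convex hulls
        }
        else
        {
            OutRenderNodes.Add(Name);    // visible render geometry
        }
    }
}
```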

 

Tutorial 2. How the application was built in Unreal

Covers the third-person character, the HDRI backdrop, and the FBX runtime import plugin.

 

Easter Egg

There is an Easter Egg in Stage; see if you can find it. Clue:

An invisible scroll, only the hero can see

Not in the fall, but in the green

On the plain surrounded by trees

Find the trail that leads ten feet underneath

To the fiery domain of Hades