Posts

seminar: Design in the Age of Metaverse and Extended Reality

ARCH 7036-004 / ARCH 5051. Elective Architecture Theory Seminar. Spring semester. DAAP, University of Cincinnati.
Design in the Age of Metaverse and Extended Reality

Instructor: Ming Tang, Director, Extended Reality Lab (XR-Lab); Associate Professor, SAID, DAAP, University of Cincinnati

 

This seminar course focuses on the intersection of architectural and interior design with immersive visualization technologies, including Virtual Reality, Augmented Reality, Digital Twin, social VR, and real-time simulation. The class will explore new spatial experiences in the virtual realm and analyze human perception through hand-tracking, body-tracking, haptic simulation, and various sensory inputs. Students will learn both the theoretical framework and hands-on skills in XR development. The course will give students exposure to Oculus Quest, Teslasuit, and HoloLens technologies, as well as wearable sensors. Students are encouraged to propose individual or group research on the subject of future design with XR.

Hardware: Oculus Quest and HoloLens headsets are provided by the course.

Student Research Project

Digital Twin

AR for community engagement. Price Hill

 

References:

Recommended podcast on Metaverse

Exhibition in Expo4Seniors

Our VR app EVRTalk was presented at two Senior Health & Wellness Expos organized by Expo4Seniors:

  • Fairfield Community Arts Center, 411 Wessel Dr. Cincinnati, OH. 11.03.2021
  • Gray Road Church of Christ. Cincinnati, OH. 11.20.2021

Expo4Seniors provides senior citizens with access to services and products through education, collaboration, advocacy, and accessibility, supporting aging in place as well as lifestyle, health, and wellness.

Team member Karly Hasselfeld demonstrated to the audience how to interact with virtual characters through hand tracking. Photo by Karly Hasselfeld and Lauren Southwood.

Ming Tang led a design team at the LiveWell Collaborative that developed this caregiver training for the Council on Aging. More information on VR for caregiver training and the UC Urban Health Pathway grant support can be found here.

EVRTalk project (password protected). Please reach out to COA for access permission.


Eye-Tracking for Drivers’ Visual Behavior

Impacts of Work Zone Traffic Signage Devices and Environment Complexity on Drivers’ Visual Behavior and Worker Safety.

Ph.D. student: Adekunle Adebisi. CEAS – Civil & Architectural Engineering & Construction Management

Undergraduate student: Nathan Deininger

Faculty: Ming Tang

The objective of this study is to investigate the safety of roadway workers under varying environmental and work zone conditions. To achieve the objectives, a driving simulator-based experiment is proposed to evaluate drivers’ visual attention under various work zone scenarios using eye-tracking technologies.

Grants:

  • Using Eye-Tracking to Study the Effectiveness of Visual Communication. UHP Discovery funding, University Honors Program, UC. $5,000. Faculty advisor. 2021.
  • Adekunle Adebisi (Ph.D. student at the College of Engineering and Applied Science) received a $3,200 Emerging Fellowship Award from the Academic Advisory Council for Signage Research and Education (AACSRE).

 

Project Stage

Project “Stage” is a stand-alone rendering engine developed by Ming Tang, using runtime asset loading and HDRI backdrop methods in UE4. It is a Windows application that lets users quickly load an FBX file into a virtual environment for first-person and third-person walk-throughs.

This stand-alone program allows users to load external 3D models instantly at runtime. The Stage environment includes a UI for loading FBX models during runtime, several HDRI lighting domes, and interactive characters. Stage promotes iterative design and encourages designers to explore creative potential through real-time feedback. The Bus Stop and Burning Man projects were both developed using Stage as a pre-viz tool.

Stage allows students to: (1) present their 3D model to reviewers without waiting for renderings — no packaging time is required, and the design is instantly ready in a game environment; (2) control an avatar to navigate the space and explore the form through first-person or third-person views; (3) learn the importance of optimizing a model by loading design iterations in Stage and examining the frame rate and loading time; (4) test the UV mapping and scene hierarchy; (5) test the low-poly collision objects and navigation mesh; and (6) have fun — there is a hidden Easter egg to be discovered.

 

Download the Windows application “Stage” here (zip, 660 MB).

(password: “stage”)

Tutorial 1. How to use Stage

Export your model from Rhino or 3ds Max as FBX, create collision objects with the “UCX_” prefix, and use standard materials. Import the file into Stage, then customize the materials. Note: you might need to clean up your mesh model in Rhino or optimize it in 3ds Max before exporting to FBX.
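The “UCX_” prefix follows Unreal’s FBX import convention: a convex collision mesh named `UCX_<name>` is paired with the render mesh `<name>` in the same file. As a minimal illustration (a hypothetical helper, not part of Stage or its export pipeline), a pre-export check could verify that every collision object actually has a matching render mesh:

```python
# Unreal's FBX importer pairs a convex collision mesh named "UCX_<name>"
# with the render mesh "<name>". This hypothetical helper flags collision
# objects in an export list whose render mesh is missing.

def collision_name(mesh_name: str) -> str:
    """Return the collision-mesh name Unreal expects for a render mesh."""
    return f"UCX_{mesh_name}"

def unmatched_collisions(object_names):
    """List UCX_ objects that have no matching render mesh in the export."""
    names = set(object_names)
    return [n for n in names
            if n.startswith("UCX_") and n[len("UCX_"):] not in names]

# Example: "UCX_Roof" has no "Roof" render mesh, so it is flagged.
print(unmatched_collisions(["Wall", "UCX_Wall", "UCX_Roof"]))  # → ['UCX_Roof']
```

Running a check like this before export avoids silent failures where a misnamed collision mesh is imported as ordinary geometry instead of a collider.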

 

Tutorial 2. How the application was built in Unreal

Covered topics: third-person character, HDRI backdrop, and the FBX runtime import plugin.

 

Easter Egg

There is an Easter Egg in Stage, see if you can find it. Clue:

An invisible scroll,  only the hero can see

Not in the fall, but in the green

on the plain surrounded by trees

find the trail that leads ten feet underneath

to the fiery domain of Hades

Virtual Reality for caregiver training

Assess the effectiveness of using Virtual Reality for caregiver training

Urban Health Pathway Seed Grant. PI: Ming Tang. Partners: Council on Aging, LiveWell Collaborative. $19,844. 03.2021–03.2022

Result: COA EVRTalk 

EVRTalk virtual reality caregiver training

 

This project aims to investigate the effectiveness of using Virtual Reality to build empathy for the care recipient by allowing the caregiver to experience day-to-day life from the care recipient’s perspective. Ming Tang leads a research team working with COA and the LiveWell Collaborative to develop and evaluate an expandable set of VR training modules designed to help train family and friends who are thrust into the caregiving role. Tang led the LWC team and designed the simulated decision trees, scenarios, and hand-tracking interactions in an immersive VR environment.

Team members: Ming Tang, Matt Anthony, Craig Vogel, Linda Dunseath, Alejandro Robledo, Tosha Bapat, Karly Camerer, Jay Heyne, Harper Lamb, Jordan Owens, Ruby Qji, Matthew Spoleti, Lauren Southwood, Ryan Tinney, Keeton Yost, Dongrui Zhu

 

COA was awarded $25,000 from the CTA Foundation Grant in 2021.

Featured in UC News. SharePoint.