Call for Enrollment: Metaverse Development

Design in the Age of Metaverse and Extended Reality

ARCH 7036-004 (51920) / ARCH 5051-004 (51921). Elective Architecture Theory Seminar. Spring semester. DAAP, UC.

Class time: Tuesdays and Thursdays, 12:30–1:50 pm

Location: hybrid with online; CGC Lab @ DAAP; XR-Lab @ Digital Futures Building.

Instructor: Ming Tang. Director, Extended Reality Lab (XR-Lab); Associate Professor, SAID, DAAP, University of Cincinnati.

The course is open to UC students of all majors, at both the undergraduate (ARCH 5051) and graduate (ARCH 7036) levels. Please contact Prof. Ming Tang if you have trouble registering for the course as a student from another major.

  • For SAID graduate students who have previously taken the VIZ-3 course, this advanced-level course will allow you to develop XR-based presentations for the thesis exhibition or propose your own XR-related topics.
  • For students with general design backgrounds (architecture, interior design, urban planning, landscape, industrial design, art, game, film, and digital media), this course will allow you to extend your creativity into XR and create immersive experiences.
  • For students without previous experience in design fields (engineering, arts & sciences, CCM, medicine, etc.), this course will be an excellent introduction to the realm of XR.

This seminar course focuses on the intersection of traditional design with immersive visualization technologies, including Virtual Reality, Augmented Reality, Digital Twins, and the Metaverse. The class will explore new spatial experiences in the virtual realm and analyze human perception, interaction, and behavior in the virtual world. Students will learn both the theoretical framework and hands-on skills of XR development. The course will give students exposure to XR technologies, supported by the new XR-Lab at the UC Digital Futures building. Students are encouraged to propose individual or group research on future design with XR.

Hardware: VR headsets and glasses are available at the XR-Lab at Digital Futures.

Software: Unreal Engine 5, available at the DAAP CGC Lab.

Course Goals and Outcomes

Students are encouraged to look at architecture, motion pictures, game design, 3D design, and other digital art media. As students develop their research, these fields become invaluable sources of inspiration and motivation. The instructor supports all attempts by students to experiment with techniques that interest and challenge them beyond what is taught in the classroom.

Students completing this course will be able to identify theories and research methods related to eXtended Reality (XR), including (1) the history and theory of XR (VR+AR+MR) and (2) emerging trends related to the Metaverse and its impact on design.

The course focuses on hands-on skills in XR development through Unreal Engine 5. Students will propose their own XR projects and develop them over the semester.

This course will be offered as a hybrid of online and in-person sessions at the CGC Lab and the Digital Futures building, depending on each project’s progress. Students will propose their projects and use 15 weeks to complete development.

Prerequisites: none. However, basic 3D modeling skills are preferred.

Weekly schedule:

  1. History and the workflow of metaverse development
  2. Project management on GitHub (inspiration from film due)
  3. Modeling (XR vignette proposal due)
  4. Material & lighting (project development starts)
  5. Blueprint, node-based programming
  6. Character and animation
  7. Interaction and UX; deployment to XR
  8. Project development time
  9. Project development time
  10. Project development time
  11. Project development time
  12. Project development time
  13. Project development time
  14. Project development time
  15. Project development time
  16. End-of-semester gallery show at DF building

Examples of student projects on XR development

Recommended podcast on the Metaverse

 

New XR-Lab website

XR-Lab’s new website is complete.

eXtended Reality Lab (XR-Lab) is a research lab affiliated with the College of DAAP and Digital Futures at the University of Cincinnati.  XR-Lab explores how people engage with simulated scenarios through Virtual Reality, Augmented Reality, and Mixed Reality and promotes communication and collaboration through XR for education, health, safety, and professional training in both academic and industry settings.

 


Technologies have radically altered and reconstructed the relationships between humans, computation, and the environment. As the diagram shows, we believe AI, digital twins, the Metaverse, and XR are essential for fusing design and computation to connect humans. These technologies provide promising opportunities to empower us to rebuild a better world, both physically and digitally.

We look forward to sharing our expertise and lab resources, incubating new ideas, and cultivating creativity and the digital culture of UC.

Cincinnati Public Radio interview

Our EVRTalk story goes live on Cincinnati Public Radio!

Ming Tang (UC) and Jai’La Nored, Anna Goubeaux, and Antoinette Moore (Council on Aging of Southwestern Ohio) were interviewed by Ann Thompson of WVXU, Cincinnati Public Radio.

“VR headsets put caregivers in the shoes of those they are assisting.” By Ann Thompson. WVXU, Cincinnati Public Radio. 01.02.2023.

Live on Cincinnati Public Radio on January 2nd at 6:44 am, 8:44 am, and 5:44 pm.

(from left) Jai’La Nored, Anna Goubeaux, UC’s Ming Tang and Antoinette Moore.

Focus On Technology
Mondays at 6:44 a.m. and 8:44 a.m. during Morning Edition and 5:44 p.m. during All Things Considered.

Thanks for the support from COA, the Live Well Collaborative, and the University of Cincinnati Urban Health Pathway grant.

More information is available on the EVRTalk program page.

Industry 4.0/5.0 grant

 

Immersive vs. Traditional Training: a comparison of training modalities

PIs: Tamara Lorenz, Ming Tang

  • Dr. Tamara Lorenz. Associate Professor. Embodied Interactive Systems Lab, Industry 4.0 & 5.0 Institute (I45I), Center for Cognition, Action, and Perception (CAP)
  • Ming Tang. Associate Professor. Extended Reality Lab, Industry 4.0 & 5.0 Institute (I45I), Institute for Research in Sensing (IRiS)

Consortium research project: evaluating the effectiveness of an immersive training protocol against different traditional training modalities.

Grant: $40,000, funded by the UC Industry 4.0/5.0 Institute, 01.2023–01.2024.

Thanks to the Institute’s Industrial Advisory Board (IAB) and industry patrons, including Siemens, Kinetic Vision, John Deere, Stress Engineering Services, Innovative Numberics, and Ethicon. 

At UC News

New UC institute looks ahead to ‘Industry 5.0’. UC will harness collective talent across campus to help companies solve new challenges. By Michael Miller. December 8, 2022.


Future Service, Retail, Metaverse, and Robotics

Design for Future Service, Metaverse, and Robotics

Thanks to all students from SOD and SAID who participated in the projects. We also want to thank Kroger and UC Digital Futures for their help.

Redesign retail and other service experiences (retail space, in-person service, digital service, technology) and rethink customer and business needs. Envision the future service experience of retail, hospitality, and delivery as 1) physical, 2) digital, and 3) a combination of both. Research questions include: What does the “future retail store” look like? How do online shopping and in-store pickup change the architecture of a store? How can the metaverse and immersive experiences be integrated with retail? Can public and recreational programs be introduced into stores to enhance customers’ experience? How can robots assist employees and businesses in providing better service to customers? When robots coexist with people, how can we make robots more understandable and usable?

Collaborative design work from the College of DAAP.

Students: SAID, SOD

Faculty: Ming Tang, Yong-Gyun Ghim, College of DAAP, UC

SAID course description

This four-credit workshop course combines seminar and project-development formats, and develops techniques for digital modeling as they influence the process of viewing, visualizing, and forming spatial and formal relationships. The course encourages operational connections among different techniques for inquiry and visualization as a critical methodology in the design process.

Students: Ryan Adams, Varun Bhimanpally, Ryan Carlson, Brianna Castner, Matthew Davis, John Duke, Prajakta Sanjay Gangapurkar, Jordan Gantz, Alissa Gonda, Justin Hamilton, Emma Hill, Anneke Hoskins, Philip Hummel, Jahnavi Joshi, Patrick Leesman, Tommy Lindenschmidt, Peter Loayza, Jacob Mackin, Jordan Major, Julio Martinez, Jacob McGowan, Simon Needham, Hilda Rivera, Juvita Sajan, Gavin Sharp, Hannah Webster, Megan Welch, Meghana Yelagandula, Isaiah Zuercher.

SOD Studio description

Mobile Robotics Studio envisions how robotic technology can better assist people in service environments, benefiting both customers and employees. Based on human-centered design and systems-thinking approaches, cross-disciplinary teams of industrial, communication, and fashion design students created design solutions for product-service systems of mobile robots in retail, air travel, sports, delivery, and emergency services.

Students: Noon Akathapon, Kai Bettermann, Cy Burkhart, Jian Cui, Joe Curtsinger, Emmy Gentile, Bradley Hickman, Quinn Matava, Colin McGrail, James McKenzie, Alex Mueller, John Pappalardo, Kurtis Rogers, Connor Rusnak, Jimmy Tran, Franklin Vallant, Leo von Boetticher

 

Eye Tracking Technology

1. Wearable eye-tracking technology


This wearable ET device includes various components, such as illuminators, cameras, and a data collection and processing unit for image detection, a 3D eye model, and gaze-mapping algorithms. Compared to a screen-based ET device, the most significant differences of the wearable device are its binocular coverage and the field of view (FOV) and head tilt that affect a glasses-configured eye tracker. It also avoids potential experimental bias resulting from the display size or pixel dimensions of a screen. As with screen-based ET, the images captured by the wearable ET cameras are used to identify the glints on the cornea and the pupil. This information, together with a 3D eye model, is then used to estimate the gaze vector and gaze point for each participant.
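
As a concrete illustration of that last step, here is a minimal sketch (a simplification, not the lab’s actual pipeline): it estimates a binocular gaze point as the midpoint of the closest approach between the left and right gaze rays, each defined by an eye-center origin and a unit gaze vector from the 3D eye model. The inputs and names are illustrative.

```python
# Hedged sketch: binocular gaze-point estimation from two gaze rays.
# A real tracker supplies calibrated eye-center origins and gaze vectors.
import numpy as np

def gaze_point(o_l, d_l, o_r, d_r):
    """Midpoint of the closest approach between left and right gaze rays."""
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w = o_l - o_r                        # vector between the eye centers
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b                # ~0 when the rays are parallel
    if abs(denom) < 1e-9:
        t, s = 0.0, e / c                # fallback for near-parallel rays
    else:
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
    p_l = o_l + t * d_l                  # closest point on the left ray
    p_r = o_r + s * d_r                  # closest point on the right ray
    return (p_l + p_r) / 2.0

# Example: two rays converging roughly one meter ahead -> (0, 0, 1).
print(gaze_point(np.array([-0.03, 0.0, 0.0]), np.array([0.03, 0.0, 1.0]),
                 np.array([0.03, 0.0, 0.0]), np.array([-0.03, 0.0, 1.0])))
```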

After the standard ET calibration and verification procedure, participants were instructed to walk in a defined space while wearing the glasses. The time of interest (TOI) was set at 60 seconds, recording defined start and end events along with the visual occurrences over that period. Data was collected for both pre-conscious viewing (the first three seconds) and conscious viewing (after three seconds).
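
A minimal sketch of that TOI windowing, assuming fixation records are (timestamp, x, y) tuples with timestamps in seconds relative to the TOI start (the record layout is an assumption, not the tracker’s export format):

```python
# Hedged sketch: split fixations in a 60 s TOI into pre-conscious (< 3 s)
# and conscious (>= 3 s) viewing windows.
TOI_LENGTH = 60.0   # seconds, matching the TOI described above
SPLIT = 3.0         # pre-conscious / conscious boundary

def split_fixations(fixations):
    """fixations: iterable of (timestamp_s, x, y) relative to TOI start."""
    in_toi = [f for f in fixations if 0.0 <= f[0] < TOI_LENGTH]
    pre_conscious = [f for f in in_toi if f[0] < SPLIT]
    conscious = [f for f in in_toi if f[0] >= SPLIT]
    return pre_conscious, conscious
```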


We also performed screen-based eye tracking and compared the results.

2. Screen-based eye-tracking technology

This method was beneficial for informing reviewers of how an existing place or a proposed design performs in terms of user experience. Moreover, while the fundamental visual elements that attract human attention and trigger conscious viewing are well established and sometimes incorporated into signage design and placement, signs face an additional challenge because they must compete for viewers’ visual attention with the visual elements of the surrounding built and natural environments. As such, tools and methods are needed that can assist “contextually sensitive” design and placement by assessing how signs in situ capture the attention of their intended viewers.

3. VR-based eye-tracking technology

Eye-tracking technology enables new forms of interaction in VR, with benefits for hardware manufacturers, software developers, end users, and research professionals.
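
As one hedged illustration of such an interaction, the sketch below implements gaze-based object selection: cast a ray from the eye position along the tracked gaze direction and pick the nearest object whose bounding sphere the ray hits. The object layout is hypothetical and not tied to any particular engine’s API.

```python
# Hedged sketch: gaze-ray picking against bounding spheres.
import numpy as np

def pick_by_gaze(origin, direction, objects):
    """objects: list of dicts with 'center' (3D array) and 'radius' (float)."""
    direction = direction / np.linalg.norm(direction)
    best, best_t = None, np.inf
    for obj in objects:
        oc = obj["center"] - origin
        t = oc @ direction               # distance along the gaze ray
        if t < 0:
            continue                     # object is behind the viewer
        closest = origin + t * direction
        if np.linalg.norm(obj["center"] - closest) <= obj["radius"] and t < best_t:
            best, best_t = obj, t
    return best                          # nearest gazed-at object, or None
```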

Papers:

Tang, M. “Analysis of Signage Using Eye-Tracking Technology.” Interdisciplinary Journal of Signage and Wayfinding. 02.2020.

Tang, M. and Auffrey, C. “Advanced Digital Tools for Updating Overcrowded Rail Stations: Using Eye Tracking, Virtual Reality, and Crowd Simulation to Support Design Decision-Making.” Urban Rail Transit, December 19, 2018.

Magic School Bus project

An AR & VR project for medical models: an animated heart for the Magic School Bus project at the University of Cincinnati.

Eye-tracking analysis for train station design

Some tests were done with a Tobii eye tracker for train station designs in Beijing. All heatmaps are here, and all gaze clusters are here. Fifteen train stations were designed by students and presented through the Unreal game engine. More information on the train station designs is available on the studio website.
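
For readers curious how such heatmaps are typically produced, here is a minimal sketch (not Tobii’s own export pipeline): gaze samples are accumulated over a screenshot and smoothed with a Gaussian kernel.

```python
# Hedged sketch: Gaussian-blurred gaze heatmap over a screenshot.
import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_heatmap(points, width, height, sigma=30.0):
    """points: iterable of (x, y) gaze coordinates in screen pixels."""
    grid = np.zeros((height, width), dtype=float)
    for x, y in points:
        if 0 <= int(x) < width and 0 <= int(y) < height:
            grid[int(y), int(x)] += 1.0            # count samples per pixel
    heat = gaussian_filter(grid, sigma=sigma)      # smooth into a heatmap
    return heat / heat.max() if heat.max() > 0 else heat
```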


HoloLens Installations

Two installations using Microsoft HoloLens.

Bubble

MOMO