new website of XR-Lab

XR-Lab’s new website is now live.

The eXtended Reality Lab (XR-Lab) is a research lab affiliated with the College of DAAP and Digital Futures at the University of Cincinnati. XR-Lab explores how people engage with simulated scenarios through Virtual Reality, Augmented Reality, and Mixed Reality, and promotes communication and collaboration through XR for education, health, safety, and professional training in both academic and industry settings.

Technologies have radically altered and reconstructed the relationships between humans, computation, and the environment. As the diagram shows, we believe AI, digital twins, the metaverse, and XR are essential for fusing design and computation to connect humans. These technologies offer promising opportunities to rebuild a better world, both physically and digitally.

We look forward to sharing our expertise and lab resources, incubating new ideas, and cultivating creativity and the digital culture of UC.

Future Service, Retail, Metaverse, and Robotics

Design for Future Service, Metaverse, and Robotics

Thanks to all the students from SOD and SAID who participated in these projects. We also thank Kroger and UC Digital Futures for their support.

Redesign retail and other service experiences (retail space, in-person service, digital service, technology) and rethink customer and business needs. Envision the future service experience of retail, hospitality, and delivery in three forms: 1) physical, 2) digital, and 3) a combination of both. Research questions include:

  • What does the “future retail store” look like?
  • How do online shopping and in-store pickup change the architecture of a store?
  • How can the metaverse and immersive experiences be integrated with retail?
  • Can public and recreational programs be introduced into stores to enhance the customer experience?
  • How can robots assist employees and businesses in providing better service to customers?
  • When robots coexist with people, how can we make robots more understandable and usable?

Collaborative design work from the College of DAAP.

Students: SAID, SOD

Faculty: Ming Tang, Yong-Gyun Ghim, College of DAAP, UC

SAID course description

This four-credit workshop course includes both a seminar and project development format, and develops techniques for digital modeling as they influence the process of viewing, visualizing, and forming spatial and formal relationships. The course encourages operational connections among different techniques for inquiry and visualization as a critical methodology in the design process.

Students: Ryan Adams, Varun Bhimanpally, Ryan Carlson, Brianna Castner, Matthew Davis, John Duke, Prajakta Sanjay Gangapurkar, Jordan Gantz, Alissa Gonda, Justin Hamilton, Emma Hill, Anneke Hoskins, Philip Hummel, Jahnavi Joshi, Patrick Leesman, Tommy Lindenschmidt, Peter Loayza, Jacob Mackin, Jordan Major, Julio Martinez, Jacob McGowan, Simon Needham, Hilda Rivera, Juvita Sajan, Gavin Sharp, Hannah Webster, Megan Welch, Meghana Yelagandula, Isaiah Zuercher.

Final presentation videos.

SOD Studio description

Mobile Robotics Studio envisions how robotic technology can better assist people in service environments benefiting both customers and employees. Based on human-centered design and systems thinking approaches, cross-disciplinary teams of industrial, communication, and fashion design students created design solutions for product-service systems of mobile robots in retail, air travel, sports, delivery, and emergency services. 

Students: Noon Akathapon, Kai Bettermann, Cy Burkhart, Jian Cui, Joe Curtsinger, Emmy Gentile, Bradley Hickman, Quinn Matava, Colin McGrail, James McKenzie, Alex Mueller, John Pappalardo, Kurtis Rogers, Connor Rusnak, Jimmy Tran, Franklin Vallant, Leo von Boetticher

Industry 4.0/5.0 grant

Immersive vs. Traditional Training – a comparison of training modalities

PIs: Tamara Lorenz, Ming Tang

  • Dr. Tamara Lorenz. Associate Professor. Embodied Interactive Systems Lab, Industry 4.0 & 5.0 Institute (I45I), Center for Cognition, Action, and Perception (CAP)
  • Ming Tang. Professor. Extended Reality Lab, Industry 4.0 & 5.0 Institute (I45I), Institute for Research in Sensing (IRiS)

Consortium Research Project: evaluating the effectiveness of an immersive training protocol against different traditional training modalities.

Grant: $40,000, funded by the UC Industry 4.0/5.0 Institute, 01.2023–01.2024.

Open Questions

  • Is immersive training as effective as, or more effective than, traditional training?
  • Is immersive training beneficial for specific types of training (e.g., skill or behavior acquisition), while other modalities are better suited to other types (e.g., knowledge acquisition)?
  • Does the benefit of immersive VR training justify the initial investment in equipment and the subsequent investment in building, running, and maintaining projects?

Proposal

  • Evaluation of the effectiveness of an immersive training protocol against different traditional training modalities. 
  • Evaluation of modality-dependent benefits for different learning goals. 
  • Derivation of assessment metrics for VR training against other training modalities. 

Training scenario: DAAP Fire Evacuation

Traditional training with slides and maps.

VR training with an immersive and interactive experience.

Thanks to the Institute’s Industrial Advisory Board (IAB) and industry patrons, including Siemens, Kinetic Vision, John Deere, Stress Engineering Services, Innovative Numberics, and Ethicon. 

Next Phase experiments

Multi-player test

Links

2017 Virtual DAAP Fire Evacuation project.

At UC News

New UC institute looks ahead to ‘Industry 5.0’: UC will harness collective talent across campus to help companies solve new challenges. By Michael Miller, December 8, 2022.

Paper in IJSW

Ming Tang and Adekunle Adebisi’s paper, “Using Eye-Tracking for Traffic Control Signage Design at Highway Work Zone,” has been published in the Interdisciplinary Journal of Signage and Wayfinding.

Tang, M., Adebisi, A. Using Eye-Tracking for Traffic Control Signage Design at Highway Work Zone. Interdisciplinary Journal of Signage and Wayfinding, Vol. 6, No. 2 (2022).

This paper discusses the application of Eye Tracking (ET) technologies to help researchers understand drivers’ perception of signage at highway work zones. Combining ET with screen-based motion pictures and a driving simulator, the team developed an analytical method that allows designers to evaluate signage design. Two experiments were set up to investigate how signage design might affect a driver’s visual attention and interaction under various environmental complexities and glare conditions. The study explores visual perception related to several spatial features, including signage modality, scene complexity, and color schemes. The ET method uses total fixation time and time-to-first-fixation data to evaluate the effectiveness of signage presented through screen-based video and a driving simulator.
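To illustrate the two metrics the study relies on, here is a minimal sketch (not the authors’ code) of how total fixation time and time-to-first fixation can be computed for a signage area of interest (AOI) from recorded fixations. The fixation-record fields and the AOI bounding-box format are illustrative assumptions, not the paper’s actual data schema.

```python
def aoi_metrics(fixations, aoi):
    """Compute total fixation time and time-to-first fixation for one AOI.

    fixations: list of dicts with 'x', 'y' (screen coords) and
               'start', 'end' (seconds) -- a hypothetical record format.
    aoi: (x_min, y_min, x_max, y_max) bounding box of the signage.
    """
    x0, y0, x1, y1 = aoi
    # Keep only fixations that land inside the signage AOI.
    hits = [f for f in fixations
            if x0 <= f["x"] <= x1 and y0 <= f["y"] <= y1]
    # Total fixation time: summed duration of all fixations on the AOI.
    total_fixation_time = sum(f["end"] - f["start"] for f in hits)
    # Time-to-first fixation: onset of the earliest fixation on the AOI
    # (None if the driver never looked at the sign).
    time_to_first = min((f["start"] for f in hits), default=None)
    return total_fixation_time, time_to_first

# Toy example: three fixations, two of which fall on the sign.
fixations = [
    {"x": 50,  "y": 40,  "start": 0.0, "end": 0.3},  # off the sign
    {"x": 210, "y": 110, "start": 0.3, "end": 0.9},  # on the sign
    {"x": 215, "y": 120, "start": 1.2, "end": 1.5},  # on the sign again
]
total, first = aoi_metrics(fixations, (200, 100, 300, 200))
# total ≈ 0.9 s of gaze on the sign; first fixation arrives at 0.3 s
```

Lower time-to-first fixation and higher total fixation time on a sign are commonly read as the sign capturing attention earlier and holding it longer, which is the comparison the two experiments make across signage designs.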

Keywords: Eye-tracking, Signage design, Work zone safety

about the IJSW journal

Signage and wayfinding are critical components of the urban landscape. In spite of their importance, there has been no journal or comprehensive scholarly platform dedicated to this topic. As such, scholars from a variety of academic disciplines (law, planning, engineering, business, art, economics, architecture, landscape architecture, industrial design, and graphic design) publish work in journals within their home disciplines and rarely have a chance to communicate their cross-disciplinary findings. The Interdisciplinary Journal of Signage and Wayfinding seeks to bring them together.

Sponsored by the Academic Advisory Council for Signage Research and Education (AACSRE), this online, open access journal seeks to be the home for scholarship in the field of signage and wayfinding, and to make such scholarship accessible to academics and practitioners alike.

o4a AAA Partnership Award

2022 Outstanding AAA Partnership Award of the Year

On behalf of COA and Live Well, Ken Wilson (COA) and Ming Tang (UC) received the AAA Award at the o4a conference on 10.20.2022. It is my great honor to represent Live Well as the co-recipient, with the Council on Aging, of the 2022 Ohio Association of Area Agencies on Aging Annual Partnership Award. Thanks to Suzanne Burke, Ken Wilson, Jai’La Nored, Anna Goubeaux, and many others from COA. Thanks also to the Live Well EVRTalk development team (faculty: Ming Tang, Matt Anthony; advisors: Craig Vogel, Linda Dunseath; students and Live Well fellows: Tosha Bapat, Karly Camerer, Jay Heyne, Harper Lamb, Jordan Owens, Ruby Qji, Alejandro Robledo, Matthew Spoleti, Lauren Southwood, Ryan Tinney, Keeton Yost, Dongrui Zhu).

Link: LWC Twitter