Our EVRTalk story goes live on Cincinnati Public Radio!
Ming Tang (UC) and Jai’La Nored, Anna Goubeaux, and Antoinette Moore (Council on Aging of Southwestern Ohio) were interviewed by Ann Thompson of WVXU, Cincinnati Public Radio.
Design in the Age of Metaverse and Extended Reality
ARCH 7036-004 (51920) / ARCH 5051-004 (51921). Elective Architecture Theory Seminar. Spring semester. DAAP, UC.
Class time: Tuesdays and Thursdays, 12:30–1:50 pm
Location: hybrid (online and in person); CGC Lab @ DAAP; XR-Lab @ Digital Futures Building.
Instructor: Ming Tang, Director, Extended Reality Lab (XR-Lab); Associate Professor, SAID, DAAP, University of Cincinnati
The course is open to UC students of any major, at both the undergraduate (ARCH 5051) and graduate (ARCH 7036) levels. Please contact Prof. Ming Tang if you have issues registering for the course as a student from another major.
For SAID graduate students who have previously taken the VIZ-3 course, this advanced-level course will allow you to develop XR-based presentations for the thesis exhibition or to propose your own XR-related topics.
For students with general design backgrounds (architecture, interior design, urban planning, landscape, industrial design, art, game, film, and digital media), this course will allow you to extend your creativity into XR and create immersive experiences.
For students without previous experience in design fields (engineering, arts & sciences, CCM, medicine, etc.), this course will be an excellent introduction to the realm of XR.
This seminar course focuses on the intersection of traditional design with immersive visualization technologies, including Virtual Reality, Augmented Reality, Digital Twins, and the Metaverse. The class will explore new spatial experiences in the virtual realm and analyze human perception, interaction, and behavior in the virtual world. Students will learn both the theoretical framework of XR and hands-on skills in XR development. The course will expose students to XR technologies, supported by the new XR-Lab at the UC Digital Futures building. Students are encouraged to propose individual or group research on future design with XR.
Hardware: VR headsets and glasses are available at the XR-Lab at Digital Futures.
Students are encouraged to look at architecture, motion pictures, game design, 3D design, and other digital art media. As students develop their research, these fields become invaluable sources of inspiration and motivation. The instructor supports all attempts by students to experiment with techniques that interest and challenge them above and beyond what is taught in the classroom.
Students completing this course will be able to identify theories and research methods related to eXtended Reality (XR), including (1) the history and theory of XR (VR + AR + MR) and (2) emerging trends related to the Metaverse and its impact on design.
The course focuses on hands-on skills in XR development through Unreal Engine 5. Students will propose their own XR projects and develop them over the semester.
This course will be offered as a hybrid of online and in-person sessions at the CGC/Digital Futures building, depending on each project’s progress. Students will propose their projects and have 15 weeks to complete development.
Prerequisite: no prior skills are required, although basic 3D modeling skills are preferred.
Weekly schedule:
History and the workflow of metaverse development
Project management on GitHub (“Inspiration from film” assignment due)
The eXtended Reality Lab (XR-Lab) is a research lab affiliated with the College of DAAP and Digital Futures at the University of Cincinnati. XR-Lab explores how people engage with simulated scenarios through Virtual Reality, Augmented Reality, and Mixed Reality, and promotes communication and collaboration through XR for education, health, safety, and professional training in both academic and industry settings.
Technologies have radically altered and reconstructed the relationships among humans, computation, and the environment. As the diagram shows, we believe AI, digital twins, the Metaverse, and XR are essential for fusing design and computation to connect humans. These technologies provide promising opportunities to empower us to rebuild a better world, both physically and digitally.
We look forward to sharing our expertise and lab resources, incubating new ideas, and cultivating creativity and the digital culture of UC.
Immersive vs. Traditional Training – a comparison of training modalities
PIs: Tamara Lorenz, Ming Tang
Dr. Tamara Lorenz. Associate Professor. Embodied Interactive Systems Lab, Industry 4.0 & 5.0 Institute (I45I), Center for Cognition, Action, and Perception (CAP)
Ming Tang. Associate Professor. Extended Reality Lab, Industry 4.0 & 5.0 Institute (I45I), Institute for Research in Sensing (IRiS)
Consortium research project: evaluating the effectiveness of an immersive training protocol against different traditional training modalities.
Thanks to the Institute’s Industrial Advisory Board (IAB) and industry patrons, including Siemens, Kinetic Vision, John Deere, Stress Engineering Services, Innovative Numerics, and Ethicon.
Design for Future Service, Metaverse, and Robotics
Thanks to all students from SOD and SAID who participated in the projects. We also thank Kroger and UC Digital Futures for their help.
Redesign retail and other service experiences (retail space, in-person service, digital service, technology) and rethink customer and business needs. Envision the future service experience of retail, hospitality, and delivery as 1) physical, 2) digital, and 3) a combination of both. Research questions include: What does the “future retail store” look like? How do online shopping and local-store pickup change the architecture of a store? How can the metaverse and immersive experiences be integrated with retail? Can public and recreational programs be introduced into stores to enhance customers’ experience? How can robots assist employees and businesses in providing better service to customers? When robots coexist with people, how can we make robots more understandable and usable?
Collaborative design work from the College of DAAP.
This four-credit workshop course includes both a seminar and project development format, and develops techniques for digital modeling as they influence the process of viewing, visualizing, and forming spatial and formal relationships. The course encourages operational connections among different techniques for inquiry and visualization as a critical methodology in the design process.
Students: Ryan Adams, Varun Bhimanpally, Ryan Carlson, Brianna Castner, Matthew Davis, John Duke, Prajakta Sanjay Gangapurkar, Jordan Gantz, Alissa Gonda, Justin Hamilton, Emma Hill, Anneke Hoskins, Philip Hummel, Jahnavi Joshi, Patrick Leesman, Tommy Lindenschmidt, Peter Loayza, Jacob Mackin, Jordan Major, Julio Martinez, Jacob McGowan, Simon Needham, Hilda Rivera, Juvita Sajan, Gavin Sharp, Hannah Webster, Megan Welch, Meghana Yelagandula, Isaiah Zuercher.
SOD Studio description
Mobile Robotics Studio envisions how robotic technology can better assist people in service environments benefiting both customers and employees. Based on human-centered design and systems thinking approaches, cross-disciplinary teams of industrial, communication, and fashion design students created design solutions for product-service systems of mobile robots in retail, air travel, sports, delivery, and emergency services.
Students: Noon Akathapon, Kai Bettermann, Cy Burkhart, Jian Cui, Joe Curtsinger, Emmy Gentile, Bradley Hickman, Quinn Matava, Colin McGrail, James McKenzie, Alex Mueller, John Pappalardo, Kurtis Rogers, Connor Rusnak, Jimmy Tran, Franklin Vallant, Leo von Boetticher
This wearable ET device includes various components, such as illuminators, cameras, and a data collection and processing unit for image detection, a 3D eye model, and gaze-mapping algorithms. Compared to a screen-based ET device, the most significant differences of the wearable device are its binocular coverage and the influence of field of view (FOV) and head tilt on the glasses-configured eye tracker. It also avoids potential experimental bias resulting from the display size or pixel dimensions of a screen. As with screen-based ET, the images captured by the wearable ET cameras are used to identify the glints on the cornea and the pupil. This information, together with a 3D eye model, is then used to estimate the gaze vector and gaze point for each participant.
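The final step of that pipeline, turning a fitted eye model into a gaze vector and a gaze point on a surface, can be sketched in a few lines. This is a minimal illustration under simplified assumptions (the eyeball center and 3D pupil center are taken as already recovered; glint detection and model fitting are omitted), and the helper names `estimate_gaze_vector` and `gaze_point_on_plane` are hypothetical, not part of any eye-tracker SDK.

```python
import numpy as np

def estimate_gaze_vector(eyeball_center, pupil_center):
    """Approximate the optical axis as the ray from the eyeball center
    through the detected 3D pupil center (a common simplification)."""
    v = np.asarray(pupil_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    return v / np.linalg.norm(v)

def gaze_point_on_plane(eye_pos, gaze_dir, plane_point, plane_normal):
    """Intersect the gaze ray with a planar surface of interest
    (e.g., a sign face); returns None if the ray misses the plane."""
    eye_pos = np.asarray(eye_pos, dtype=float)
    denom = np.dot(gaze_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None  # gaze ray is parallel to the plane
    t = np.dot(np.asarray(plane_point, dtype=float) - eye_pos, plane_normal) / denom
    return eye_pos + t * gaze_dir if t > 0 else None

# Example: an eye at the origin looking at a wall 2 m ahead along +z.
eye = np.array([0.0, 0.0, 0.0])
gaze = estimate_gaze_vector(eye, [0.001, 0.0005, 0.011])
print(gaze_point_on_plane(eye, gaze, plane_point=[0, 0, 2.0], plane_normal=[0, 0, 1.0]))
```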
After a standard ET calibration and verification procedure, participants were instructed to walk in a defined space while wearing the glasses. The time of interest (TOI) was set to 60 seconds, recording defined start and end events along with the visual occurrences over that period. Data was collected for both pre-conscious viewing (the first three seconds) and conscious viewing (after three seconds). A sketch of this split appears below.
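A minimal sketch of the TOI split described above follows; the `(timestamp_s, x, y)` record format and the helper `split_by_viewing_phase` are assumed simplifications, not the eye tracker's export schema.

```python
# Constants mirror the protocol described above.
TOI_START_S = 0.0            # start event of the 60-second time of interest
TOI_END_S = 60.0             # end event
PRECONSCIOUS_CUTOFF_S = 3.0  # first three seconds = pre-conscious viewing

def split_by_viewing_phase(samples):
    """Partition gaze records (timestamp_s, x, y) that fall inside the TOI
    into pre-conscious and conscious viewing."""
    pre, conscious = [], []
    for t, x, y in samples:
        if not (TOI_START_S <= t <= TOI_END_S):
            continue  # sample is outside the defined time of interest
        if t < TOI_START_S + PRECONSCIOUS_CUTOFF_S:
            pre.append((t, x, y))
        else:
            conscious.append((t, x, y))
    return pre, conscious

pre, con = split_by_viewing_phase([(0.5, 0.4, 0.6), (2.9, 0.5, 0.5), (12.0, 0.7, 0.3)])
print(len(pre), len(con))  # -> 2 1
```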
We also conducted screen-based eye tracking and compared the results.
2. Screen-based Eye-tracking technology
This method was beneficial for informing reviewers of how an existing place or a proposed design performs in terms of user experience. Moreover, while the fundamental visual elements that attract human attention and trigger conscious viewing are well established and sometimes incorporated into signage design and placement, signs face an additional challenge because they must compete for viewers’ visual attention with the visual elements of the surrounding built and natural environments. As such, tools and methods are needed that can assist “contextually sensitive” design and placement by assessing how signs in situ capture the attention of their intended viewers.
3. VR-Based Eye-tracking technology
Eye-tracking technology enables new forms of interactions in VR, with benefits to hardware manufacturers, software developers, end users and research professionals.
An AR & VR project for a medical model: an animated heart, part of the Magic School Bus project at the University of Cincinnati.
Some tests were done with the Tobii eye tracker for train station designs in Beijing. All heatmaps are here; all gaze clusters are here. Fifteen train stations were designed by students and presented through the Unreal game engine. More information on the train station designs is available on the studio website.
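For readers curious how such heatmaps are typically produced, here is a minimal sketch: normalized gaze points are accumulated into a 2D histogram over the rendered frame and blurred with a Gaussian kernel. The resolution and sigma values are illustrative choices, not parameters from the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_heatmap(gaze_points, width=192, height=108, sigma=3.0):
    """Accumulate normalized gaze points (x, y in [0, 1]) into a 2D histogram
    over the rendered frame, then smooth it with a Gaussian kernel."""
    grid = np.zeros((height, width))
    for x, y in gaze_points:
        col = min(int(x * width), width - 1)
        row = min(int(y * height), height - 1)
        grid[row, col] += 1.0
    return gaussian_filter(grid, sigma=sigma)

# Example with a few clustered points; the hot spot emerges near the center.
heat = gaze_heatmap([(0.50, 0.50), (0.52, 0.48), (0.51, 0.49), (0.20, 0.80)])
print(heat.shape, float(heat.max()))
```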
We used eye tracking to measure a viewer’s attention and gaze on a construction sequence. The construction model was made in Prof. Tang’s second-year SAID class.