Posts

Digital Twin of Ohio Highways: Training Simulation for Snowplow Drivers

Ming Tang led the team that constructed six large-scale digital twins of Ohio highways for snowplow driver training. The road models, totaling 70 miles across three Ohio counties, were built from real-site GIS and topographic (TOPO) data.
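As a hedged illustration of one step in such a pipeline, the sketch below drapes a road centerline over a terrain heightmap via bilinear interpolation. The elevation grid, spacing, and centerline coordinates are invented placeholders; the post does not describe the actual ODOT toolchain.

```python
# Minimal sketch: sample terrain elevation along a road centerline.
# The 4x4 grid and centerline below are toy placeholders, not ODOT data.

def bilinear(grid, x, y):
    """Sample a row-major elevation grid (1-unit cell spacing) at (x, y)."""
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    top = grid[y0][x0] * (1 - dx) + grid[y0][x0 + 1] * dx
    bottom = grid[y0 + 1][x0] * (1 - dx) + grid[y0 + 1][x0 + 1] * dx
    return top * (1 - dy) + bottom * dy

# Toy elevation grid (meters) and a short centerline in grid units.
elevation = [
    [200, 201, 203, 204],
    [201, 202, 204, 205],
    [203, 204, 206, 207],
    [204, 205, 207, 208],
]
centerline = [(0.5, 0.5), (1.5, 1.0), (2.4, 1.8)]

# Elevation profile draped along the road, ready to feed a 3D road mesh.
profile = [bilinear(elevation, x, y) for x, y in centerline]
print([round(z, 2) for z in profile])
```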

“Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers”. Phase 2. Ohio Department of Transportation. PI: John Ash. Co-PI: Ming Tang, Frank Zhou, Mehdi Norouzi. $952,938. Grant #: 1014440. ODOT 32391 / FHWA. Period: 01.2019–03.2022.

“Evaluate Opportunities to Provide Training Simulation for ODOT Snow and Ice Drivers”. Phase 1. Ohio Department of Transportation. PI: Jiaqi Ma. Co-PI: Ming Tang, Julian Wang. $39,249. ODOT 32391 / FHWA. Period: 01.2018–01.2019.


  • Lorain County. District 3.
  • Hocking Hills. District 10.
  • City of Columbus-South (A)
  • City of Columbus-West (B)
  • City of Columbus-North (C)

Publications

Raman, M., Tang, M. “3D Visualization Development of Urban Environments for Simulated Driving Training and VR Development in Transportation Systems.” ASCE ICTD 2023 Conference, Austin, TX. 06.2023.

Collaborative teaching to study the social impact of A.I., Automation, and Robots

Funded by a UC Forward Course Development grant: “The Future of (no) Work and Artificial Intelligence.”

(Award funding rescinded due to the COVID-19 budget freeze.) 2020

Amount: $15,000. The courses will be offered in the Fall 2020 semester.

  • Ming Tang, Associate Professor, School of Architecture and Interior Design, DAAP
  • Cory Haberman, Assistant Professor and Director of the Institute of Crime Science, CECH
  • Tamara Lorenz, Assistant Professor, Psychology-Mechanical Engineering-Electrical Engineering (jointly appointed). Department of Psychology. College of A&S.

Self-portrait drawing, generated with Google's machine-learning program “DeepDream” and painted by a KUKA robot. By Ming Tang. 2018

Historically, computerization has primarily been confined to manual and cognitive routine tasks involving explicit rule-based activities. As computers and robots rapidly improve, analysts predict that many jobs will be replaced by automation and machinery. A 2013 study by Frey and Osborne estimated that 47% of current jobs are at high risk of being replaced by machines in the coming decades. Some job sectors, notably automobile manufacturing, have already been heavily affected by the computerization of factories (Frey and Osborne). Many experts argue that routine, low-skill, physical jobs will disappear in the coming decades as artificial intelligence (A.I.) and robotics technology grow. Even some “non-routine, creative” jobs, such as writing, art, and music, will be affected by computerization and algorithms. Among these non-routine tasks, there has been work toward “simulated humans” since the Turing test: simply put, the goal is to produce output that cannot be distinguished as having been created by a human or a computer.
The ability of A.I. and robots to mimic human decision-making will undoubtedly affect jobs in the years to come. Our team believes we are entering a time in which humans, A.I., and robots create concurrently, and we should embrace these possibilities in our curriculum to study how we will perceive robots in the future world and how our behaviors might be shaped by these autonomous technologies. To prepare our students for future challenges, it is essential to create a simulated future world in which to study how A.I., Automation, and Robots (AAR) will interact with humans and form a new symbiotic relationship.
The faculty team will treat the simulated future world as a two-stage process. The first stage uses virtual reality (VR) to develop an immersive digital environment populated with advanced A.I. and robots to simulate future living and working environments. We will model various humanoid robots (field robots, social robots, police robots) as well as UAVs and industrial robots. The second stage immerses students in these future-world scenarios and tests human reactions to AAR through VR. In this stage, the faculty team will work together to obtain IRB approval and create a data collection plan with students from the three courses. The team has investigated this two-stage approach and will set up several shared seminars and presentations to promote student dialogue in the Fall 2020 semester.

Team and roles

Prof. Tang has taught virtual reality and robotics for digital fabrication, and has applied A.I.-controlled agents to wayfinding studies in DAAP. Through ARCH 4001, his 4th-year architecture design studio, he will lead the scenario design for the future working and living space and create VR environments for the other two courses.
Prof. Cory Haberman uses quantitative methods to understand spatial-temporal crime patterns. He also uses mixed-methods research to advance evidence-based policing, with particular interests in crime analysis, hot-spots policing, and focused deterrence. Through his CJ7010-001: Seminar in Criminal Justice course, he will lead students in evaluating the potential for criminal behavior in a future world shaped by policing robots, UAVs, and A.I.-controlled surveillance systems.
Prof. Tamara Lorenz explores the dynamics of human interaction and their relevance and usability for Human-Robot Interaction (HRI). Her focus areas are rehabilitation engineering and HRI in manufacturing. She is also interested in human-machine interaction in general, with applications to human factors in daily life and work surroundings. Through her PSYC 3043 – Human Performance course, she will lead students in studying human behavior toward robots in future working and living environments.

Objectives

The goal of the proposed project is to integrate three courses from existing curricula to promote interdisciplinary collaboration that proactively enhances understanding of how A.I., Automation, and Robots (AAR) can impact human behavior. We hope both to teach students methods for studying human behavior and to let them experience a possible future world through VR. Collectively, the team will investigate how human decisions may be influenced by robots, by an autonomous environment, or by both.
The following objectives will be achieved through designated coursework:
Objective 1: To understand the fundamentals of AAR technology and its applications in future scenarios.
Objective 2: To investigate human perception of and behavior toward AAR through VR.

Objective 3: To understand the symbiosis of humans and A.I. in the new era through human-robot interaction (HRI) study.

Courses

The collaborative courses will target both undergraduate and graduate students and encourage them to explore, discover, discuss, and construct research related to AAR. Major activities in teaching, research, and scenario development will involve collaboration across three colleges: A&S, CEAS, and DAAP. Three courses from the existing curricula are calibrated to form a coherent investigation of the future world, in alignment with the designated objectives.
• Course 1: ARCH 4001 – capstone design studio on future-world design and VR simulation. Offered by Ming Tang in Fall Semester 2020.
• Course 2: PSYC 3043 – Human Performance. Offered by Tamara Lorenz in Fall Semester 2020.
• Course 3: CJ7010-001: Seminar in Criminal Justice. Offered by Cory Haberman in Fall Semester 2020.

An interdisciplinary approach to using eye-tracking technology for design and behavioral analysis

Tobii eye-tracking workshop offered at CGC, DAAP, UC. 2019.

Eye-tracking devices measure eye position and eye movement, documenting how environmental elements draw viewers' attention. In recent years there have been significant breakthroughs in eye-tracking technology, with a focus on new hardware and software applications. Wearable eye-tracking glasses, together with advances in video capture and virtual reality (VR), provide advanced tracking capability at greatly reduced prices. Given these advances, eye trackers are becoming important research tools in visual systems, design, psychology, wayfinding, cognitive science, and marketing, among other fields. Currently, eye-tracking technologies are not readily available at UC, perhaps because of a lack of familiarity with the technology and how it can be used for teaching and research, or the perception of a steep learning curve for its application.
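To make concrete how raw gaze samples become attention measures, here is a minimal sketch of dispersion-threshold (I-DT) fixation detection, the kind of processing that tools such as Tobii Pro Lab automate. The thresholds, coordinate units, and sample format are illustrative assumptions, not Tobii's actual defaults.

```python
# Minimal I-DT sketch: group gaze samples into fixations when gaze stays
# within a small spatial window for long enough. Samples are (t, x, y)
# tuples in seconds and normalized screen coordinates (assumed format).

def detect_fixations(samples, max_dispersion=0.02, min_duration=0.1):
    """Return (start_t, end_t, center_x, center_y) for each fixation."""
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        window = [samples[i]]
        # Grow the window while dispersion stays under the threshold.
        while j + 1 < len(samples):
            xs = [p[1] for p in window] + [samples[j + 1][1]]
            ys = [p[2] for p in window] + [samples[j + 1][2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
            window.append(samples[j])
        duration = window[-1][0] - window[0][0]
        if duration >= min_duration:
            cx = sum(p[1] for p in window) / len(window)
            cy = sum(p[2] for p in window) / len(window)
            fixations.append((window[0][0], window[-1][0], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations

if __name__ == "__main__":
    # Synthetic stream: a 0.15 s dwell near (0.5, 0.5), then a saccade away.
    stream = [(0.01 * k, 0.5 + 0.001 * k, 0.5) for k in range(16)]
    stream += [(0.16 + 0.01 * k, 0.9, 0.1) for k in range(3)]
    print(detect_fixations(stream))
```

Fixation centers and durations from a pass like this are the raw material for the heatmaps and area-of-interest statistics used in design and wayfinding studies.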

It has become clear to our UC faculty team that research and teaching will benefit significantly from these cutting-edge tools. It is also clear that a collective approach to acquiring the eye-tracking hardware and software, and to training faculty in its application, will ultimately result in greater faculty collaboration, with the consequent benefits of interdisciplinary research and teaching.

The primary goal of the proposed project is to provide new tools for research and teaching that benefit from cutting-edge eye-tracking technologies, involving interdisciplinary groups of UC faculty and students. The project will enhance the knowledge base among faculty, allow new perspectives on how to integrate eye-tracking technology, and promote interdisciplinarity across the broader UC community.

Grant:

  • “Eye-tracking technology”. Provost Group / Interdisciplinary Award. $19,940. PI: Tang. Co-PI: Auffrey, Hildebrandt. UC. Faculty team: Adam Kiefer, Michael A. Riley, Julian Wang, Jiaqi Ma, Juan Antonio Islas Munoz, Joori Suh. 2018
  • DAAP matching grant: $11,000.

Hardware: Tobii Pro Glasses, Tobii Pro VR

Software: Tobii Pro Lab, Tobii VR analysis

Test with Tobii Pro Glasses

Urban mobility studio

Grant: “Project-Based Collaborative Coursework for Developing Connected Transportation Network and Accessible Multimodal Hub in Uptown”. UC Forward grant. Co-PIs: Heng Wei, Na Chen, Xinhao Wang, Jiaqi Ma, Ming Tang. $5,000. Total: $27,500.

ARCH 4001. Fall 2018. SAID, DAAP, UC.

Faculty: Ming Tang, RA, LEED AP, Associate Professor, UC.

Using Cincinnati's Uptown and the proposed Smart Corridor area as the focus area, the studio presents a study of urban mobility with an emphasis on simulated human behavior cues and movement information as input parameters. The research is framed as a hybrid method that seeks logical architectural/urban forms and analyzes their performance. As one of seven courses in a cluster supported by UC Forward, the studio project extends the urban mobility study by exploring, collecting, analyzing, and visualizing geospatial information and physically representing it through various computational technologies.
The studio investigation aims to realize the potential of quantifying demographic, social, and behavioral data in a parametric equation. In the experiments, integrating non-geometric parameters into the form-seeking and performance-evaluation process produced a series of conceptual models representing movement and access. The projects will be developed by optimizing the transportation network and analyzing wayfinding and human behavior. Ultimately, the studio builds on the strengths of the predefined evaluation method and captures the benefits of Geographic Information Systems (GIS), virtual reality (VR), eye-tracking, and wayfinding simulation by integrating vital geospatial components into the equation, changing the way people explore possible design solutions in order to generate ideal urban and building forms. A minimal sketch of such a parametric evaluation appears below.
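As a hedged sketch of what quantifying non-geometric parameters in a parametric equation can look like, the toy scoring function below combines normalized indicators from GIS, wayfinding simulation, and eye-tracking into one weighted score. The indicator names, weights, and values are invented for illustration and are not the studio's actual evaluation method.

```python
# Toy multi-criteria evaluation: fold non-geometric indicators into a
# single parametric score. All names, weights, and values are placeholders.

from dataclasses import dataclass

@dataclass
class SiteOption:
    name: str
    transit_access: float    # 0..1, e.g. from GIS network analysis
    pedestrian_flow: float   # 0..1, e.g. from wayfinding simulation
    visual_attention: float  # 0..1, e.g. from eye-tracking heatmaps

def mobility_score(s, w_transit=0.5, w_flow=0.3, w_attention=0.2):
    """Weighted sum of normalized indicators; weights are design choices."""
    return (w_transit * s.transit_access
            + w_flow * s.pedestrian_flow
            + w_attention * s.visual_attention)

options = [
    SiteOption("Corridor hub A", 0.8, 0.6, 0.4),
    SiteOption("Corridor hub B", 0.5, 0.9, 0.7),
]
best = max(options, key=mobility_score)
print(best.name, round(mobility_score(best), 2))
```

Making the weights explicit turns the evaluation into a design lever that students can debate and vary, rather than a judgment buried in the model.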

Students: Nolan Dalman, Sam DeZarn, Nicole Powers, Jake Miller, Hang Phan, Josh Funderburk, Rugui Xie, Nick Mann, Azrien Isaac, Shiyuan Li, Spencer Kuehl, Randall Morgan, Greg Ginley, Umme Habiba


UC Forward Collaborative on Smart Transportation Forum at Niehoff studio

The Fall 2018 Urban Mobility studio presented at the Uptown Innovation Transportation Corridor Forum on 04.31.2019 and was featured by UC News: “UC students present future of transportation at forum.” 2019

More info on the studio and the student projects.


Mixed Reality for medical data

Grant: “Magic School Bus for Computational Cell”. Arts, Humanities, & Social Sciences (AHSS) Integrated Research Advancement Grant, UC. $8,550. PI: Tang. Co-PI: T. Zhang. 2016

The AR & VR project for medical models: an animated heart from the Magic School Bus project at the University of Cincinnati.


Funded by the 2017 AHSS Integrated Research Advancement Grant at UC, the “Magic School Bus for Computational Cell” project constructed a mixed-reality visualization at the College of DAAP and the College of Medicine by integrating virtual reality (VR) and augmented reality (AR) for molecular and cellular physiology research. The project employed state-of-the-art VR and AR software and hardware, allowing creative approaches using holographic imaging and computer simulation. It expanded our research in space modeling and architectural visualization into the computational cell field, including the creation of 3D models of intestinal tubes and the envisioning of cell changes through agent-based simulation.
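As a hedged sketch of the agent-based simulation idea, the toy model below steps a population of cell agents that can divide or die with fixed probabilities; the rules, rates, and positions are invented placeholders rather than the project's actual physiology model.

```python
# Toy agent-based cell simulation: each agent ages, then may divide or die.
# All probabilities and the offset rule are illustrative placeholders.

import random

random.seed(42)  # reproducible toy run

class CellAgent:
    """A minimal cell agent with a 3D position and an age counter."""
    def __init__(self, position):
        self.position = position
        self.age = 0

    def step(self, divide_p=0.05, die_p=0.02):
        """Advance one time step; return a daughter cell, 'die', or None."""
        self.age += 1
        r = random.random()
        if r < divide_p:
            # Daughter appears at a small random offset (placeholder rule).
            x, y, z = self.position
            dx, dy, dz = (random.uniform(-1, 1) for _ in range(3))
            return CellAgent((x + dx, y + dy, z + dz))
        if r < divide_p + die_p:
            return "die"
        return None

cells = [CellAgent((0.0, 0.0, 0.0))]
for t in range(100):
    survivors, newborns = [], []
    for c in cells:
        outcome = c.step()
        if outcome == "die":
            continue
        survivors.append(c)
        if isinstance(outcome, CellAgent):
            newborns.append(outcome)
    cells = survivors + newborns
print(f"cell count after 100 steps: {len(cells)}")
```

In a visualization pipeline, per-agent states like these would drive geometry updates in the VR/AR scene rather than a console printout.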

PI: Ming Tang. Associate Professor. School of Architecture & Interior Design, College of DAAP.

Co-PI: Tongli Zhang. PhD. Assistant Professor. Department of Molecular and Cellular Physiology. College of Medicine.

Data Management: Tiffany Grant. PhD. Research Informationist. Health Sciences Library. College of Medicine.

The Web3D model is here.