Exhibition @ Reed Gallery

Reed Gallery exhibition featuring work by DAAP faculty and staff. 02.28.2022

Human and Machine Symbiosis: portrait drawing with A.I. and a robot, by Ming Tang.

Check out the 3D tour of the virtual gallery.

This work explores how Artificial Intelligence and robots interact with humans and form a unique symbiotic relationship in painting. The co-creation followed three steps: (1) use machine learning and a deep neural network algorithm to translate a picture of a human face into a styled image; (2) translate the face’s representation into a parametrically controlled toolpath for the KUKA robot; (3) the KUKA robot executes the art-making by holding a paintbrush.
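The second step, turning a styled raster image into robot strokes, can be pictured with a small sketch. The actual project builds the toolpath parametrically in Grasshopper/Rhino for a KUKA arm; the pure-Python function below is a hypothetical illustration of just the underlying idea, assuming dark pixels along each scanline are grouped into pen-down strokes.

```python
# Hypothetical sketch of step (2): grouping the dark pixels of a styled
# grayscale image into pen-down strokes along each scanline. The real
# toolpath is generated in Grasshopper/Rhino; this only illustrates the
# image-to-stroke idea.

def image_to_strokes(image, threshold=0.5):
    """Convert a grayscale image (rows of 0.0-1.0 values, 0.0 = black ink)
    into a list of strokes, each a list of (x, y) waypoints."""
    strokes = []
    for y, row in enumerate(image):
        current = []
        for x, value in enumerate(row):
            if value < threshold:          # dark pixel: pen stays down
                current.append((x, y))
            elif current:                  # light pixel ends the stroke
                strokes.append(current)
                current = []
        if current:                        # stroke running to row's end
            strokes.append(current)
    return strokes


if __name__ == "__main__":
    # A tiny 3x5 "face": 0.0 = ink, 1.0 = blank canvas.
    image = [
        [1.0, 0.0, 1.0, 0.0, 1.0],   # two eyes -> two short strokes
        [1.0, 1.0, 1.0, 1.0, 1.0],   # blank row -> no stroke
        [0.0, 0.0, 0.0, 0.0, 0.0],   # mouth -> one long stroke
    ]
    strokes = image_to_strokes(image)
    print(len(strokes))                    # 3
    print(strokes[2][0], strokes[2][-1])   # (0, 2) (4, 2)
```

Each stroke's waypoint list would then be lifted into a 3D path (pen-down between waypoints, pen-up travel between strokes) before being handed to the robot controller.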

Please check out the full paper on the art-making process here, published at the 2022 HCI conference.

Drawing process:

paper @ HCI Conference

Ming Tang’s paper “Human and Machine Symbiosis: An Experiment of Human and Robot Co-creation of Calligraphy-Style Drawing” was published at HCI International 2022, the 24th International Conference on Human-Computer Interaction, in the HCI International 2022 proceedings: Communications in Computer and Information Science, vol. 1580, Springer, Cham.

Tang, M. (2022). Human and Machine Symbiosis – An Experiment of Human and Robot Co-creation of Calligraphy-Style Drawing. In: Stephanidis, C., Antona, M., Ntoa, S. (eds) HCI International 2022 Posters. HCII 2022. Communications in Computer and Information Science, vol 1580. Springer, Cham.

Artificial Intelligence (AI) and robots impact creative jobs such as art-making. Many AI tools now help average users imitate the styles of renowned painters from the past, and convolutional neural networks (CNNs) and generative adversarial networks (GANs) have emerged as methods to “hallucinate” and create expressions of styled drawings. This paper discusses an experiment studying how AI, Automation, and Robots (AAR) will interact with humans and form a unique symbiotic relationship in art-making. Our project, called “robot painter,” established a co-creation in calligraphy-style painting with the following steps: (1) use CNN tools to translate a raster image into a calligraphy-style image; (2) develop an algorithm in Grasshopper and Rhino for the KUKA robot, a generative tool that allowed the artist to translate the image into a parametrically controlled 3D toolpath for a robotic arm; (3) the KUKA robot executed the art-making by holding a paintbrush and completing the painting with customized stroke, force, and angle on a canvas.
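The “customized stroke, force, and angle” of step (3) can be made concrete with a toy mapping. The project computes these parameters in its Grasshopper/Rhino pipeline; the linear mappings below are invented purely for illustration, assuming that darker marks press the brush deeper and more vertically onto the canvas.

```python
# Hypothetical illustration of step (3): deriving a brush plunge depth
# (force) and tilt angle for a waypoint from the local ink darkness.
# The real 3D toolpath is computed parametrically in Grasshopper/Rhino;
# these linear mappings are assumptions for illustration only.

def brush_parameters(darkness, max_plunge_mm=3.0, max_tilt_deg=30.0):
    """Map ink darkness in [0, 1] (1.0 = fully black) to a plunge depth
    in millimetres and a brush tilt in degrees."""
    darkness = min(max(darkness, 0.0), 1.0)      # clamp to [0, 1]
    plunge = darkness * max_plunge_mm            # darker = more force
    tilt = (1.0 - darkness) * max_tilt_deg       # lighter = more slanted
    return plunge, tilt


if __name__ == "__main__":
    for d in (0.0, 0.5, 1.0):
        plunge, tilt = brush_parameters(d)
        print(f"darkness={d:.1f} -> plunge={plunge:.1f} mm, tilt={tilt:.0f} deg")
```

Attaching a (plunge, tilt) pair to every waypoint is what lets a single vector path reproduce the varying weight of a calligraphic line.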

In conclusion, the paper argues that AAR makes human intervention and co-creation possible. The ability of A.I. and robots to mimic artists’ expressions has undoubtedly reached a convincing level and will affect art-making in the years to come.

Project details: Robotic Art


Collaborative teaching to study the social impact of A.I., Automation, and Robots

Funded by a UC Forward Course Development grant: The Future of (no) Work and Artificial Intelligence.

(Award funding rescinded due to the COVID-19 budget freeze.) 2020

Amount: $15,000. The courses will be offered in the Fall 2020 semester.

  • Ming Tang, Associate Professor, School of Architecture and Interior Design, DAAP
  • Cory Haberman, Assistant Professor and Director of the Institute of Crime Science, CECH
  • Tamara Lorenz, Assistant Professor (jointly appointed in Psychology, Mechanical Engineering, and Electrical Engineering), Department of Psychology, College of A&S.

Self-portrait drawing, generated with Google’s machine-learning program “DeepDream” and painted by a KUKA robot. By Ming Tang. 2018

Historically, computerization has primarily been confined to manual and cognitive routine tasks involving explicit rule-based activities. With technology, computers, and robots rapidly improving, analysts predict that many jobs will be replaced by automation and machinery. A 2013 study by Frey and Osborne predicted that in the coming decades 47% of current jobs would be at high risk of being replaced by machines; some job sectors, notably automobile manufacturing, have already been heavily impacted by computerization in factories (Frey and Osborne). Many experts argue that routine, low-skill, and physical jobs will disappear in the coming decades as Artificial Intelligence (A.I.) and robotics technology grows. Even some non-routine, creative jobs such as writing, art, and music will be impacted by computerization and algorithms. Among these non-routine jobs, there has been work and research toward the “simulated human” since the Turing test. Simply put, the goal is to produce output that cannot be distinguished as being created by a human or a computer.
The ability of A.I. and robots to mimic human decision-making will undoubtedly affect jobs in the years to come. Our team believes we are entering a time when A.I. and human-robot collaboration create concurrently, and we should incorporate these possibilities into our curriculum to study how people will perceive robots in the future world and how our behaviors might be influenced by these autonomous technologies. To prepare our students for future challenges, it is essential to create a simulated future world to study how A.I., Automation, and Robots (AAR) will interact with humans and form a new symbiotic relationship.
The faculty team will break the simulated future world into a two-stage process. The first stage uses virtual reality (VR) to develop an immersive digital environment populated with advanced AI and robots to simulate future living and working environments; we will model various humanoid robots (field robots, social robots) and humanoid police robots (UAVs, industrial robots). The second stage immerses students in these future-world scenarios and tests human reactions to AAR through VR. In this stage, the faculty team will work together to acquire IRB approval and create a data-collection plan with students from the three courses. The team has investigated this two-stage approach and will set up several shared seminars and presentations to promote student dialogue in the Fall 2020 semester.

Team and roles

Prof. Tang has taught virtual reality, robotics for digital fabrication, and applied A.I.-controlled agents for wayfinding studies at DAAP. Through ARCH 4001, his 4th-year architecture design studio, he will lead the scenario design for the future working and living spaces and create VR environments for the other two courses.
Prof. Cory Haberman uses quantitative methods to understand spatial-temporal crime patterns and mixed-methods research to advance evidence-based policing, with particular interests in crime analysis, hot-spots policing, and focused deterrence. Through his CJ7010-001: Seminar in Criminal Justice course, he will lead students in evaluating the potential for criminal behavior in a future world shaped by policing robots, UAVs, and A.I.-controlled surveillance systems.
Prof. Tamara Lorenz explores the dynamics of human interaction and their relevance and usability for Human-Robot Interaction (HRI). Her focus areas are rehabilitation engineering and HRI in manufacturing. She is also interested in human-machine interaction in general, with applications to human factors in daily life and work surroundings. Through her PSYC 3043 – HUMAN PERFORMANCE course, she will lead students in studying human behavior toward robots in future working and living environments.


The goal of the proposed project is to integrate three courses from the existing curricula to promote interdisciplinary collaboration that proactively enhances the understanding of how A.I., Automation, and Robots (AAR) can impact human behavior. We aim to teach students human-behavior study methods and let them experience a possible future world through VR. Collectively, the team will investigate how human decisions may be influenced by the robots, the autonomous environment, or both.
The following objectives will be achieved through designated coursework:
Objective 1: To understand the fundamentals of AAR technology and its applications in future scenarios.
Objective 2: To investigate the human perception and behavior towards the AAR through VR.

Objective 3: To understand the symbiosis of humans and A.I. in the new era through human-robot interaction (HRI) study.


The collaborative courses will target both undergraduate and graduate students and encourage them to explore, discover, discuss, and construct research related to AAR. Major activities associated with teaching, research, and scenario development will span three colleges: A&S, CEAS, and DAAP. Three courses from the curricula are calibrated to formulate a coherent investigation of the future world in alignment with the designated objectives.
• Course 1: ARC4001 – capstone design studio, future world design, and VR simulation. Offered by Ming Tang in Fall Semester 2020.
• Course 2: PSYC 3043 – HUMAN PERFORMANCE, Offered by Tamara Lorenz in Fall Semester 2020.
• Course 3: CJ7010-001: Seminar in Criminal Justice. Offered by Cory Haberman in Fall Semester 2020.

Robotic drawing

Robot-controlled drawing: some experiments at DAAP, UC. Check more info at the Robotic Lab.



Robotic Drawing and Milling

Experiments using a KUKA robot to draw and mill. Check more info at the Robotic Lab.