Digital Human
The digital human project explores (1) high-fidelity digital human modeling and motion capture, including full-body and facial motion capture, and (2) large language model integration with digital humans, powered by ChatGPT and OpenAI.
Reed Gallery featuring work by DAAP faculty and staff. 02.28.2022
Human and Machine Symbiosis: portrait drawing with AI and robot. By Ming Tang.
Check out the 3D tour of the virtual gallery.
This work explores how artificial intelligence and robots interact with humans and form a unique symbiotic relationship in painting. The co-creation followed these steps: (1) used machine learning and a deep neural network algorithm to translate a picture of a human face into a styled image; (2) translated the face’s representation into a parametrically controlled toolpath for the KUKA robot; (3) a KUKA robot executed the art-making by holding a paintbrush.
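Step (2), turning a styled image into a robot toolpath, can be illustrated with a minimal sketch. The function below is an assumption for illustration only (the project's actual pipeline is built in Grasshopper/Rhino): it scans a binarized calligraphy image row by row and emits runs of dark pixels as straight stroke segments that a toolpath generator could consume.

```python
# Hypothetical sketch: convert a binarized calligraphy image into simple
# straight-line stroke segments. The function name, threshold, and segment
# format are illustrative assumptions, not the project's Grasshopper pipeline.

def image_to_segments(image, threshold=128):
    """Scan each row and emit (row, col_start, col_end) runs of dark pixels.

    `image` is a 2D list of grayscale values (0 = black ink, 255 = white).
    Each run approximates one horizontal brush-stroke segment.
    """
    segments = []
    for y, row in enumerate(image):
        start = None
        for x, value in enumerate(row):
            dark = value < threshold
            if dark and start is None:
                start = x                           # stroke begins
            elif not dark and start is not None:
                segments.append((y, start, x - 1))  # stroke ends
                start = None
        if start is not None:                       # stroke runs to the row edge
            segments.append((y, start, len(row) - 1))
    return segments


# Tiny 3x5 test image: a single dark run in the middle row.
img = [
    [255, 255, 255, 255, 255],
    [255,   0,   0,   0, 255],
    [255, 255, 255, 255, 255],
]
print(image_to_segments(img))  # [(1, 1, 3)]
```

A real pipeline would instead trace vector contours and add brush pressure and angle per segment, but the idea is the same: geometry extracted from the styled image becomes parametric motion targets for the robot arm.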
Please check out the full paper on the art-making process here, published in the 2022 HCI International proceedings.
Ming Tang’s paper, “Human and Machine Symbiosis: An Experiment of Human and Robot Co-creation of Calligraphy-Style Drawing,” was published at HCI International 2022, the 24th International Conference on Human-Computer Interaction, in Communications in Computer and Information Science, vol. 1580, Springer, Cham. https://doi.org/10.1007/978-3-031-06417-3_62

Tang, M. (2022). Human and Machine Symbiosis – An Experiment of Human and Robot Co-creation of Calligraphy-Style Drawing. In: Stephanidis, C., Antona, M., Ntoa, S. (eds) HCI International 2022 Posters. HCII 2022. Communications in Computer and Information Science, vol 1580. Springer, Cham. https://doi.org/10.1007/978-3-031-06417-3_62
Artificial intelligence (AI) and robots impact creative jobs such as art-making. Many AI tools now assist average users in imitating the styles of renowned painters from the past. Convolutional neural networks (CNNs) and generative adversarial networks (GANs) have emerged as methods to “hallucinate” and create expressions of styled drawings. This paper discusses an experiment studying how AI, automation, and robots (AAR) interact with humans and form a unique symbiotic relationship in art-making. Our project, called “robot painter,” established a co-creation in calligraphy-style painting with the following steps: (1) use CNN tools to translate a raster image into a calligraphy-style image; (2) develop an algorithm in Grasshopper and Rhino for the KUKA robot, a generative tool that allows the artist to translate the image into a parametrically controlled 3D toolpath for a robotic arm; (3) have the KUKA robot execute the art-making by holding a paintbrush and completing the painting with customized stroke, force, and angle on a canvas.
In conclusion, the paper argues that AAR makes human intervention and co-creation possible. The ability of AI and robots to mimic artists’ expressions has undoubtedly reached a convincing level and will affect art-making in the years to come.
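The CNN style-transfer step above typically captures a painting style through Gram-matrix correlations between feature maps, as in the Gatys et al. approach; whether this project used that exact formulation is an assumption. A minimal pure-Python sketch of the Gram-matrix computation, with made-up feature values:

```python
# Hypothetical sketch of the Gram-matrix style representation commonly used
# in CNN style transfer. The feature maps here are tiny made-up lists; a real
# pipeline would take them from a pretrained CNN layer.

def gram_matrix(features):
    """features: a list of C flattened feature maps, each of length N.

    Returns the C x C matrix G where G[i][j] = f_i . f_j (dot product),
    encoding which feature channels fire together -- the "style".
    """
    c = len(features)
    return [[sum(a * b for a, b in zip(features[i], features[j]))
             for j in range(c)]
            for i in range(c)]


# Two made-up 3-element feature maps.
feats = [[1.0, 0.0, 2.0],
         [0.0, 1.0, 1.0]]
print(gram_matrix(feats))  # [[5.0, 2.0], [2.0, 2.0]]
```

Matching the Gram matrices of a generated image to those of a style image (while matching raw features of a content image) is what lets the network “hallucinate” a calligraphy-styled version of an arbitrary photo.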


Grant received: 
Funded by a UC Forward Course Development grant: The Future of (no) Work and Artificial Intelligence.
(Award funds rescinded due to the COVID-19 budget freeze.) 2020
Amount: $15,000. The courses were to be offered in the Fall 2020 semester.

Self-portrait drawing, styled with Google’s “DeepDream” machine-learning algorithm and painted by a KUKA robot. By Ming Tang. 2018
