Ihsan Community Center Design and Renovation

Grant: “Ihsan Community Center Design”. Sponsored studio. $5,000. PI: Tang

The “Ihsan Community Center Design and Renovation” studio was sponsored by the Ihsan community. The studio explored design possibilities and engaged the Ihsan community in developing future visions. The redesign and renovation of the current Ihsan community center emphasized a logical organization of architectural spaces, including a prayer area, classrooms for educational programs, a multi-purpose hall, and outdoor recreation. The project was developed by optimizing the current structure, reorganizing its spaces, and building a vision for the land and building that will instill a sense of pride in its Islamic architecture and encourage participation from youth and adults alike.

DAAP team: Kristian Van Wiel, Maddie Cooke, Nick Chism, Seanna LaGanke, Ummul Buhari-Mohammed, Amy Cui. Faculty: Ming Tang. Advisors: Xiangbin Xu, Yongquan Chen

UC DAAP students presented their design to the Ihsan Community members. Photograph by Xiangbin Xu


Community engagement: presentation at the Ihsan community.

The proposed design was modeled in Autodesk BIM 360 and Revit, and evaluated through immersive VR with an Oculus Quest.


Thanks for all the support from the Ihsan Community, the School of Architecture and Interior Design, and Autodesk. Thanks also to the Islamic Center of Greater Cincinnati (ICGC), Clifton Mosque, and the Islamic Center of Mason for hosting field trips.

For more information on this project, please contact the Ihsan Community or Prof. Ming Tang at UC DAAP.

Paper at Artificial Realities Conference


Cyber-Physical Experiences: Architecture as Interface

Turan Akman and Ming Tang’s paper “Cyber-Physical Experiences: Architecture as Interface” was presented at the Artificial Realities: Virtual as an Aesthetic Medium for Architectural Ideation symposium in Lisbon, Portugal, in 2019.

Abstract:

Conventionally, architects have relied on the qualities of elements such as materiality, light, and solids and voids to break out of the static nature of space and enhance the way users experience and perceive architecture. Even though some of these elements and methods helped create more dynamic spaces, architecture is still bound by the conventional constraints of the discipline. With the introduction of technologies such as augmented reality (AR), it is becoming easier to blend digital and physical realities and create new types of spatial qualities and experiences, especially when AR is combined with virtual reality (VR) early in the design process. Even though these emerging technologies cannot replace the primary and conventional qualitative elements in architecture, they can be used to supplement and enhance the experience and qualities architecture provides.

To explore how AR can enhance the way architecture is experienced and perceived, and how VR can be used to enhance the effects of these AR additions, the authors proposed a hybrid museum that integrated AR with conventional analog methods (e.g., materiality and light) to mediate spatial experiences. To evaluate the proposed space, the authors also created a VR walkthrough and collected quantifiable data on the spatial effects of these AR additions.

Akman, T., and Tang, M. “Cyber-Physical Experiences: Architecture as Interface.” Artificial Realities: Virtual as an Aesthetic Medium for Architectural Ideation Symposium, Lisbon, Portugal, 2019.


Collaborative teaching to study the social impact of A.I., Automation, and Robots

Funded by a UC Forward Course Development grant: “The Future of (no) Work and Artificial Intelligence.”

(Award funding rescinded due to the COVID-19 budget freeze.) 2020.

Amount: $15,000. The courses will be offered in the Fall semester of 2020.

  • Ming Tang, Associate Professor, School of Architecture and Interior Design, DAAP
  • Cory Haberman, Assistant Professor and Director of the Institute of Crime Science, CECH
  • Tamara Lorenz, Assistant Professor, jointly appointed in Psychology, Mechanical Engineering, and Electrical Engineering. Department of Psychology, College of A&S.

Self-portrait drawing, generated by the Google machine-learning algorithm “DeepDream” and painted by a Kuka robot. By Ming Tang. 2018.

Historically, computerization has primarily been confined to manual and cognitive routine tasks involving explicit rule-based activities. With technology, computers, and robots rapidly improving in our modern age, analysts are predicting that many jobs will be replaced by automation and machinery. A 2013 study by Frey and Osborne predicted that in the coming decades 47% of current jobs would be at high risk of being replaced by machines. Some job sectors, namely automobile manufacturing, have already been heavily impacted by computerization in factories (Frey and Osborne). Many experts argue that routine, low-skill, physical jobs will disappear in the coming decades as artificial intelligence (A.I.) and robotics technology grow. Even some “non-routine, creative” jobs such as writing, art, and music will be impacted by computerization and algorithms. Among these non-routine jobs, there has been work and research towards the “simulated human” since the Turing test. Simply put, the goal is to make output that cannot be distinguished as being created by a human or a computer.
The ability of A.I. and robots to mimic human decision making will undoubtedly affect jobs in the years to come. Our team believes we are progressing into a time when A.I. and humans create concurrently through human-robot collaboration, and we should embrace these possibilities in our curriculum to study how robots will be perceived in the future world and how our behaviors might be impacted by these autonomous technologies. To prepare our students for future challenges, it is essential to create a simulated future world to study how A.I., Automation, and Robots (AAR) will interact with humans and form a new symbiotic relationship.
The faculty team will structure the simulated future world as a two-stage process. The first stage is using virtual reality (VR) to develop an immersive digital environment populated with advanced A.I. and robots to simulate future living and working environments. We will model various robots, including humanoid field and social robots, police robots, UAVs, and industrial robots. The second stage is to immerse students in these future-world scenarios and test human reactions towards AAR through VR. In this stage, the faculty team will work together to acquire IRB approval and create a data collection plan with students from the three courses; a minimal sketch of a shared logging format is shown below. The team has investigated this two-stage approach and will set up several shared seminars and presentations to promote student dialogue in the Fall semester of 2020.
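
Because the three courses would pool behavioral data from the same VR scenarios, a common, minimal logging format would help. The Python sketch below shows one way timestamped participant events could be recorded for later analysis; the VRTrialLogger class, its field names, and the example events are illustrative assumptions, not specifications from the project.

import csv
import time

class VRTrialLogger:
    """Hypothetical sketch: one shared, analysis-ready event log per VR session."""

    FIELDS = ["timestamp", "participant_id", "scenario", "event", "value"]

    def __init__(self, path):
        self.file = open(path, "w", newline="")
        self.writer = csv.DictWriter(self.file, fieldnames=self.FIELDS)
        self.writer.writeheader()

    def log(self, participant_id, scenario, event, value=""):
        # One row per event, e.g. a robot encounter or a participant response.
        self.writer.writerow({
            "timestamp": time.time(),
            "participant_id": participant_id,
            "scenario": scenario,
            "event": event,
            "value": value,
        })

    def close(self):
        self.file.close()

# Example usage: one participant walking through a policing-robot scenario.
logger = VRTrialLogger("trial_data.csv")
logger.log("P01", "police_robot_patrol", "robot_encounter", "UAV")
logger.log("P01", "police_robot_patrol", "participant_response", "avoided")
logger.close()

A flat event log like this keeps the courses decoupled: each instructor can analyze the same CSV with their own discipline’s methods.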

Team and roles

Prof. Tang has taught virtual reality and robotics for digital fabrication, and has applied A.I.-controlled agents for wayfinding studies in DAAP. Through ARCH 4001, his 4th-year architecture design studio, he will lead the scenario design for the future working and living space and create VR environments for the other two courses.
Prof. Cory Haberman uses quantitative methods to understand spatial-temporal crime patterns. He also uses mixed-methods research to advance evidence-based policing, with particular interests in crime analysis, hot-spots policing, and focused deterrence. Through his CJ7010-001: Seminar in Criminal Justice course, he will lead students in evaluating the potential for criminal behavior in a future world impacted by policing robots, UAVs, and A.I.-controlled surveillance systems.
Prof. Tamara Lorenz explores the dynamics of human interaction and their relevance and usability for Human-Robot Interaction (HRI). Her focus areas are rehabilitation engineering and HRI in manufacturing. She is also interested in exploring human-machine interaction in general, with applications to human factors in daily life and work surroundings. Through her PSYC 3043 – Human Performance course, she will lead students in studying human behavior towards robots in the future working and living environment.

Objective

The goal of the proposed project is to integrate three courses from the existing curricula to promote interdisciplinary collaboration that proactively enhances the understanding of how A.I., Automation, and Robots (AAR) can impact human behavior. We hope both to teach students human behavior study methods and to let them experience a possible future world through VR. Collectively, the team will investigate how human decisions may be influenced by robots, by autonomous environments, or by both.
The following objectives will be achieved through designated coursework:
Objective 1: To understand the fundamentals of AAR technology and its applications in future scenarios.
Objective 2: To investigate human perception of and behavior towards AAR through VR.

Objective 3: To understand the symbiosis of humans and A.I. in the new era through human-robot interaction (HRI) studies.

Courses

The collaborative courses will target both undergraduate and graduate students and encourage them to explore, discover, discuss, and construct research related to AAR. Major activities associated with teaching, research, and scenario development will involve collaboration across three colleges: A&S, CEAS, and DAAP. Three courses from the curricula are calibrated to formulate a coherent investigation of the future world in alignment with the designated objectives.
• Course 1: ARCH 4001 – capstone design studio: future world design and VR simulation. Offered by Ming Tang in the Fall semester of 2020.
• Course 2: PSYC 3043 – Human Performance. Offered by Tamara Lorenz in the Fall semester of 2020.
• Course 3: CJ7010-001 – Seminar in Criminal Justice. Offered by Cory Haberman in the Fall semester of 2020.

An interdisciplinary approach to using eye-tracking technology for design and behavioral analysis

Tobii eye-tracking workshop offered at CGC, DAAP, UC. 2019.

Eye-tracking devices measure eye position and eye movement, allowing documentation of how environmental elements draw the attention of viewers. In recent years, there have been a number of significant breakthroughs in eye-tracking technology, with a focus on new hardware and software applications. Wearable eye-tracking glasses, together with advances in video capture and virtual reality (VR), provide advanced tracking capability at greatly reduced prices. Given these advances, eye trackers are becoming important research tools in the fields of visual systems, design, psychology, wayfinding, cognitive science, and marketing, among others. Currently, eye-tracking technologies are not readily available at UC, perhaps because of a lack of familiarity with this new technology and how it can be used for teaching and research, or the perception of a steep learning curve for its application.
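
As a concrete illustration of how raw gaze samples become attention measures, the Python sketch below implements a simple dispersion-threshold (I-DT style) fixation detector. The sample format and thresholds are illustrative assumptions, not the Tobii Pro Lab pipeline; Tobii’s own software ships its own fixation filters.

def detect_fixations(samples, max_dispersion=30.0, min_samples=6):
    """Group consecutive gaze samples into fixations (I-DT style sketch).

    samples: list of (x, y) gaze points, in screen pixels.
    max_dispersion: max (x-range + y-range) of a fixation window, in pixels.
    min_samples: minimum window length (e.g. 6 samples ~ 100 ms at 60 Hz).
    Returns a list of (start_index, end_index, centroid_x, centroid_y).
    """
    def dispersion(window):
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations, i = [], 0
    while i + min_samples <= len(samples):
        if dispersion(samples[i:i + min_samples]) <= max_dispersion:
            # Grow the window while the samples stay tightly clustered.
            end = i + min_samples
            while end < len(samples) and dispersion(samples[i:end + 1]) <= max_dispersion:
                end += 1
            window = samples[i:end]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((i, end - 1, cx, cy))
            i = end
        else:
            i += 1
    return fixations

# Example: a brief fixation, then a saccade to a second fixation.
gaze = [(100, 100), (102, 101), (99, 103), (101, 100), (100, 102), (103, 101),
        (400, 300), (402, 299), (401, 301), (399, 300), (400, 302), (401, 300)]
print(detect_fixations(gaze))

Fixation counts and durations per region of interest are the kind of quantifiable attention data the projects above refer to.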

It has become clear to our UC faculty team that research and teaching will benefit significantly from utilizing these cutting-edge tools. It is also clear that a collective approach to acquiring the eye-tracking hardware and software, and to training faculty on its application, will ultimately result in greater faculty collaboration, with the consequent benefits of interdisciplinary research and teaching.

The primary goals of the proposed project are to provide new tools for research and teaching that benefit from cutting-edge eye-tracking technologies, involving interdisciplinary groups of UC faculty and students. The project will enhance the knowledge base among faculty and allow new perspectives on how to integrate eye-tracking technology. It will promote interdisciplinarity across the broader UC community.

Grant:

  • “Eye-tracking technology”. Provost Group / Interdisciplinary Award. $19,940. PI: Tang. Co-PI: Auffrey, Hildebrandt. UC. Faculty team: Adam Kiefer, Michael A. Riley, Julian Wang, Jiaqi Ma, Juan Antonio Islas Munoz, Joori Suh. 2018
  • DAAP matching grant. $11,000.

Hardware: Tobii Pro Glasses, Tobii Pro VR

Software: Tobii Pro Lab, Tobii VR analysis

Test with Tobii Pro Glasses

Hybrid Construction

A hybrid construction using a HoloLens AR model overlaid on the physical structure. The second half of the video was captured through the MS HoloLens. However, due to the low visibility of the holographic image under sunlight, we were not able to use the AR model to guide the installation. Research to be continued.

Installation. SAID, DAAP, University of Cincinnati.
Base structure by 1st-year SAID students.
Add-on structure + Augmented Reality by ARCH 3014 students.

GA: Robert Peebles, Lauren Meister, Damario Walker-Brown, Jordan Sauer, Daniel Anderi. Faculty: Ming Tang

Video captured with a 360° camera, MS HoloLens, and Fologram. Check out the full installation images here.