VR for Police Training

Develop several fully immersive 3D VR active shooter scenarios that can run on cost-effective commercially available VR hardware.

Develop and Assess Active Shooter Virtual Reality Training for Ohio Law Enforcement. PI: J.C. Barnes. Co-PIs: Tang, Latessa, Haberman. Office of Criminal Justice Services. $50,000. 09.2020–09.2021 ($29,608).

Using Checklists and Virtual Reality to Improve Police Investigations. Collaborative Research Advancement Grants. UC. $25,000. PI: Haberman. Co-PIs: Tang, Barnes. Period: 07.2020–01.2022.

Development of a Virtual Reality Augmented Violence Reduction Training System for Active and Mass Shooting Incidents. PI: Ed Latessa. Co-PIs: J.C. Barnes, Ming Tang, Cory Haberman, Dan Gerard, Tim Sabransky. $10,000. Start-up fund. UC Digital Futures anchor tenant cohort.

 

Protected: Computer Vision of Building Roof Analysis


Open source shading device design tool

Developed by Mark Landis as part of his M.Arch thesis work, 2019, University of Cincinnati.

Thesis Chair: Ming Tang, RA, LEED AP

Committee Members: Pravin Bhiwapurkar, Amanda Webb.

This free, open-source tool introduces building simulation and some of its applications in architecture through plugins for Rhino. It helps designers create shading devices that save the building as much energy as possible while granting the designer enough freedom to determine what form the shades take. The idea is to ensure a shade performs well functionally while allowing the designer to prioritize the other aspects of a shading device that matter to them and their project: how the shades look, how they form a system on the building, how they are built, how much they cost, and more. The tool uses EnergyPlus for energy simulations of the space, and Radiance and Daysim for solar gain and solar position calculations.
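The solar-position side of these simulations can be illustrated with a minimal sketch. This is plain Python using the standard declination and hour-angle approximations, not the actual Radiance/Daysim code; function and parameter names are ours, for illustration only:

```python
import math

def solar_altitude(latitude_deg, day_of_year, solar_hour):
    """Approximate solar altitude angle (degrees) for a given
    latitude, day of year (1-365), and local solar hour (0-24)."""
    # Declination: the sun's tilt relative to the equator (Cooper's formula)
    declination = 23.45 * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year)))
    # Hour angle: the sun moves 15 degrees per hour away from solar noon
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat = math.radians(latitude_deg)
    dec = math.radians(declination)
    ha = math.radians(hour_angle)
    # Standard altitude relation for a sun position on the sky dome
    sin_alt = (math.sin(lat) * math.sin(dec)
               + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_alt))
```

At solar noon on the equinox at the equator this returns roughly 90 degrees, i.e. the sun directly overhead. The shading volume generated in the steps below is essentially a sweep of sun vectors derived from angles like these over the hours when solar gain is unwanted.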

Users can download the simplified tool and use it to create and document shading device iterations that all perform in a similar manner. Each iteration should represent different design goals, aesthetics, material choices, etc., and be justified through a thoughtful narrative.

 

Credit: projects from ARCH3014. Spring semester. 2019. DAAP.UC

Prerequisite:

Watch Tutorial:  Performance-driven design Step 1

Tools to download and install:

The ghpython2.gha file needs to be unlocked and placed into the “special folder / components folder”.

Make your own shading device and create an energy model

Watch Tutorial:  Performance-driven design Step 2

Install: All source files and weather data are here. Please choose the right file based on your Rhino version.

The following notes, Steps 1–15, are already covered in the Step 2 video tutorial above.

Step 1 – Install the required plugins and programs referenced above or mentioned in the videos provided. Then move the EPW file provided to your Documents or Downloads folder (not a subfolder within these locations), or simply to C:/ladybug/.
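Placing the EPW file can also be scripted. A minimal sketch in plain Python (the example file path in the comment is hypothetical; use whatever EPW you were given):

```python
import os
import shutil

def place_epw(epw_path, target_dir="C:/ladybug"):
    """Copy an EPW weather file into the folder the tool expects,
    creating that folder if it does not exist yet."""
    os.makedirs(target_dir, exist_ok=True)
    # Keep the original file name unchanged, as the tutorial requires
    dest = os.path.join(target_dir, os.path.basename(epw_path))
    shutil.copy2(epw_path, dest)
    return dest

# Hypothetical usage:
# place_epw("Downloads/USA_OH_Cincinnati.epw")
```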

Step 2 – Open the Rhino file provided. Do not change any of the layer names or delete anything on the “Baseline” Model layer.

Step 3 – Open the provided Grasshopper script. If you get a message about any components not being recognized, take note of which plugin they are associated with and try reinstalling that plugin. Upon successfully opening the script, you will be prompted to open an EPW file. Select the EPW provided to you.

Step 4 – You should see a dialog box called “Shade Designer” with five steps under tabs, plus a debug tab. You do not need to interact with the Grasshopper script from this point on. Instructions are stated in the tool, but they are reiterated here as well.

Step 5 – Run the baseline model simulation by toggling the switch in the Step 1 tab. You should briefly see a command prompt with scrolling numbers appear. The process should take less than a minute. If this does not occur, make sure you installed OpenStudio on your C drive as instructed in the tutorial videos, and ensure you are using the EPW file provided without changing its file name.

Step 6 – Advance to the Step 2 tab in the dialog box. On the “set profile to shade” layer, draw one closed, planar, non-curvilinear polyline representing the area you want to shade. This should cover the whole window, but it can be approached in a number of ways: you could draw an outline around the whole window and shade it all at once, or you could draw a shape meant to be tessellated across the window, such as a smaller square that is copied to shade the whole window. Every unique shape must be shaded separately, so only run one polyline at a time. If you intend to shade the window using a pattern with more than one shape, return to this step after solving each portion of the window. Once you have drawn your profile, toggle the switch in the Step 2 tab. Be patient; this will take 30 seconds to a minute to update. For more clarification, or to see this process in action, see the provided tutorial video.

Step 7 – Once your computer has updated after Step 6, you should see the preview of a red form attached to the profile you drew. This is referred to as the shading volume. It represents the volume of sunlight that the design tool has determined should not be allowed into the space, for the purpose of saving energy. If you do not see this form, try incrementally changing the division numbers in the debug tab. If this still does not work after a number of attempts, your profile to shade may be too complex, non-planar, curvilinear, or not closed, or it may consist of more than one curve. Check that none of these are the case and, as a last resort, make your profile simpler. Toggle on and off the option to bake the shade volume, on the Step 3 tab, to make future steps simpler.

Step 8 – Use the shade volume you baked to create a shading device that blocks the entire shade volume from the window. Any form that completely or mostly cuts this volume off will perform similarly to any other iteration, giving you freedom to design. You could use the baked shade volume to trim an extrusion that goes up and over the window, leaving just the intersection to block negative solar gains, or you could draw your own interpretation that closely follows the form of the shade volume but is not literally trimmed from it. Below is a series of designs that all do the same thing without looking the same. If your profile to shade was a shape meant to tessellate across the whole window, you can design a trimmed extrusion for just that shape and then copy the shading device to cover the window. If there are any other unique shapes in the screen, return to the set-profile step and repeat until your whole window has been covered. See the tutorial videos for additional clarification.

Step 9 – Move the shading device or devices to the “shading devices” layer, toggle the option to set them in the energy model, and then toggle the switch to run the second simulation. You will see the dialog box appear again, and the program will gather results for the baseline model with the addition of your shading devices.

Step 10 – You should now see results in the “Step 5” tab. Record these numbers in the provided Excel sheet. The Excel sheet will give you the numbers you need for the assignment you will turn in, and it helps me gather data on all of your projects more easily.

Step 11 – Once you have your basic design, think of your shade in a realistic context. Add elements (structural, material, or otherwise) to make your design more architecturally feasible. Think about what the shade is made of: is it wood, metal, concrete, or plastic? How big are the wood pieces or metal panels, and how thick is the structure? What type of wood, what color finish, what texture? Is it spanning too far to exist without supports? What does the structure look like? Model these elements and materials to take your paper-thin design and give it some reality. Set materials in Rhino and model elements with appropriate thicknesses. Create your own layers for these designs, apart from the ones provided, so as not to interfere with any future simulations.

Step 12 – Set your view to “render view” in the named views provided (see image below), then render this view; the presets are already established. If for some reason your design gets cut off in this view, you may adjust the camera to make it fit. Save this image as a PNG.

Step 13 – Use the Make2D command on your design and baseline model from the render view, and export the generated linework as a DWG.

Step 14 – Open the Illustrator file provided and replace the images and text with your own. Go to “File” and then “Place” to insert your PNG and DWG. The PNG should be imported at 100% of its original size, with the bounding box aligned to the left side and touching the top and bottom edges. The DWG should sit on top of the image, outlining your design, and be imported at 39.61% to match the exact size of the image. Change the line weights of all the lines in the DWG to 2 pt. Use the numbers in the Excel file to add performance data at the bottom right of each sheet. Write a brief narrative for each design explaining what it is and why you designed it the way you did.
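The 39.61% placement scale is simply the ratio that makes the Make2D linework match the PNG placed at 100%. If the sheet setup ever changes, the percentage can be recomputed from the two widths. A minimal sketch (the example widths are hypothetical, not the template's actual dimensions):

```python
def place_scale_percent(image_width, dwg_width):
    """Percent scale to apply to the placed DWG so its width
    matches the PNG placed at 100% of its original size."""
    return image_width / dwg_width * 100.0

# Hypothetical example: linework 1000 pt wide over an image 396.1 pt wide
# would need the DWG placed at 39.61%.
scale = place_scale_percent(396.1, 1000.0)
```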

Step 15 – Repeat Steps 6 to 14 so that you have a total of three unique iterations. Use this opportunity to test different patterns, shading types, and material qualities; to play with scenarios concerning building program (should a shade take a different form for an office vs. a store vs. an apartment?); or to explore any number of other conditions.


Additional Resources

Q&A / Troubleshooting

Q: Why is my HB simulation not running? (The component is red in GH.)

A: Check your weather data EPW file. Move it to the C:/ladybug folder. Relaunch Rhino and GH, then point to the new location.

Check your HB and LB files in the “special folder / user object folder” and make sure they are “unlocked.”

Check your ghpython2.gha. This file needs to be unlocked and placed into the “special folder / components folder.”

Q: Why are there strange horizontal projections in my 3D mask?

A: If you are using Rhino 5, open the Rhino 6 version of Grasshopper. 

article in IJSW journal

Ming Tang’s paper, “Analysis of Signage Using Eye-Tracking Technology,” was published in the Interdisciplinary Journal of Signage and Wayfinding, 02.2020.

Abstract

Signs, in all their forms and manifestations, provide visual communication for wayfinding, commerce, and public dialogue and expression. Yet, how effectively a sign communicates and ultimately elicits a desired reaction begins with how well it attracts the visual attention of prospective viewers. This is especially the case for complex visual environments, both outside and inside of buildings. This paper presents the results of an exploratory research design to assess the use of eye-tracking (ET) technology to explore how placement and context affect the capture of visual attention. Specifically, this research explores the use of ET hardware and software in real-world contexts to analyze how visual attention is impacted by location and proximity to geometric edges, as well as elements of contrast, intensity against context, and facial features. Researchers also used data visualization and interpretation tools in augmented reality environments to anticipate human responses to alternative placement and design. Results show that ET methods, supported by the screen-based and wearable eye-tracking technologies, can provide results that are consistent with previous research of signage performance using static images in terms of cognitive load and legibility, and ET technologies offer an advanced dynamic tool for the design and placement of signage.

Issue

ACKNOWLEDGMENT
The research project is supported by the Strategic Collaborative/Interdisciplinary Award of the University of Cincinnati. Thanks to Professor Christopher Auffrey and to students from ARCH7014 (Fall 2019), ARCH8001 (Spring 2019), and ARCH4001 (Fall 2018) at the University of Cincinnati for their support.

For more information on the wearable ET, screen-based ET, and VR-ET, please check out our research website, or contact Prof. Tang.

 

publication in Urban Rail Transit journal

Paper published in the Urban Rail Transit journal

This paper describes an innovative integration of eye-tracking (ET) with virtual reality (VR) and details the application of these combined technologies to the adaptive reuse redesign of the Wudaokou rail station in Beijing. The objective of the research is to develop a hybrid approach, combining ET and VR technologies, as part of an experimental study of how to improve wayfinding and pedestrian movement in crowded environments such as those found in urban subway stations during peak hours. Using ET analysis, design features such as edges and color contrast are used to evaluate several proposed rail station redesigns. Through VR and screen-based ET, visual attention and related spatial responses are tracked and analyzed for the selected redesign elements. This paper assesses the potential benefits of using ET and VR to assist identification of station design elements that will improve wayfinding and pedestrian movement, and describes how the combination of VR and ET can influence the design process. The research concludes that the combination of VR and ET offers unique advantages for modeling how the design of rail transit hub interiors can influence the visual attention and movement behavior of those using the redesigned station. This is especially true for crowded conditions in complex interior spaces. The use of integrated ET and VR technology is shown to inform innovative design approaches for facilitating improved wayfinding and pedestrian movement within redesigned rail stations.

Full paper: download PDF, read HTML

Check out Tang’s eye-tracking research with transit hub design studio ARCH4002, Spring 2018.