Monday, July 13, 2015

Week 8 - AppsGIS - Damage Assessment

This week we looked at damage assessment, using data from Hurricane Sandy. We started by mapping the path of the hurricane, showing the category of the storm, barometric pressure, and wind speed at different points along the track. The map shows the countries, as well as the states within the United States, impacted by the storm.

We then "prepared" the data by adding in aerial photos of before and after the storm, so that the images could be compared. We did this using the flicker and slider tools in the Effects Toolbar. We also used Editing sessions to add attribute data, using attribute domains and changing their properties to utilize both codes and descriptions (i.e. for Structure Type: 1 - Residential, 2 - Government, 3 - Industrial, 4 - Unknown). 

A major part of our analysis was creating new damage data for the imagery. An edit session was started for a newly created feature class (with the domains created beforehand). Using the pre-storm imagery, buildings were selected and their domain-controlled attributes set to describe each building. The categories were structure damage, wind damage, inundation, and structure type. To do this, I used the Create Features window to select the point template, and a point was placed on a building within our study area. I then clicked on the Attributes tab so that the attributes could be adjusted according to the evaluation. The Swipe tool was used to compare the before and after imagery.

I digitized all of the buildings in the study area this way, changing the attributes after placement to reflect the post-storm imagery. The resulting points were categorized based on structure damage level:



 I chose to fill in all of the values for each point, not just structure damage level.
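The points themselves were placed by hand in the edit session, but for the record, the equivalent arcpy operation is an insert cursor that writes each point geometry together with its attribute values. A rough sketch, with made-up coordinates and field names that mirror the four categories:

    import arcpy

    fc = r"C:\data\SandyDamage.gdb\DamagePoints"   # hypothetical feature class
    fields = ["SHAPE@XY", "Struc_Type", "Struc_Damage", "Wind_Damage", "Inundation"]

    # One row per digitized building: its location plus the four coded attributes
    with arcpy.da.InsertCursor(fc, fields) as cursor:
        # Placeholder coordinate and codes; the codes follow the attribute domains
        cursor.insertRow([(-74.106, 40.582), 1, 2, 1, 1])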

Afterwards, we used a new polyline feature class to show the location of the coastline prior to the storm. A simple straight line was used because we wanted to show the number of buildings affected (and how) in relation to the coastline (e.g., how many buildings within 100 meters of the coast have major damage?). To determine the number of buildings within each distance band, I used the Select By Location tool to select the parcels within that distance. For 0-100 meters, I did a simple "select features from" selection, with the Coastline layer as the source layer and "are within a distance of the source layer" as the selection method. For both the 100-200 and 200-300 meter bands, I did a "select features from" with the maximum distance, followed by a "remove from the current selection" for values less than the intended range. The final results were exported to a new layer. To determine the number of buildings at each structure damage level, I used Select By Attributes within each new layer's attribute table. This was done so that patterns in the damage could be examined.
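The same distance-band selections can be reproduced with Select Layer By Location in arcpy. This is just a sketch of the 100-200 meter band; the layer and field names ("Parcels", "Coastline", "Struc_Damage") are assumptions, not the exact lab names:

    import arcpy

    # Select everything within 200 m of the coastline, then remove the first 100 m
    arcpy.management.SelectLayerByLocation("Parcels", "WITHIN_A_DISTANCE",
                                           "Coastline", "200 Meters",
                                           "NEW_SELECTION")
    arcpy.management.SelectLayerByLocation("Parcels", "WITHIN_A_DISTANCE",
                                           "Coastline", "100 Meters",
                                           "REMOVE_FROM_SELECTION")

    # Export the band to its own layer, then count buildings at one damage level
    arcpy.management.CopyFeatures("Parcels", r"C:\data\Parcels_100_200m")
    arcpy.management.SelectLayerByAttribute("Parcels", "SUBSET_SELECTION",
                                            "Struc_Damage = 4")   # hypothetical code
    print(arcpy.management.GetCount("Parcels").getOutput(0))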
 

After we did the above, we combined the spatial assessment data with the parcel data so that each damage point's attributes could be matched to the corresponding parcel record. This was accomplished using the Attribute Transfer tool, and was done for approximately one third of the buildings within the study area.
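We did this interactively with the Attribute Transfer tool, but a scripted alternative (not what we did in the lab) would be a spatial join that carries each parcel's attributes onto the damage point that falls inside it; the layer and output names below are placeholders:

    import arcpy

    # Join each damage point to the parcel it intersects, copying the parcel
    # attributes onto a new point feature class
    arcpy.analysis.SpatialJoin(target_features="DamagePoints",
                               join_features="Parcels",
                               out_feature_class=r"C:\data\Damage_with_parcels",
                               join_operation="JOIN_ONE_TO_ONE",
                               match_option="INTERSECT")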

I thoroughly enjoyed this week's exercise and learning how damage assessments are performed. It helped make the limitations of aerial imagery more apparent, such as when it is hard to tell whether damage was caused by wind or not. It will definitely come in handy the next time there is a hurricane here!

Saturday, July 4, 2015

Week 7 - AppsGIS - Coastal Flooding

This week we worked with coastal flooding from sea level rise and from storm surge. For the sea level rise, we looked at two scenarios, one with a 3 foot rise and one with a 6 foot rise. We then mapped the results and how they relate to population density. Later, for the storm surge analysis, we compared two different DEMs.

To look at the impact of sea level rise, we started with a DEM raster of the area, from which we wanted to extract only those cells that would be impacted by the associated rise in sea level. To do this, the Reclassify tool was used, and the data were reclassified so that only the values of interest (up to 3 feet or 6 feet) were included; all other values were changed to "NoData". The resulting attribute table was examined to determine the number of cells that were flooded (value of 1) and not flooded (value of 0).
We then looked at the layer's properties to determine that the cells of the raster are 9 m² (3 m x 3 m). This was multiplied by the number of cells within each flood zone to determine the flooded area.
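Scripted, the extraction and the area arithmetic look roughly like this. It is only a sketch: it assumes the DEM is in meters (so 3 feet is about 0.9144 m) with 3 m cells, and the paths are placeholders.

    import arcpy
    from arcpy.sa import Reclassify, RemapRange

    arcpy.CheckOutExtension("Spatial")
    dem = r"C:\data\Honolulu.gdb\DEM"   # hypothetical path

    # Keep cells at or below ~3 ft (0.9144 m); everything higher becomes NoData
    flood3 = Reclassify(dem, "VALUE", RemapRange([[0, 0.9144, 1]]), "NODATA")
    flood3.save(r"C:\data\Honolulu.gdb\flood_3ft")

    # Flooded area = number of flooded cells x 9 m2 (3 m x 3 m cells)
    cells = arcpy.RasterToNumPyArray(flood3, nodata_to_value=0)
    print("Flooded area: {} m2".format(int((cells == 1).sum()) * 9))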

To analyze the depth of flooding, I used the results from the Reclassify tool as one input to the Times tool, in order to create a new raster of only the flood zone elevations. To get the flooded depth, I then used the Minus tool, with the equivalent of either 3 feet or 6 feet as the first input and the result from the Times tool as the second input. These results were then mapped against the population density of the census tracts (this map is only for a rise of 6 feet):
Figure 1. Map of the District of Honolulu showing the impact of a 6 foot sea level rise, including the population densities of the region.
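In map-algebra form the depth calculation is roughly the following (again a sketch, assuming a DEM in meters, so 6 feet is about 1.8288 m, and placeholder paths):

    import arcpy
    from arcpy.sa import Times, Minus

    arcpy.CheckOutExtension("Spatial")
    dem = arcpy.Raster(r"C:\data\Honolulu.gdb\DEM")            # hypothetical paths
    flood6 = arcpy.Raster(r"C:\data\Honolulu.gdb\flood_6ft")   # 1 = flooded, else NoData

    # Elevation of only the flooded cells
    flood_elev = Times(dem, flood6)

    # Flooded depth = water level minus ground elevation
    depth6 = Minus(1.8288, flood_elev)
    depth6.save(r"C:\data\Honolulu.gdb\depth_6ft")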

We then wanted to look at the social vulnerability of the area, using data from the 2010 Census. To do this, we first had to determine which census blocks were located within the flood zones. For our analysis, we chose to select those whose centroid was located in the flood zone, using the Select By Location tool. Looking at the attribute table, we were able to determine how many blocks were selected and what population this affects. This can be done simply by looking at the Statistics for the column of choice, as only the selected rows are summarized.
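In arcpy the centroid-based selection and the population summary would look something like this; the flood zone layer "FloodZone_6ft" and the population field "POP10" are assumed names, not necessarily those in the lab data:

    import arcpy

    # Select census blocks whose centroid falls inside the flood-zone polygons
    arcpy.management.SelectLayerByLocation("CensusBlocks", "HAVE_THEIR_CENTER_IN",
                                           "FloodZone_6ft",
                                           selection_type="NEW_SELECTION")

    # A search cursor on a layer only returns the selected rows, so summing here
    # matches what the Statistics window reports
    blocks = arcpy.management.GetCount("CensusBlocks").getOutput(0)
    pop = sum(row[0] for row in arcpy.da.SearchCursor("CensusBlocks", ["POP10"]))
    print("{} blocks selected, population {}".format(blocks, pop))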

Next, we had to add fields in the census tracts layer for each of the groups of interest: percent white residents, percent owner-occupied homes, and percent of homes with people over the age of 65. Table joins were then conducted in order to copy over the information, with each join removed before the next one was created. The Field Calculator was used to fill in the data.
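One of those joins, scripted with ArcMap-era arcpy, would look roughly like this; the field names ("GEOID", "PCT_WHITE") and table names are assumptions used only for illustration:

    import arcpy

    # Add a percent-white field to the tracts layer, join the census table,
    # copy the value across, then drop the join before creating the next one
    arcpy.management.AddField("CensusTracts", "PCT_WHITE", "DOUBLE")
    arcpy.management.AddJoin("CensusTracts", "GEOID", "Census2010", "GEOID")
    arcpy.management.CalculateField("CensusTracts", "CensusTracts.PCT_WHITE",
                                    "!Census2010.PCT_WHITE!", "PYTHON_9.3")
    arcpy.management.RemoveJoin("CensusTracts")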

This was then repeated for the census blocks layer, with additional fields added for the populations of white residents, owner-occupied homes, and those over the age of 65. For our analysis, the census blocks did not include the make-up of each block, so the census tract data were used; it was assumed that each block has the same population composition as the tract it belongs to. A table join was created so that the percentages could be copied over, and these percentages were then used to estimate the size of each of the three populations in every block. I used the Select By Location tool to select the blocks that had their centroid located within the flood zone, as above, for both 3 and 6 feet, and then used the Statistics function to get the sum for each population in order to fill out the table in Deliverable 7. To get the values for the non-flooded areas, I simply switched the selection. I did this for all of the variables in the table, and then divided the populations for each variable by the total population for that category (3 feet flooded, 3 feet not-flooded, etc.).
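The block-level estimate and the flooded versus not-flooded summary can be sketched the same way; again, the field names ("POP10", "PCT_WHITE", "WHITE_POP") and layer names are assumptions:

    import arcpy

    def pct_white(layer):
        # Sum the estimated white population and total population of the
        # currently selected blocks, then take the ratio
        white = pop = 0.0
        for w, p in arcpy.da.SearchCursor(layer, ["WHITE_POP", "POP10"]):
            white += w
            pop += p
        return 100.0 * white / pop

    # Estimate block-level counts from the joined tract percentage
    # (assumes each block shares its parent tract's composition)
    arcpy.management.AddField("CensusBlocks", "WHITE_POP", "DOUBLE")
    arcpy.management.CalculateField("CensusBlocks", "WHITE_POP",
                                    "!POP10! * !PCT_WHITE! / 100.0", "PYTHON_9.3")

    # Blocks with their centroid in the 3 ft flood zone...
    arcpy.management.SelectLayerByLocation("CensusBlocks", "HAVE_THEIR_CENTER_IN",
                                           "FloodZone_3ft",
                                           selection_type="NEW_SELECTION")
    print("3 ft flooded, % white:", pct_white("CensusBlocks"))

    # ...then switch the selection to get the not-flooded blocks
    arcpy.management.SelectLayerByAttribute("CensusBlocks", "SWITCH_SELECTION")
    print("3 ft not-flooded, % white:", pct_white("CensusBlocks"))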

The results were as follows:

Variable            Entire District   3 Ft Flooded   3 Ft Not-flooded   6 Ft Flooded   6 Ft Not-flooded
Total Population    1,360,301         8,544          1,351,757          60,005         1,300,296
% White             24.7 %            36.8 %         24.7 %             29.6 %         24.5 %
% Owner-occupied    58.2 %            32.2 %         58.3 %             38.1 %         59.1 %
% 65 and older      14.3 %            17.11 %        14.3 %             17.0 %         14.2 %
Table 1. Percent of population represented by each group in flooded areas compared to not-flooded areas. Results are for 3 and 6 feet of sea level rise.
After this analysis, we looked at storm surge in Collier County, Florida. The purpose of this analysis was to compare the results of two different DEMs, one by USGS created using older methods (and lower resolution) and one created using lidar. To compare the two, we calculated the percent error of omission (flooded cells that a DEM misses) and the percent error of commission (false positives). For our analysis, the lidar data were treated as accurate for the calculations. The results were quite different and showed the major benefits of lidar techniques, with most errors of commission being above 100%.
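Written out, the error calculation as I understood it works like the following; the cell counts here are invented purely to show the arithmetic and are not the lab results:

    # Treat the lidar flood zone as ground truth; all counts below are made up
    lidar_flooded = 5000   # cells flooded in the lidar-based surge zone
    usgs_flooded = 9500    # cells flooded in the USGS DEM surge zone
    both_flooded = 4100    # cells flooded in both

    # Omission: truly flooded cells the USGS DEM misses
    pct_omission = 100.0 * (lidar_flooded - both_flooded) / lidar_flooded

    # Commission: cells the USGS DEM floods that the lidar DEM does not;
    # relative to the lidar total, this can exceed 100 %
    pct_commission = 100.0 * (usgs_flooded - both_flooded) / lidar_flooded

    print(pct_omission, pct_commission)   # 18.0 108.0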

Being a resident of Florida, I found this week's analysis quite intriguing and interesting to do. I especially liked being able to look at how seemingly small levels of sea level rise can affect such large areas. I certainly used the NOAA Sea Level Viewer to see if my house was in danger, especially since our yard floods quite a bit in heavy rains. I expect to use this new knowledge quite a bit in my future.