Project 2 - Sierra Costera Site Evaluation (Part 1)

ENV 761 - Landscape GIS   |   Spring 2024   |   Instructors: Peter Cada & John Fay

Overview

Digital Elevation Model (DEM) data are widely available and can be used to derive a variety of ecologically meaningful spatial layers. Here we explore how to use DEM data to derive both hydrologic features (streams, watershed boundaries) and terrain-based indices that are often useful in identifying ecologically distinct areas within a landscape.

Use the scenario below to familiarize yourself with how these metrics are calculated in ArcGIS. I’ve provided some tools that allow you to overcome some of the more tedious components of certain analyses; feel free to keep copies of these tools and use them beyond this class. However, it’s essential that you have at least a conceptual idea of how these tools work, so be sure to ask if you need further explanation about any process used in this exercise.

The data for this lab are found in the Project2Data.zip file.

Learning Objectives

  • Construct an organized and distributable workspace for geospatial analysis
  • Execute basic surface analysis tools: slope, aspect, hillshade, and analytical hillshade
  • Execute the hydrologic analysis workflow: fill, flow direction, flow accumulation, and stream delineation
  • Convert your DEM-derived stream raster to vector features, labeling each stream by its stream order
  • Compute upstream areas using various pour point snapping techniques, understanding the assumptions and limitations of each technique

Scenario

WWF-Mexico is preparing for a workshop reviewing conservation priorities in the Sierra Costera de Oaxaca – the coastal mountains on the southern coast of Oaxaca, Mexico. At present, data for the region are limited and they only have a 3 arc-second (90m) DEM for the entire study area along with a 1/2 arc-second DEM (15m) for a selected portion of the area. Still, they need you to help with two pressing requests.

  • First, a number of workshop participants will be bringing species observation data with them (GPS coordinates of where they searched for individuals of various important species). Many are interested in determining whether distance from stream is a good predictor of species presence – knowledge that will be important in defining conservation priorities within the region. In order to calculate this, however, they need a stream map for remote locations where they have only DEM data. Your first task, then, is to create this stream map for WWF.

  • The second request involves determining the drainage area for five stream gauge locations. Each gauge site is a known location of a rare and endangered amphibian, and WWF is interested in protecting at least one of these drainage areas. Which drainage area they choose depends on several factors and will be a focus of the upcoming workshop, but they first need you to determine some general topographic characteristics of each drainage area. So, your second task is to generate maps of a set of topographic indicators for these drainage areas, as well as to tabulate a set of summary statistics to facilitate comparison of these sites.

Specific instructions regarding the deliverables will be provided in a separate document, but you are to begin your analysis using the guidelines below.

Let’s get started!


Part 1 - Data Prep & Initial Hydro Analysis

1. Create the project workspace template

As usual, the first step in almost any analysis is to create a workspace for your project. Start this project as we’ve learned previously by creating a project folder with the usual subfolders within it (“Data”, “Docs”, “Scratch”). This time, however, also create a “Scripts” folder, as we’ll be using some Python scripts in this exercise. Create an ArcGIS Pro project named SierraCostera in this folder, and finally set the workspace environment variables, pointing the current workspace to the project geodatabase and the scratch workspace to the scratch geodatabase.
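
If you would rather script this setup (or rebuild it quickly on another machine), a minimal Python sketch along the lines below will do the same job. The folder names mirror those above; the root path and the geodatabase names are assumptions, so adjust them to match your own workspace.

```python
import os
import arcpy

# Root of the project workspace; adjust to your own location (e.g., your V: drive)
project_root = r"V:\SierraCostera"

# Create the standard subfolders used in this class
for sub in ("Data", "Docs", "Scratch", "Scripts"):
    os.makedirs(os.path.join(project_root, sub), exist_ok=True)

# Point the geoprocessing environments at the project and scratch geodatabases.
# The geodatabase names below are assumptions: use whatever names your ArcGIS Pro
# project geodatabase and scratch geodatabase actually have.
arcpy.env.workspace = os.path.join(project_root, "SierraCostera.gdb")
arcpy.env.scratchWorkspace = os.path.join(project_root, "Scratch", "Scratch.gdb")
```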

2. Add and prepare provided data

► Unzip the Project2Data.zip file to your V: drive.

The resulting folder contains a geodatabase, two toolboxes, some Python scripts, a set of LYR files, and “TarDEM” – a folder containing executable files developed by David Tarboton at Utah State University that provide hydrologic analyses improving upon ArcGIS’s raster tools. You need to sort through these files and place them in the correct locations in your workspace.

► Prepare these datasets for analysis:

  • :point_right: Unzip the SierraCostera_UTM14.gdb geodatabase into your project’s Data folder.

    This geodatabase contains the following datasets, each referenced to the UTM Zone 14 (NAD 83) coordinate system.

    • DEM_90m – A 90m DEM provided by Mexico’s National Institute of Statistics and Geography (INEGI). The original “.bil” file was subset to the Sierra Costera study region and reprojected to the UTM 14 (NAD 83) coordinate system using bilinear resampling, keeping the original 90m cell size.
    • DEM_15mf – A 15m DEM also provided by INEGI, converted and reprojected to the same UTM coordinate system. This dataset, however, was subset to a smaller region. It has also been “filled” (discussed later in this exercise).
    • GaugeLocations – A point feature class representing the 5 stream gauge locations for which upslope analysis will be conducted.

    :point_right: Add all these spatial datasets to a new (or existing) map in your ArcGIS project and rename/symbolize them, if desired.

  • The files with a .py extension are Python scripts. We don’t go into writing Python scripts in this course, but we will use them. These Python scripts are linked to tools in the ENV761_TerrainTools.tbx. For them to work, they need to be stored in the Scripts folder.

    :point_right: Move the Python script files to the Scripts folder.

  • Lastly, the folder called LayerTemplates includes some ArcMap LYR files. The Acervo de informacion geografica INEGI (Mapa Digital de Mexico) on gaia.inegi.org.mx.wms file links to a web mapping service containing several base layers. The other LYR files do not point to specific datasets but rather contain symbology which can be applied to layers created in this lab.

    :point_right: Move the LayerTemplates folder into your Data folder.

    → When these steps are complete, you are ready to proceed with the analysis; your workspace should now resemble the structure described above, with the provided data, layer templates, and scripts each moved into the appropriate folder.


3. Surface analysis

We’ll start simply by deriving a few datasets from our DEM using individual tools provided with ArcGIS to analyze elevation data: slope, aspect, hillshaded relief, and a modified hillshade dataset called “analytical hillshade,” which approximates the amount of warmth received from the sun.

  • Slope measures the rate of change in elevation between a cell and its neighbors. Units can be either degrees or percent rise.
  • Aspect measures the compass direction of a cell’s slope. Values range from 0° to 360°, with 0° facing north, 90° facing east, etc. Flat cells are assigned a value of -1.
  • Shaded relief or Hillshade calculates the hypothetical illumination of a surface as determined by cell aspect and a light source at a specified angle (elevation) and azimuth (compass direction). Output values range from 0 (receiving no light from the light source) to 255 (a fully illuminated cell).
  • Analytical hillshade is a modification of the default hillshade. By setting the sun angle to 30° and azimuth to a southwesterly 225°, we get an estimate of the amount of solar radiation (or warmth) a cell in our DEM dataset might receive.
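
If you're curious how Hillshade arrives at its 0–255 values, the illumination of a single cell can be reproduced from that cell's slope and aspect together with the sun's altitude and azimuth. The short Python sketch below follows the standard hillshade equation described in ArcGIS's "How Hillshade works" documentation (ignoring the optional shadow modeling); the function name and example inputs are ours.

```python
import math

def hillshade_value(slope_deg, aspect_deg, altitude_deg=45.0, azimuth_deg=315.0):
    """Illumination (0-255) of one cell under the standard hillshade equation.

    slope_deg / aspect_deg describe the cell; altitude_deg / azimuth_deg describe
    the light source (the defaults match the Hillshade tool's defaults). Only the
    difference between azimuth and aspect matters, so both stay in compass degrees.
    """
    zenith = math.radians(90.0 - altitude_deg)      # sun angle measured from vertical
    slope = math.radians(slope_deg)
    delta = math.radians(azimuth_deg - aspect_deg)  # sun direction relative to the slope's facing

    value = 255.0 * (math.cos(zenith) * math.cos(slope) +
                     math.sin(zenith) * math.sin(slope) * math.cos(delta))
    return max(0.0, value)                          # cells facing away from the sun clamp to 0

# A 20-degree slope facing south (aspect = 180):
print(hillshade_value(20, 180))                                    # default sun: azimuth 315, altitude 45
print(hillshade_value(20, 180, azimuth_deg=225, altitude_deg=30))  # the "analytical hillshade" sun
```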

We will now create a new model in our SierraCostera toolbox to calculate these datasets using our DEM.

  1. Add a new model to your SierraCostera toolbox.

    • Rename it something like Surface Tools.
    • Set all outputs to be saved in your default geodatabase (“SierraCostera.gdb”).
  2. Add the Slope tool to your model

    • Set it to use the 90m DEM as the input and keep slope units in Degrees.
    • Ensure the method used is Geodesic.
    • Save the result as Slope_90m in your project geodatabase.
    • Run the tool and examine the output.

    → Geodesic or planar? If you examine the help note associated with the choice of using planar or geodesic distances, you’ll find that our extent falls a bit in the gray area between the two. You could very well argue that planar is the best answer. However, we’ll use geodesic in our choices just to be consistent.

  3. Add the Aspect and two instances of the Hillshade tools to your model.

    • Set each to use the 90m DEM as the input and Geodesic as the method, if applicable.
    • Save as Aspect_90m, Hillshade_90m, and Insolation_90m in your project geodatabase, respectively.
    • For the second Hillshade tool, set the Azimuth to 225° and Altitude to 30° and check Model Shadows.

    This second hillshade is a proxy for insolation, i.e., the amount of sunshine the location would get during the warmest part of the day (mid-afternoon). In North America, we generalize this to be when the sun shines from the southwest and is about 30° above the horizon. To get more precise values for the sun’s azimuth and angle for particular times and locations, you can use this calculator:
    https://www.suncalc.org/#/15.8768,-97.2949,3/2018.09.21/15:17/1/0

  4. Run your model and add the results to your map.

    • Examine each of the resulting raster datasets.

    • Note their values and how they might be best symbolized (color scheme, transparency, resampling type, …)

      Be sure you understand why some have attribute tables and others do not.
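
If you ever want to reproduce these four surface outputs outside of ModelBuilder, a minimal arcpy sketch like the one below does the same work. It assumes a Spatial Analyst license and the workspace and dataset names used above; the parameter keywords follow current arcpy documentation, so verify them against your version of ArcGIS Pro.

```python
import arcpy
from arcpy.sa import Slope, Aspect, Hillshade

arcpy.CheckOutExtension("Spatial")                           # Spatial Analyst is required
arcpy.env.workspace = r"V:\SierraCostera\SierraCostera.gdb"  # assumed project geodatabase path
arcpy.env.overwriteOutput = True

dem = "DEM_90m"

# Slope (in degrees) and aspect, both using the geodesic method
Slope(dem, "DEGREE", method="GEODESIC").save("Slope_90m")
Aspect(dem, method="GEODESIC").save("Aspect_90m")

# Default hillshade: sun at azimuth 315 degrees, altitude 45 degrees
Hillshade(dem).save("Hillshade_90m")

# "Analytical hillshade" / insolation proxy: southwesterly sun 30 degrees
# above the horizon, with shadows modeled
Hillshade(dem, azimuth=225, altitude=30, model_shadows="SHADOWS").save("Insolation_90m")

arcpy.CheckInExtension("Spatial")
```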


4. Hydrologic analysis

DEM data are also very useful for deriving hydrologic datasets such as stream courses and watershed boundaries. ArcGIS provides a set of tools for these analyses (found in the Hydrology toolset within the Spatial Analyst toolbox), but the sequence in which the tools are run is important.

To process our DEM data into hydrologic rasters, we'll build a new geoprocessing model to hold our sequence of geoprocessing tools. For each of the following steps, be sure you understand what's going on and what the resulting cell values represent in each derived raster dataset.

4.1 Deriving a stream network from a DEM

  1. Add a new model to the SierraCostera toolbox and name it something like CreateStreamNetwork.

  2. Add the Fill tool to your model.

    • Input = DEM_90m
    • Output raster = FilledDEM (in scratch geodatabase)

    The fill tool removes any sinks from the original DEM surface. Sinks, which are simply cells that are lower than all their immediate neighbors, cause havoc when trying to model downstream flow. While some sinks are indeed natural, most are likely artifacts of the DEM creation technique.

  3. Add the Flow Direction tool to your model.

    • Input = FilledDEM
    • Output raster = FlowDir_D8 (in scratch geodatabase)
    • Flow direction type = D8
    • [Leave Output Drop raster blank]

    :point_right: Be sure your flow direction output only has 8 distinct values (1, 2, 4, 8, 16, 32, 64, and 128). If not, then you are not using a filled DEM or some other error has occurred.

  4. Add the Flow Accumulation tool to your model.

    • Input = FlowDir_D8
    • Output = FlowAcc (in scratch geodatabase)
    • Output data type = Integer (to speed processing and save disk space)

    → Save your map and run the model. Examine the Flow Accumulation results: do you understand what each cell's value represents?

  5. Threshold flow accumulation into a stream network using the Set Null tool.

    1. Input raster = FlowAcc
    2. False raster or constant value = 1
    3. Output raster = StreamGrid (in scratch geodatabase)
    4. Expression = VALUE is less than 300

    This procedure sets any cell with fewer than 300 cells draining through it to NoData; the remaining cells are considered stream cells. The threshold of 300 is somewhat arbitrary: would a higher or lower value produce a map of fewer, larger streams (more cells set to NoData)? Which would label more cells as stream cells? A common practice is to compare the results to an existing stream layer with a known resolution (e.g., 1:24,000 USGS Topo Quads).

    Since there are no USGS Topo Quads for Oaxaca, this method won’t work here. Instead, we can compare our results to INEGI data provided in the map service layer. Add the layer (located in the LayerTemplates folder), then zoom your map to a scale of 1:50,000. Compare your stream features to those of INEGI at this scale. Are they in roughly the same location? Is the level of detail consistent? If so, you can use that comparison to justify your choice of 300 as a cutoff value for flow accumulation to produce a reasonable stream dataset.

  6. Add the INEGI stream layer to your map and compare your streams to INEGI’s.

    • In the Catalog pane of your project, locate the Acervo de informacion geografica INEGI (Mapa Digital de Mexico) on gaia.inegi.org.mx.wms object. Right click it and select “Add to Project.”
    • The entry should now appear in the Servers section of your Catalog pane. Double click it to reveal what data services are available through this connection. Locate and add ‘Corrientes de Agua’ to your map:
      • Servicio WMS > Servicio WMS > T50m > Corrientes de Agua
    • Compare your derived streams to those of INEGI. Do you have noticeably more or fewer streams?
  7. Calculate stream order from the StreamGrid raster using the Stream Order tool.

    • Input stream raster = StreamGrid
    • Input flow direction raster = FlowDir_D8
    • Output raster = StrmOrder (in scratch geodatabase)
    • Method = Strahler
  8. Convert the stream order raster to a vector stream dataset with the Stream to Feature tool.

    • Input = StrmOrder
    • Flow direction = FlowDir_D8
    • Output = Streams (in project geodatabase)
    • Un-check Simplify polylines

    The result can be symbolized using graduated symbols on the GRID_CODE attribute, which represents stream order, to create a nifty map of the stream network.

We now have a vector stream network dataset completely derived from the DEM. It’s not as accurate as a map digitized from aerial photos or recorded by wading through the stream with a GPS, but it’s a useful dataset derived solely from DEM data – and since DEM data are available for nearly all terrestrial regions of the globe, it’s a technique we can apply quickly and easily for virtually any location!
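
For reference, the whole stream-delineation sequence above can also be run as a short script rather than a model. The sketch below mirrors the model you just built (same tool order, threshold, and output names) and assumes the Spatial Analyst extension and the workspace paths from step 1; treat it as one workable pattern rather than a required implementation.

```python
import os
import arcpy
from arcpy.sa import (Fill, FlowDirection, FlowAccumulation, SetNull,
                      StreamOrder, StreamToFeature)

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"V:\SierraCostera\SierraCostera.gdb"           # assumed paths; adjust
arcpy.env.scratchWorkspace = r"V:\SierraCostera\Scratch\Scratch.gdb"  # to your own workspace
arcpy.env.overwriteOutput = True
scratch = arcpy.env.scratchWorkspace

# Steps 2-3: remove sinks, then compute D8 flow direction
filled = Fill("DEM_90m")
filled.save(os.path.join(scratch, "FilledDEM"))
flow_dir = FlowDirection(filled, flow_direction_type="D8")
flow_dir.save(os.path.join(scratch, "FlowDir_D8"))

# Step 4: count the cells draining through each cell (integer output)
flow_acc = FlowAccumulation(flow_dir, data_type="INTEGER")
flow_acc.save(os.path.join(scratch, "FlowAcc"))

# Step 5: keep cells with at least 300 upstream cells; all others become NoData
streams = SetNull(flow_acc, 1, "VALUE < 300")
streams.save(os.path.join(scratch, "StreamGrid"))

# Steps 7-8: assign Strahler stream order, then convert to vector polylines
order = StreamOrder(streams, flow_dir, "STRAHLER")
order.save(os.path.join(scratch, "StrmOrder"))
StreamToFeature(order, flow_dir, "Streams", simplify="NO_SIMPLIFY")   # lands in the project gdb

arcpy.CheckInExtension("Spatial")
```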


Summary - Part 1

At this stage in our analysis, we’ve constructed a simple hydrologic model of our landscape and used it to derive a vector stream network for the study area. In the next part of this lab, we’ll continue to derive new datasets from our digital elevation data: we’ll delineate the contributing area for each of our gauge locations and compute a number of datasets describing the terrain in various formats.