Integrated groundwater and surface water modeling: the appropriate way to prepare the best DEM

One of the most important inputs for groundwater and surface water hydrology modeling is the Digital Elevation Model (DEM).

Although a DEM can be obtained through many approaches, the resulting DEMs vary significantly, and a raw DEM can seldom be used directly in modeling work. Instead, we often need to adjust the DEM so that it meets the requirements of the hydrology model.

The adjustment, however, often involves a series of operations. These operations can be carried out with different tools and in different orders, and the results can differ substantially; many of the resulting DEMs are simply unusable for modeling. So the question is: what is the most reliable way to prepare the DEM?

Here I provide an example workflow that should be suitable for most tasks. I then explain the purpose of each step and the tools we can use, and finally discuss why the steps should be done in this order.
  1. Download the DEM datasets in the same format for the study area;
  2. Mosaic these DEM datasets into one raster file;
  3. Resample the DEM if necessary;
  4. Project the DEM into the required projection;
  5. Clip the DEM using a rectangle whose spatial extent covers the whole study area;
  6. Download the stream datasets for the study area;
  7. Process the stream datasets in the same way as the DEM;
  8. Burn the flow lines into the DEM;
  9. Fill the depressions in the DEM;
  10. Correct the DEM along the flow lines.

Explanations:

  1. DEM datasets can be downloaded from the USGS NED in various formats and resolutions. Make sure the downloaded tiles cover the full spatial extent of the study area;
  2. Depending on the spatial extent of the study area, the download usually contains more than one DEM file. It is necessary to mosaic them into a single raster before any further processing, because most GIS operations such as resampling are sensitive to edge effects (see the sketch of steps 2-5 after this list);
  3. Resampling brings the DEM to the desired spatial resolution; it is common to resample from a higher resolution to a lower one;
  4. The DEM is probably not delivered in the projection required for the study area, so it must be reprojected;
  5. To keep the computational demand manageable, it is best to extract the study area from the larger mosaic. To avoid losing valuable information near the boundary, clip the DEM with a rectangle that is somewhat larger than the study area;
  6. Stream data are needed to guide the DEM reconditioning;
  7. The stream data also need to be prepared to the same spatial extent and projection as the DEM;
  8. Burning adjusts the DEM so that surface water always flows into the streams. However, we do not want to change the DEM too much, because the groundwater system also interacts with the streams (see the stream-burning sketch after this list);
  9. Surface depressions cause a lot of trouble for surface runoff. For most surface hydrology applications, they can be removed with a fill operation (see the depression-filling sketch after this list);
  10. There may still be flaws in the DEM along the flow lines, so we need to correct them separately.
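
Steps 2 through 5 can be scripted instead of clicked through. Below is a minimal sketch using the GDAL Python bindings; the tile names, the target projection (EPSG:26915 is only an example), the 30 m resolution, and the clipping rectangle are placeholders to replace with your own values. A single gdal.Warp call can set the resolution, projection, and extent at once, which also sidesteps the question of whether step 3 or step 4 comes first.

```python
from osgeo import gdal

# Step 2: mosaic the downloaded DEM tiles into one (virtual) raster.
tiles = ["ned_tile_1.tif", "ned_tile_2.tif", "ned_tile_3.tif"]  # assumed file names
mosaic = gdal.BuildVRT("dem_mosaic.vrt", tiles)

# Steps 3-5: resample, reproject, and clip in a single warp.
gdal.Warp(
    "dem_prepared.tif",
    mosaic,
    dstSRS="EPSG:26915",     # step 4: target projection (example: UTM zone 15N)
    xRes=30.0, yRes=30.0,    # step 3: target resolution, in target-projection units
    resampleAlg="bilinear",  # a reasonable choice for continuous elevation data
    # step 5: clip rectangle (xmin, ymin, xmax, ymax) in the target projection
    outputBounds=(500000.0, 4400000.0, 560000.0, 4460000.0),
)
mosaic = None  # close the VRT dataset
```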
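
For steps 7 and 8, one simple way to burn the streams is to reproject the flow lines to match the prepared DEM, rasterize them onto the DEM grid, and lower the DEM by a fixed depth along the stream cells. This is only a crude sketch of stream burning (tools such as the AGREE reconditioning in ArcHydro taper the adjustment more gently near the streams); the file names and the 5 m burn depth are assumptions.

```python
import geopandas as gpd
import numpy as np
import rasterio
from rasterio import features

# Step 7: reproject the stream lines into the DEM's projection.
streams = gpd.read_file("nhd_flowlines.shp").to_crs("EPSG:26915")  # assumed inputs

with rasterio.open("dem_prepared.tif") as src:
    dem = src.read(1).astype("float32")
    meta = src.meta.copy()

    # Rasterize the stream lines onto the DEM grid (1 = stream cell);
    # geometries falling outside the grid are simply ignored.
    stream_cells = features.rasterize(
        ((geom, 1) for geom in streams.geometry),
        out_shape=dem.shape,
        transform=src.transform,
        fill=0,
        dtype="uint8",
    )

# Step 8: burn the streams by lowering the DEM along the stream cells.
burn_depth = 5.0  # assumed value, in the DEM's vertical units
dem_burned = np.where(stream_cells == 1, dem - burn_depth, dem)

meta.update(dtype="float32")
with rasterio.open("dem_burned.tif", "w", **meta) as dst:
    dst.write(dem_burned, 1)
```

A small, constant burn depth keeps the change modest, which matters here because the groundwater system also interacts with the stream elevations.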
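
For step 9, the fill can be done with the Fill tool in ArcGIS Spatial Analyst, with SAGA's sink-filling modules, or from Python. The sketch below uses the RichDEM package as one possible option; the file names are assumptions, and the epsilon option adds a tiny gradient across the filled areas so that every cell still drains.

```python
import richdem as rd

# Load the burned DEM from step 8 (pass no_data=... explicitly if the
# raster does not have a NoData value defined).
dem = rd.LoadGDAL("dem_burned.tif")

# Step 9: fill depressions; epsilon=True leaves a slight slope on the
# filled surfaces instead of perfectly flat areas.
filled = rd.FillDepressions(dem, epsilon=True, in_place=False)

rd.SaveGDAL("dem_filled.tif", filled)
```

The filled DEM should still be inspected and corrected along the flow lines afterwards (step 10).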

Most of these GIS operations can be carried out within ArcGIS, but other tools such as ArcHydro and SAGA are also very powerful.

In most cases these operations must be carried out in the order given above. Steps 3 and 4 can be swapped without much difference. For the DEM reconditioning, it is good practice to follow steps 8, 9, and 10 exactly: if you fill first, depressions will reappear, because burning changes the DEM, and there is currently no tool that handles streams and depressions at the same time. Double-checking the final DEM afterwards is encouraged.
