
Ecosystem modeling: uncertainty quantification

I recently read a few news articles expressing skepticism toward climate change, and I decided to write something about it. I am not trying to advocate a position on climate change; instead, I want to point out that there is great uncertainty in our current Earth system modeling.

Uncertainty quantification (UQ) is an important step in most numerical simulation workflows. In general, a simulation without uncertainty quantification is less convincing. (I was surprised when an invited speaker at a department seminar said he doesn't care about uncertainty.)

However, conducting uncertainty quantification is itself a challenge, especially for highly nonlinear ecosystem modeling.
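To make the idea concrete, here is a minimal sketch of one common UQ technique, Monte Carlo propagation: sample an uncertain parameter many times, run the model on each sample, and summarize the spread of the outputs. The `npp` function and all its numbers are hypothetical illustrations, not a real ecosystem model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy model: net primary productivity (NPP) as a
# nonlinear function of temperature and a rate parameter k.
def npp(temperature, k):
    return k * temperature * np.exp(-0.1 * temperature)

n_samples = 10_000
# Suppose the rate parameter k is only known to within ~20% (1-sigma).
k_samples = rng.normal(loc=1.0, scale=0.2, size=n_samples)
temperature = 15.0  # degrees C, held fixed for this sketch

# Propagate the parameter uncertainty through the model.
outputs = npp(temperature, k_samples)

print(f"mean NPP:     {outputs.mean():.3f}")
print(f"std  NPP:     {outputs.std():.3f}")
print(f"95% interval: [{np.percentile(outputs, 2.5):.3f}, "
      f"{np.percentile(outputs, 97.5):.3f}]")
```

For a highly nonlinear model, this sampling approach is attractive precisely because it makes no linearity assumption, but its cost grows with the number of model runs, which is one reason UQ is so expensive for full Earth system models.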

First, I invite you to read the Wikipedia article on uncertainty quantification, which serves as an overview of the concepts I will discuss below.

Uncertainty comes from many sources, and nearly every step we take in Earth system modeling carries uncertainty. We scientists and engineers spend great effort to reduce it. That is why climate modelers keep improving their estimates year after year. As a result, you may read papers saying that "something was potentially underestimated previously." (That is also why the 7-day weather forecast keeps changing.)

We don't know what we don't know. That is why we may never be able to account for everything, which is acceptable in scientific research. Decades ago, most hydrology models did not properly consider groundwater and surface water interactions. Today, we still do not always consider them!

Another big issue in Earth system modeling is that we are generally reluctant to spend much effort on UQ. Peer reviewers are not always interested in your UQ, but they are certainly interested in your scientific questions and results. In an environment where publication is the major measure of academic accomplishment, the quality of the science is likely to suffer.

Unlike experimental science, Earth system modeling is very much a black box. True, you can read the model's source code, but you can seldom reproduce the whole study, because many pre-processing and post-processing steps are involved yet never revealed.

Assumptions. We make too many assumptions. There are many reasons we have to; one of them is computational demand.

We always want more data. But the reason we simulate is that we don't have enough data, and some quantities cannot be measured directly. Even when we do measure, the measurements are not the true values in most cases.
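A tiny sketch of that last point: even a directly measured quantity only gives us an estimate of the true value, with an attached standard error. The "true" soil-moisture value and the instrument noise level below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical: the true value is unknowable in practice; we only
# see it through noisy instrument readings.
true_value = 2.50  # e.g., some field quantity in arbitrary units

n = 50
# Fifty repeated measurements, each corrupted by instrument noise.
measurements = true_value + rng.normal(scale=0.1, size=n)

estimate = measurements.mean()
standard_error = measurements.std(ddof=1) / np.sqrt(n)
print(f"estimate: {estimate:.3f} +/- {standard_error:.3f}")
```

Averaging shrinks the error only as 1/sqrt(n), and this is before considering systematic biases such as sensor drift, which averaging cannot remove at all.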

If you look at climate change predictions again with this in mind, you can better tell where the uncertainty comes from:

  1. We don't have high-quality input data;
  2. We make assumptions in many processes;
  3. We ignore many processes;
  4. We don't yet know which processes we are missing;
  5. We make mistakes, or even commit misconduct.
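Source 1 above deserves a short demonstration of why it matters so much: in a nonlinear (here, chaotic) system, even tiny errors in the input data grow until predictions diverge. The sketch below uses the logistic map as a stand-in for a real model; the perturbation size and parameter value are illustrative choices, not from any actual forecast system.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic_step(x, r=3.8):
    # Logistic map in its chaotic regime, a classic toy for
    # sensitive dependence on initial conditions.
    return r * x * (1.0 - x)

n_members, n_steps = 100, 30
# An ensemble of runs whose initial conditions differ by ~1e-4,
# mimicking small errors in the input data.
states = 0.3 + rng.normal(scale=1e-4, size=n_members)

spread = []
for _ in range(n_steps):
    states = logistic_step(states)
    spread.append(states.std())  # ensemble spread at each step

print(f"spread after  1 step : {spread[0]:.2e}")
print(f"spread after {n_steps} steps: {spread[-1]:.2e}")
```

This is essentially why ensemble forecasting exists: since we cannot eliminate input error, we run many perturbed simulations and report the spread as part of the prediction.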

However, that is also why we work hard to reduce the uncertainty. Was the weather forecast a few years back as good as today's? I doubt it.

