
Ecosystem modeling: uncertainty quantification

I recently read a few news articles expressing skepticism toward climate change, and I decided to write something about it. I am not trying to argue for either side; instead, I want to point out that there is great uncertainty in our current Earth system modeling.

Uncertainty quantification (UQ) is an important step in most numerical simulation workflows. In general, a simulation without uncertainty quantification is less convincing. (I was surprised that an invited speaker at a department seminar said he didn't care about uncertainty.)

However, conducting uncertainty quantification is itself a challenge, especially for highly nonlinear ecosystem models.

First, I invite you to read the Wikipedia article on uncertainty quantification, which serves as an overview of the concepts I will discuss below.

Uncertainty comes from many sources, and nearly every step we take in Earth system modeling carries uncertainty. Scientists and engineers spend great effort to reduce it. That is why climate modelers keep improving their estimates year by year. As a result, you may read papers saying "something was potentially underestimated previously." (That is also why the 7-day weather forecast keeps changing.)

We don't know what we don't know. That is why we may never be able to consider everything, which is acceptable in scientific research. Decades ago, groundwater and surface water interactions were poorly represented in most hydrology models. Today, we still do NOT always consider them!

Another big issue in Earth system modeling is that we are generally reluctant to spend much effort on UQ. Peer reviewers are not always interested in your UQ, but they are certainly interested in your scientific questions and results. In an environment where publication is the major measure of academic accomplishment, the quality of the science is likely weakened.

Unlike experimental science, Earth system modeling is very much a black box. You can see the source code of the model, but you can seldom reproduce the whole study, since many pre-processing and post-processing steps are involved but never disclosed.

Assumptions. We make too many assumptions. There are many reasons we have to; one of them is computational demand.

We always want more data. But the reason we simulate is that we don't have enough data, and some quantities cannot be measured directly at all. Even when we do measure, the measurements are not the true values in most cases.

If you look at climate change predictions again, you can better tell where the uncertainty comes from:

  1. We don't have high-quality input data;
  2. We make assumptions in lots of processes;
  3. We ignore lots of processes;
  4. We don't yet know lots of processes we don't know;
  5. We make mistakes, or even commit misconduct.

However, that is also why we keep working hard to reduce the uncertainty. Was the weather forecast a few years back as good as today's? I doubt it.
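To make the idea of propagating input and parameter uncertainty a bit more concrete, here is a minimal Monte Carlo sketch. The "model" is a made-up nonlinear response function standing in for a real ecosystem model, and the input distributions (noisy temperature measurements, a poorly known rate parameter) are pure assumptions for illustration:

```python
import numpy as np

# Monte Carlo uncertainty propagation through a toy nonlinear model.
# Both the model and the parameter distributions are hypothetical.
rng = np.random.default_rng(42)

def toy_model(temperature, rate):
    # A Q10-style exponential response: rate doubles per 10 degrees C.
    return rate * 2.0 ** ((temperature - 20.0) / 10.0)

n = 10_000
# Input uncertainty: temperature measured with noise, rate poorly constrained.
temperature = rng.normal(loc=22.0, scale=1.5, size=n)
rate = rng.lognormal(mean=0.0, sigma=0.3, size=n)

output = toy_model(temperature, rate)

lo, hi = np.percentile(output, [2.5, 97.5])
print(f"mean = {output.mean():.2f}, 95% interval = [{lo:.2f}, {hi:.2f}]")
```

Even in this toy setting, the nonlinearity skews the output distribution, so a single "best estimate" run would hide a wide spread of plausible outcomes. Real Earth system models face the same issue at a vastly larger computational cost, which is one reason full UQ is so often skipped.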

