
Posts

Showing posts from 2020

Algorithm design for stream burning based on the MPAS mesh

To generate a stream network that is consistent with the NHD flowlines, we need to recondition the DEM, a step commonly called "stream burning".
In classical hydrologic workflows, stream burning and depression filling are separate steps. Sometimes they must be applied iteratively, because either step modifies the DEM.
On the MPAS mesh, both algorithms must be rewritten. So the new question is: can we achieve depression filling and stream burning in a single step? What are the implications? Can we design a scenario in which this makes a difference?
If there is a difference, will merging them into a single step resolve it?
If the above assumptions hold, then we need to consider the following challenges:
- The existing depression filling algorithm starts from the mesh boundary/edge, so the stream burning must follow the same strategy.
- The stream grid flag must be burned into the MPAS mesh in advance; fortunately this requirement is met by …
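As a thought experiment (this is not the actual HexWatershed algorithm, only a sketch of the idea under stated assumptions), a merged pass could reuse the boundary-seeded priority flood and give stream-flagged cells a lower priority key, so the flood traces the burned network before the surrounding hillslopes. All function names and the burn_weight parameter are hypothetical.

```python
# Hypothetical sketch only: one boundary-seeded priority-flood pass that also
# honors a stream flag by popping burned cells first. Not the post's algorithm.
import heapq

def fill_and_burn(elevation, is_stream, neighbors, boundary_cells, burn_weight=1.0e3):
    """elevation: per-cell elevations; neighbors[c]: ids of cells adjacent to c."""
    filled = list(elevation)
    visited = [False] * len(elevation)

    def key(c):
        # Stream-flagged cells get an artificially low key, so the flood follows
        # the burned network first (the "burning" folded into the filling pass).
        return filled[c] - (burn_weight if is_stream[c] else 0.0)

    heap = [(key(c), c) for c in boundary_cells]
    for c in boundary_cells:
        visited[c] = True
    heapq.heapify(heap)

    while heap:
        _, c = heapq.heappop(heap)
        for n in neighbors[c]:
            if not visited[n]:
                visited[n] = True
                filled[n] = max(filled[n], filled[c])  # raise depressions to the spill level
                heapq.heappush(heap, (key(n), n))
    return filled
```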

Why we need a unified mesh for Earth system models

In many of my earlier posts, I shared some of the reasons why I decided to develop the hexagon-based watershed model, HexWatershed. But the real motivation lies beyond watersheds, and I'd like to share some of the personal motivation for starting this project.
For decades, ever since humans invented computers and hydrologists invented hydrologic models, we have relied on computational techniques to simulate hydrologic processes. These techniques mostly use a structured mesh/matrix to describe the real world. Nearly all of us are familiar with the so-called X-Y-Z 3D domain.
We stand on the shoulders of giants, but we seldom question them. It is true that in earlier days most of our research activities focused on the catchment or plot scale. We have to admit this is important, because we need to understand fine-scale processes before we can extrapolate to a larger domain.
We never look at hydrology on a global scale using a global method. Most of us are simply trying to copy the watershed-scale method t…

Hybrid modeling for Earth system models

Recently I have been thinking about the philosophy of how we model the real world.
Many of us working on Earth system models have spent great effort trying to understand the physical world. For example, we now have a reasonably good idea of how water flows within the system.
Based on our understanding of physical laws, we have built quite a few process-based models to simulate and predict different types of systems.
On the other hand, there are also processes that are not "simply" governed by physics, for example human activities, or stochastic processes like wildfire.
For some processes, it would be more beneficial to use an agent-based modeling (ABM) approach. For example, how an individual tree interacts with its surrounding environment, competing for water and nutrients, could be implemented with ABM.
Other processes, such as animal behavior, can be modeled in a similar agent-based way.
The new question is: how can we merge the process-based model, or equat…
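To make the idea concrete, here is a toy sketch (purely illustrative, not an existing model): each tree agent draws water from the soil cell it occupies, so neighboring trees implicitly compete for the same pool.

```python
# Toy agent-based sketch (hypothetical): trees sharing one soil cell compete for water.
class TreeAgent:
    def __init__(self, demand):
        self.demand = demand      # water demand per time step
        self.stress = 0.0         # unmet demand accumulates as stress

    def step(self, soil_water):
        uptake = min(self.demand, soil_water)
        self.stress += self.demand - uptake
        return soil_water - uptake   # water left for the next agent

# Two trees sharing one soil cell that holds 1.5 units of water:
soil = 1.5
for tree in [TreeAgent(1.0), TreeAgent(1.0)]:
    soil = tree.step(soil)
```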

Depression filling on a sphere

One of my interests and ongoing projects is flow routing on a global scale, i.e., on a sphere.
This work has several major challenges awaiting resolution. Here I will explain a few of them.

To provide some background: we are not using a traditional square mesh to cover the sphere. Instead, we will be using a DGGS grid, in which most cells are hexagons; you can take a look at my previous posts.


- On a sphere, land/continents are not continuous; a landmass may be broken into several islands.
- Because of the unique structure of the hexagon mesh, parallel computing becomes difficult. It is not straightforward to decompose the global mesh into regular tiles the way a square grid mesh can be.
- The DEM remap/resample method needs to be improved. Current DEM datasets are still in square grid format, so a rigorous remap method is needed to assign elevation to the new hexagon cells.
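As a naive baseline (not the rigorous method this post calls for), one could simply average the square-grid DEM cells whose centers fall inside each hexagon. The point_in_hexagon helper and the data layout below are assumptions for illustration.

```python
# Naive brute-force baseline; a real implementation would use a spatial index and a
# conservative remap. point_in_hexagon(lat, lon, hexagon) is an assumed helper.
import numpy as np

def remap_dem_to_hexagons(dem, cell_lats, cell_lons, hexagons, point_in_hexagon):
    """Assign each hexagon the mean of the square-grid DEM cells whose centers fall inside it."""
    elevation = np.full(len(hexagons), np.nan)
    nrow, ncol = dem.shape
    for i, hexagon in enumerate(hexagons):
        values = [dem[r, c]
                  for r in range(nrow)
                  for c in range(ncol)
                  if point_in_hexagon(cell_lats[r, c], cell_lons[r, c], hexagon)]
        if values:
            elevation[i] = np.mean(values)   # stays NaN where no DEM cell falls inside
    return elevation
```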

Watershed Delineation On A Hexagonal Mesh Grid: Part B

This is the second part of a study; the first part can be accessed from here.
So we decided to use an ISEA-like grid, a hexagonal mesh, to address these issues.

(The figures are high resolution; you might need to zoom in to view them.)

What follows is something I would have learned if my major had been hydrology. Even though we have used ArcSWAT and ArcHydro many times, we seldom really looked into the algorithms, because they are mostly straightforward.

But when we decided to try this on a different mesh, a lot of details started to emerge, and we learned a lot along the way.
The overall workflow is pretty simple, but because the grid is different, we made several improvements. First, we need a new index system, and we need to rebuild the neighbor information.
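For illustration only: on a planar hexagon grid indexed with axial coordinates (q, r), the neighbor table is trivial because every cell has six fixed offsets. The ISEA-type grid used in the paper lives on a sphere, so its real index and neighbor structure is more involved than this sketch.

```python
# Axial-coordinate hexagon indexing on a plane (illustrative; not the ISEA index).
HEX_OFFSETS = [(+1, 0), (+1, -1), (0, -1), (-1, 0), (-1, +1), (0, +1)]

def build_neighbor_table(cells):
    """cells: iterable of (q, r) tuples; returns each cell's list of existing neighbors."""
    cell_set = set(cells)
    return {(q, r): [(q + dq, r + dr) for dq, dr in HEX_OFFSETS
                     if (q + dq, r + dr) in cell_set]
            for (q, r) in cell_set}
```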

We also need to do depression filling; here we used the priority-flood method, which works impressively well in this application. Without getting into details, this is how it works:

After this, we are able to c…

Watershed Delineation On A Hexagonal Mesh Grid: Part A

One of our recent publications is "Watershed Delineation On A Hexagonal Mesh Grid", published in Environmental Modelling & Software (link).
Here I want to provide some behind-the-scenes details of this study.

(The figures are high resolution; you might need to zoom in to view them.)

First, I'd like to introduce the motivation for this work. Many of us, including me, have done lots of watershed/catchment hydrology modeling. For example, one of my recent publications is a three-dimensional carbon-water cycle modeling study (link), which uses many watershed hydrology algorithms.
In principle, watershed hydrology should be applicable to large spatial domains, even the global scale. But why is no one doing it? I will use the popular USDA SWAT model as an example. Why is no one setting up a SWAT model globally?
There are several reasons we cannot use SWAT at the global scale:
- We cannot produce a global DEM with a desired map projection.
- The SWAT model relies on a stream network, which depends on the DEM.…

A workflow for distributed parallel data analysis on HPC with checkpoint

A typical task nowadays is to submit a job to the cluster to run some data analysis. But, as far as I know, this approach has some limitations.

- Lots of tasks take a long time to run, which means the walltime must be large even with multiple cores;
- The HPC queue is busy, and it takes forever to wait in the queue;
- If a job fails, we have to start over.
Therefore, I have designed a protocol and workflow to resolve these issues:
- It uses MPI for parallel computing, so we can make use of multiple nodes to speed things up;
- It provides a checkpoint feature, so it can restart if something goes wrong;
- It supports automatic resubmission if the walltime is not enough.
There are several implementations depending on the system. For example, on the SLURM system, a recurring job method can be used.
This design is expected to handle normal operations. However, there is a catch. It makes an assumption about the workload of each individual slave node: it assumes that within each walltime, all the slave …
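A minimal sketch of the checkpoint idea only (the file name and task layout below are made up, and the MPI and automatic-resubmission pieces are omitted): finished task IDs are appended to a log, so a resubmitted job skips them and picks up where the previous walltime ran out.

```python
# Checkpoint sketch (hypothetical file name and task layout; MPI/resubmission omitted).
import os

DONE_LOG = "tasks_done.log"

def remaining_tasks(all_tasks):
    """Read the checkpoint log and return only the tasks that have not finished yet."""
    done = set()
    if os.path.exists(DONE_LOG):
        with open(DONE_LOG) as f:
            done = {line.strip() for line in f}
    return [t for t in all_tasks if t not in done]

def run_with_checkpoint(all_tasks, run_one_task):
    """Run tasks one by one; a resubmitted job skips whatever is already logged."""
    for task in remaining_tasks(all_tasks):
        run_one_task(task)                # the actual analysis for one unit of work
        with open(DONE_LOG, "a") as f:    # record completion immediately
            f.write(task + "\n")
```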

The life span of a project

Recently I have been working on a project for which I need to prepare some maps. I had to go back and use some code I wrote almost 5 years ago. Then I realized that this has happened to me many times.

I once read a line saying, "If you think you are going to do this again in the future, you should write code for it." This is pretty much the major reason that drives me to write lots of code.

Sometimes I write code for fun, to demonstrate or test some ideas (I plan to share some examples in another post). For example, I wrote a program to test different methods for the carbon cycle, using an explicit method versus a matrix method (link).

But most of the time, I write code so I don't have to do something manually. Just imagine if you had to open 10k Excel files to do some simple math!

(I have never wanted to write a programming language, at least for now. Most of my code exists to solve real-world problems.)

And more importantly, I always look ahead.

For example, if I decide to write a program to cal…

A review on dissolved organic carbon modeling

ResearchGate and Mendeley recommend some nice papers to me based on my own publications. Here are some quick reviews of the DOC modeling papers I read today.

The first one:
Simulation of dissolved organic carbon concentrations and fluxes in Chinese monsoon forest ecosystems using a modified TRIPLEX-DOC model
This model is unlikely to be ready for spatial simulation, and it does not consider DOC from litterfall either.
There is also some confusion about the term "DOC leaching", which should include leaching from both litter and soil.

The second one:
ORCHIDEE MICT-LEAK (r5459), a global model for the production, transport, and transformation of dissolved organic carbon from Arctic permafrost regions–Part 1: Rationale, model description, and simulation protocol
This model seems to be very complex in terms of DOC modeling. It does capture some of the most important processes, but some statements are not convincing due to the complexity of the model. Also, it does not consider the lateral flow process well enough.

Compare…

Paper discussion: Streamflow in the Columbia River Basin

Streamflow in the Columbia River Basin: Quantifying Changes Over the Period 1951‐2008 and Determining the Drivers of Those Changes
(https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2018WR024256)

Routing Application for Parallel computatIon of Discharge (RAPID) is a matrix-based river routing model.

My concern is how to map ELM runoff to RAPID, because the scales of these two models are very different. ELM usually runs at 50 km to 100 km, while RAPID might be around 100 m. When the resolutions are this different, great uncertainty might be introduced.
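To illustrate the mapping step itself (a hypothetical sketch, not how the ELM-RAPID coupling is actually implemented): a conservative, area-weighted mapping distributes each coarse ELM grid cell's runoff to the fine catchments it overlaps, given a precomputed weight table.

```python
# Hypothetical weight table: weights[catchment] is a list of (elm_cell, area_fraction)
# pairs that sum to 1 for that catchment.
def map_runoff_to_catchments(elm_runoff, weights):
    """Area-weighted average of coarse ELM runoff for each fine RAPID catchment."""
    return {catchment: sum(elm_runoff[cell] * frac for cell, frac in pairs)
            for catchment, pairs in weights.items()}
```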




A review on litter decomposition modeling

Litter decomposition involves a series of processes.

Hydrological control occurs throughout nearly the whole decomposition process. As a result, if there are water-soluble materials in the litter, leaching will take them out.

The water-soluble materials include Dissolved Organic Carbon (DOC).

Some particulate materials (particulate organic carbon (POC), etc.) may also leach out.

The impact of ice content on hydraulic conductivity

In most aquifer tests, the hydraulic conductivity is treated as a function of the material, without explicitly considering the impact of ice content on K.
When soil or any other material is partially or fully frozen, its actual K decreases significantly. Here I want to explore how to model this impact in a soil hydrology model. Special attention will be paid to the different impacts on vertical and horizontal K.
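One parameterization reported in the frozen-soil literature is an "impedance factor" that reduces K exponentially as the ice fraction grows. The sketch below is only illustrative; the function name, symbol names, and the default omega are placeholders, not recommended values.

```python
# Illustrative impedance-factor form: K decreases exponentially with the ice fraction.
def frozen_k(k_unfrozen, ice_fraction, omega=7.0):
    """K_frozen = K_unfrozen * 10^(-omega * Q), with Q the ice fraction of total water."""
    return k_unfrozen * 10.0 ** (-omega * ice_fraction)
```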