Monday, February 27, 2012

My hydrology class (Il corso di Idrologia)

On this page, which I update regularly, I upload the material of the Hydrology course for the students of the bachelor degree in Civil and Environmental Engineering, produced by me, by Giuseppe Fornetta, and by other collaborators. The slides are publicly visible to everybody and uploaded to Slideshare (and should not be confused with those of past courses), from which they can be freely downloaded. The PDF documents must be downloaded directly, without a preview. The material is distributed under a Creative Commons license.

Material is in Italian. However, I will soon provide a page where all of it will be quietly translated into English (older versions can be found at: http://www.slideshare.net/SlidesAboutHydrology/presentations/).


1   - Introduction to the course
2   - Introduction to hydrology
3   - Introduction to geomorphology and to the delineation of river basins
4   - Introduction to GIS and to the JGrasstools^0. Using QGIS.
5   - Measurement and representation of hydrological data.
6   - Inference and descriptive statistics. Audio 2014 (24.5 Mb). Audio 2015 (30.7 Mb). A comic-strip guide to the topic.
7   - A review of probability.
8   - Introduction to R (the old slides, the new scripts). A couple of useful handouts: I and II^1^2
9   - Precipitation
10 - A bit of spatial interpolation (I used the slides by M. Alberti). The lecture notes by G. Raspa are an excellent introduction to geostatistics. Audio (24 Mb)
11 - Water in soils and in the subsurface (to which a new post is dedicated)
12 - The generation of surface runoff
 *  - Surface runoff
 *  - Flow in channels
13 - Solar radiation (a more recent version of the slides, in English, can be found here)
14 - Evapotranspiration.
15 - Snow, in four parts
16 - Effects of climate change on hydrology
______________________________________________________________________________
Off-syllabus material:

17 - The modern theory of the Instantaneous Unit Hydrograph and the Peakflow model (and its theory)
18 - Elements of remote sensing (Claudio Persello): I and II
19 - Something (relatively) simple about landsliding at the basin scale
______________________________________________________________________________


HERE THE RESULTS OF THE 2012 MIDTERM TEST
HERE THE RESULTS OF THE 2013 MIDTERM TEST
HERE THE RESULTS OF THE 2014 MIDTERM TEST
HERE THE RESULTS OF THE 2016 MIDTERM TEST
______________________________________________________________________________

Some example reports:

Geomorphological report

The reports were produced with the JGrasstools, and not all the analyses may be obtainable with other software.

Serafin-Ridolfi (91 Mb).
Fedrizzi-Zurlo (6.1 Mb)
Trombini-Djisseko-Dziali (14.2 Mb)

Rainfall report (with R)

Vieceli (3.7 Mb)
Lena-Santoni (2.2 Mb)
Cumer-Nessenzia (1.4 Mb)

______________________________________________________________________________
^* Questions of the 2012 midterm test
     Questions of the 2013 midterm test
     Questions of the 2016 midterm test

^0 - Here you can find some DEMs to practice with and, if needed, to compare your results: the Rio Valpiana (100 Mb)

^1 - My post on R can help you get started. On the R website you can find various resources for learning how to use R. An Italian group of R users is Rante, and there you can also find an introductory manual to R. Among the other tools, in English, there are those you can find here. Contributions to R follow one another so quickly that every day there are better ones, so keep an eye on the web. There is also a version of Matloff's book, The Art of R Programming.

^2 - At the link you can find the file of discharges 1990-2005.txt and the file of the rainfall of Paperopoli (Unix/Mac or MS-Windows) used in class. Here, instead, is the R script with all the commands executed in the lecture of April 2, 2012.



Thursday, February 23, 2012

Which hydrological model (is better)? - Q & A


A few months ago, I received an email where the following questions were posed, in the context of simulating the discharge of many (small) catchments in Spain. The author planned, and I agree with him, to generate a sequence of meteorological forcings with some model he did not specify^1, to produce time series of hydro-meteorological data with assigned statistics, and then to use them to drive some hydrological model. In this respect he asked:

"Q1. Will you suggest, from the point of view of computational time, to use distributed models (like SHE) and continuous, since we think to use weather time series of  thousands of years ? Personally I see the danger to be overwhelmed by data, and by so long computational time that we will not able to perform all the analysis we require with the adequate rigor (sensitivity analysis, and so on ...). "

A1 - Different people have different ideas of what a distributed model is. Kampf and Burges (2007) offered a review a few years ago. However, taking as reference our GEOtop, which is probably one of the most complex existing hydrological models, we can observe that it runs, on our laptop, a year-long simulation of a 10-20 square kilometer basin at 10 m resolution in, say, a day. So, simulating 1000 years would require approximately 3 years: which is clearly too long for any project. Using a faster machine would probably reduce the time by a factor of two. GEOtop is not parallelized, so, after an investment in rewriting the code, we could probably cut the simulation time by a factor of 100, also by using large parallel computers. Thus we would reduce the roughly three years of computation to three or four days: this could then be feasible. But this is obviously wishful thinking. Other models, like SHETRAN, DHSVM or tRibs, could probably already be faster than GEOtop, but possibly less adequate than GEOtop to simulate some of the processes. Besides, the timing above does not include the (wo)man-months required for characterizing the basin and for data collection (and organization), which would also take time. So, at present, it would be impossible to use GEOtop for such a task; maybe some other optimized model, on a multiprocessor server, could do it. It remains, however, a long-term objective for us to pursue. ^2
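As a back-of-the-envelope check of the numbers above, here is a small R sketch; all the figures are the rough ones quoted in the answer, treated as illustrative assumptions rather than benchmarks:

# Back-of-the-envelope walltime for a long continuous simulation.
# All inputs are illustrative assumptions, not measured benchmarks.
days_per_sim_year <- 1       # ~1 day of walltime per simulated year (GEOtop-like)
sim_years         <- 1000    # length of the synthetic weather series
speedup_hardware  <- 2       # a faster machine
speedup_parallel  <- 100     # hypothetical parallelization of the code

serial_days   <- days_per_sim_year * sim_years              # ~1000 days, i.e. ~3 years
parallel_days <- serial_days / (speedup_hardware * speedup_parallel)

cat("serial:  ", round(serial_days / 365, 1), "years\n")
cat("parallel:", round(parallel_days, 1), "days\n")         # a few days: feasible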
  

Q2. Would it be more feasible to consider a lumped or semi-distributed model? In this case, considering that the interest is in describing only the hydrological behavior during floods, which kind of infiltration scheme would you suggest? Would you suggest a model like the Sacramento Soil Moisture Accounting model? Would it be accurate enough in its forecasting? Or would it be better to use some model based on the solution of the Richards equation, maybe using a spatially averaged characteristic curve?

A2 - I partly gave the answer in responding to Q1: having said that any model is, in a sense, a conceptualization, you need to choose an approach lighter than GEOtop. I also responded to this question in the presentation I gave in Montpellier. So, you need to use a semi-distributed model.
Using a solver of the Richards equation would essentially bring you back to the option of Q1 with its problems, unless you think of a 1-D solver coupled with a 2-D groundwater equation. In any case you would further be required to have a module for runoff and channel routing to patch together with the subsurface flow modules. Excluding that you can do it by yourself, you need to find a model that already does it. tRibs or TopoFlow should, for instance, fall into this category. We are working toward a similar solution with our integrator of the Boussinesq equation: but it is still "work in progress".
Thinking of semi-distributed models for solving your problem, you should look at a model like our JGrass-NewAGE. There, however, "the processes at hillslope scale are strongly conceptualized, where representation of the physics is minimised, and the resolution in space and time is maximized, and the focus is upon predicting emergent behaviors rather than system details" [Lanni, 2012]. Other models of reference are Topkapi and those used in the Distributed Model Intercomparison Project (DMIP) [Reed et al., 2004]. I do not know how the Sacramento model really performs, since I never used it. But my inclination is toward other models of more modern conception.^3

Q3. In relation to Q2, do you think that the theory by Reggiani et al., relative to the REW concept (Representative Elementary Watershed), is mature enough to be used outside academia?

A3 - Obviously Paolo would say yes, it is mature, but you have to read his literature to judge whether it is. One limit I found in Paolo, Siva and Majid's original paper was that the identification of the REWs was left unresolved. I know that Paolo also deployed a model (which I do not know in detail), but the conditions of its availability are not clear. Similar to the REW idea, probably formulated in a less fancy way, is the concept of Hydrologic Response Units (HRUs), for which you can find references in the JGrass-NewAGE paper, and which has many implementations in the work, for instance, of Peter Krause and Daniel Viviroli. Also Roger Moussa's MITHAS, whilst in a different context, follows the same idea.

In any case you have to decide the time step at which you want your response. I was assuming that you were interested in relatively small catchments, and therefore it is mandatory to have hourly, or sub-hourly, discharges. If you are interested in a more aggregated time response, i.e. daily discharges, other models could work. Personally, I have prejudices against this kind of "physical" model, in the sense that, I believe, the physics of flood generation in small catchments works at time scales smaller than the day. However, many models, SWAT being one, seem to work^4.

Q4. In case I decide to use a semi-distributed model, which IUH and infiltration method would you use?

A4 - Using the IUH is yet a different game. See the reference here, for instance. The IUH, or GIUH, are models of flood peaks, where many assumptions are made and taken as valid. However, the theory, as you noticed, leaves out the determination of infiltration. We have a model of the IUH, called Peakflow, and there we use a provocative Dunnian saturation-excess scheme for generating runoff. Indeed, in Peakflow one can theoretically use the method s/he wants (and you remain with the problem of choosing one), even classical bucket-type models; but, so far, we have not implemented them. Some friends use the SCS method, but they take the curve numbers out of a calibration process, and not from the tables of the original system. Others use a bucket model (which they call Green-Ampt if they regulate the filling of the bucket with the Green-Ampt scheme). These last methods can be directly applied to cut the rainfall and to produce an effective precipitation. In any case, I would prefer a continuous method for your modeling, and use the GIUH or IUH for checking the results. A definitive guide on the GIUH is this paper.
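To make the role of the IUH concrete, here is a toy sketch of my own (not Peakflow): the discharge hydrograph is obtained by convolving an effective-rainfall series with a unit hydrograph, here a gamma-shaped IUH chosen purely for illustration.

# Toy example: discharge as the convolution of effective rainfall with an IUH.
# The gamma-shaped IUH and all numbers are illustrative assumptions.
dt    <- 1                                  # time step [h]
t     <- seq(dt, 72, by = dt)               # 72 hours
iuh   <- dgamma(t, shape = 3, scale = 4)    # unit hydrograph [1/h]
p_eff <- rep(0, length(t))
p_eff[5:8] <- c(2, 6, 4, 1)                 # effective rainfall burst [mm/h]

# discrete convolution: Q(t) = sum_s p_eff(s) * iuh(t - s) * dt
Q <- rep(0, length(t))
for (i in seq_along(t)) {
  for (s in 1:i) {
    Q[i] <- Q[i] + p_eff[s] * iuh[i - s + 1] * dt
  }
}
plot(t, Q, type = "l", xlab = "time [h]", ylab = "runoff per unit area [mm/h]")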

Q5. Would a model based on the topographic index, or on the TOPMODEL assumptions, be indicated for this type of analysis?

A5 - The topographic index scheme is a runoff-production scheme^5 more than a rainfall-runoff model. It comes with its own limits. It was turned early on into a rainfall-runoff model [e.g. Beven, 2000; Franchini et al., 1996], but the core theory does not deal with flood wave propagation and aggregation (the GIUH does). Undoubtedly, it works^4 for producing the volume of the rainfall-runoff, and I myself used the topographic index to assess the initial wetness condition of a catchment in the Peakflow paper (and model) and in D'Odorico and Rigon [2003]. The reason it works is that, for small basins, as we argue in some of our papers, the residence time of water in channels is negligible compared to the residence time in hillslopes. Therefore, when you account properly for the timing of the runoff production, and since the production of discharges is an aggregation process in which, at the end, just the total "effective volume" counts, you have treated most of the information you need to accomplish the task of predicting discharges. In any case, to be really usable, the topographic index needs to be integrated with other features. In the past some did it, but I would not push it further. I would use, at least, a dynamic topographic index, as suggested by Barling, Moore and Grayson [1994] and improved by many others.
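For reference, the (static) topographic index of Beven and Kirkby is defined as

\[
TI = \ln\!\left(\frac{a}{\tan\beta}\right),
\]

where $a$ is the upslope contributing area per unit contour length and $\tan\beta$ is the local topographic slope; points with a high $TI$ (large drained area, gentle slope) are the first to saturate. The dynamic variants mentioned above essentially let the effective contributing area $a$ vary in time with the wetting history instead of using the full upslope area.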

Finally, do not forget that where mountain areas are present, snow accumulation (as a source of abstraction of precipitation) and snow melting (as a source of discharge) are important issues to resolve. This complicates the scenery, and makes the many models that do not account for snow not really usable.

If you read this post, you are probably interested also in the Reservoirology one.

Notes

^1 - That of weather generators is indeed an interesting topic, which I will cover sooner or later.

^2 - Other issues regard the calibration of GEOtop. Even if it is a distributed model, the parameters in its equations are certainly effective parameters (e.g. Beven, 1989), and therefore a certain calibration is needed for it to properly reproduce fluxes. Calibration is a time-consuming process, which could also be overwhelming in the context of distributed models like GEOtop. In fact, calibration experts often use in their papers very unrealistic models, which cannot be taken seriously by those who want to use hydrological models operationally.

^3 - Every one of my colleagues has his own model, which is not a proof of the maturity of our community, indeed: many of them work fairly well, few of them are really available, and even fewer are usable at an operational level. So, in the world, everybody uses HEC-HMS. What we are trying to do with our involvement in OMS3 is to change this situation (e.g. posts on: Going beyond the present state-of-art, Reproducible Research).

^4 - Klemes, in his "Dilettantism in Hydrology: Transition or Destiny", argued that: "MODELS THAT WORK WELL are THE GREATEST DANGER TO PROGRESS IN HYDROLOGY. For a good mathematical model it is not enough to work well. It must work well for the right reasons ..."

^5 - Or, if you see it from the soil point of view, it is a subsurface flow model.

Thursday, February 16, 2012

GGPlot: a way to do good graphics and understand data

Making good graphics is very important for understanding data and for appropriately promoting research achievements. ggplot2 is a recent R package for which there are several good tutorials and introductions. Recently, however, a YouTube webinar was posted by the very author of the package, Hadley Wickham. Spending a little time listening to him is really worthwhile. Click on the picture to go to the page where the video, the slides, and other information are available.





Hadley kindly provided the following resources in his slides:
ggplot2 mailing list: http://groups.google.com/group/ggplot2
stackoverflow: http://stackoverflow.com/tags/ggplot2
Lattice to ggplot2 conversion: http://learnr.wordpress.com/s=lattice
Cookbook for common graphics: http://wiki.stdout.org/rcookbook/Graphs/
ggplot2 book: http://amzn.com/0387981403
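For those who have never touched the package, here is a minimal example of my own (using the mpg dataset shipped with ggplot2), just to show the grammar-of-graphics style:

# Minimal ggplot2 example: engine displacement vs highway mileage,
# colored by car class, using the mpg dataset shipped with ggplot2.
library(ggplot2)
ggplot(mpg, aes(x = displ, y = hwy, colour = class)) +
  geom_point() +
  labs(x = "engine displacement [l]", y = "highway miles per gallon")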

Sunday, February 12, 2012

Geopaparazzi 2.6.0 is out

Just look at:

http://jgrasstechtips.blogspot.com/2012/02/geopaparazzi-260-is-out-osm-mixare-and.html
to see what it does. And it does a lot, for free, on any Android cell phone!


Geopaparazzi is a tool developed to do very fast qualitative engineering/geologic surveys. Even if its main aim is in the field of surveying, it contains tools that can be of great use also to OpenStreetMappers, as well as to tourists who want to keep a geo-diary.
Geopaparazzi is now available on the Android Market. Search for geopaparazzi on your phone or get it from the online Android Market.

Geopaparazzi is developed by HydroloGIS.

Ignacio Rodriguez-Iturbe's talk at Berkeley in 2007 on hydrology in the 21st century

A talk by one of my masters, Ignacio Rodriguez-Iturbe, in which he shows some of the work we did together more than ten years ago.

Obviously he, and Andrea Rinaldo, were illuminating my way: but it was also my own way. It is always a pleasure to hear "El Jefe" talk.


Ignacio's talk starts around 6:10. The reference for the work in which I participated is "The Book": Fractal River Basins: Chance and Self-Organization, Cambridge University Press, 1997. Ecohydrology is partially covered in Ecohydrology of Water-Controlled Ecosystems: Soil Moisture and Plant Dynamics.

Thursday, February 9, 2012

Reproducible Research (and papers)


Many papers I read in hydrology present research that is very difficult to reproduce. Because, as a scientist, I would like to reproduce the results of what I read (this is science, indeed!), some rules should be followed. I found a group of scientists that initiated a Reproducible Research web site, especially directed at image-processing colleagues, but easily extendable to hydrology and related fields. They offer a how-to guide, which is reported verbatim here below:



"Of course, it all starts with a good description of the theory, algorithm, or experiments in the paper. A block diagram or a pseudo-code description can do miracles! Once this is done, make a web page containing the following information:

  • Title 
  • Authors (with links to the authors' websites) 
  • Abstract 
  • Full reference of your paper, with current publication status, and a PDF of your paper 
  • All the code to reproduce all the results, images and tables. Make sure all the code is well documented, and that there is a readme file explaining how to execute it 
  • All the data (images, measurements, etc) to reproduce all the results, images and tables. Add a readme file explaining what the data represent 
  • A list of configurations on which you tested your code (software version, platform) 
  • An e-mail address that people can use for comments and remarks (and to report bugs) 

Depending on the field in which you work, it can also be interesting to add the following (optional) information to the web page:
  • Images (add their captions, so that people know what Figure xx is about) 
  • References (with abstracts)
For every link to a file, add its size between brackets. This allows people to skip large downloads if they are on a slow connection.
For examples, see this list of reproducible papers at LCAV, EPFL. "

Obviously they also link a blog and various RR resources. RR is, incidentally, the same as my initials.

Now there is also a book about reproducible research made with R by Christopher Gandrud. The chapter can be found here.
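In the spirit of the list above, even a plain, self-contained R script that regenerates every result from the raw data goes a long way. A minimal sketch (file and column names here are hypothetical placeholders):

# reproduce.R -- regenerates all results of the paper from the raw data.
# File and column names are hypothetical placeholders.
set.seed(42)                                      # make any random step repeatable
obs <- read.csv("data/discharge_1990_2005.csv")   # raw data shipped with the paper
fit <- lm(discharge ~ rainfall, data = obs)       # the (toy) analysis
print(summary(fit))

pdf("figures/figure1.pdf")                        # every figure rebuilt from scratch
plot(obs$rainfall, obs$discharge,
     xlab = "rainfall [mm]", ylab = "discharge [m3/s]")
abline(fit)
dev.off()

writeLines(capture.output(sessionInfo()), "sessionInfo.txt")  # record the configuration tested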

Monday, February 6, 2012

GEOtop history up to the first public release: part II

First Part

GEOtop 0.75

(mostly financed by COFIN 2001, THARMIT Eu Project, CUDAM - CofinLab 2001, ASI 54/2000)

With the Master's thesis of Giacomo Bertoldi [2000], it was decided to throw away the PM equation for estimating evapotranspiration (actually we kept it for comparison) and to solve directly the energy budget at any point of the basin. This was the birth of GEOtop 0.75, which is thoroughly documented in Bertoldi et al. [2011] and, BTW, in Rigon et al. [2006]. GEOtop 0.75 was actually a SVAT model plus a rainfall-runoff model coupled together, and was able to predict, besides the "normal hydrological observables", soil temperature.

GEOtop 0.75 needed several parameters to be run; however, the modeling could have been considered parsimonious if its complexity was compared with its prognostic capabilities. The user could switch off the SVAT part and have a parametrically parsimonious rainfall-runoff model, or vice versa s/he could switch off the rainfall-runoff model and have a reasonably simple SVAT model.
Large efforts were made in cleaning the old code and improving the quality of inputs and outputs. The original model of Wigmosta et al. [1994] is very similar to version 0.75 of GEOtop (but this was mostly a case of evolutionary convergence, since the comparison came after the GEOtop implementation).

In doing this version of GEOtop, we wanted to pursue the modelling of the entire surface hydrological cycle, even if a reasonable snow modeling still had to arrive. We were also looking to have a good tool for doing eco-hydrology. The revamp of eco-hydrology was in fact already seeded in Rodriguez-Iturbe's mind when I was working with him at Texas A&M University (1994-1996), and I was fully aware of his ideas from the beginning of the project.

For the validation of the model, Tom Over had a key role. Besides giving a lot of ideas for making the concepts behind the model clear, he suggested using the Southern Great Plains 1997 experiment data set, which we promptly did, as it appears in the first journal paper about the model, Rigon et al. [2006]. In fact, one does not realize how scarce hydrological data are until one has a good model to cope with them.

Based partially on GEOtop 0.75 (and on the subsequent GEOtop 0.875) came all the work by Reza Entezarolmahdi, who tried to create an automatic calibration system for the model. He used MOSC-EM which, however, was never really integrated into the model. The work of Reza was really interesting from many points of view, but probably a little early with respect to our times and scheduling.


GEOtop 0.875

(mostly financed by TIDE EU Project, CUDAM Cofinlab COFIN 2001, THARMIT EU project)

This version finally achieved the goal of having a snow accumulation and melt model (derived from the Utah Energy Balance, UEB, by Tarboton and Luce [1992]), implemented by Fabrizio Zanotti [2003] with the help of Giacomo Bertoldi. It also included a post-processor, the S-FACTOR (implemented by Christian Tiso [2003]), which used information on soil moisture to estimate the triggering of instabilities on hillslopes^1. Both implementations were the outcome of two M.S. theses. Snow melting and soil freezing are essential components of the hydrological cycle of mountain catchments and cannot be overlooked. Landslide and debris-flow triggering are also issues of particular relevance in mountain areas, such that floods in mountain areas are usually the combined effect of large liquid and solid discharges whose effects cannot be separated: with this version, GEOtop started to be a reasonable tool for studying all of these phenomena. The reader can appreciate the differences between the previous version, GEOtop 0.5, and this one. Turbulent fluxes were now evaluated with a proper schematization of the PBL, and the heat equation was quietly included in the picture; snow was modeled with a physically based (mono-layer) model.

With version 0.875, let us say 0.875b, we also attacked the problem of a sound hillslope-hydrology modeling, as ancillary to studying hillslope stability. The following is the rationale behind the project, in words I wrote at that time.

"So far , our understanding of mountain catchments in fact has been based on hillslope hydrology, as reviewed for instance in Wipkey and Kirkby [1978], and  the "perceptual" hillslope model derived from the assumptions: that it is possible to neglect the transients in the water fluxes [in the sense clarified in Iverson, 2000]; that topographic gradients dominate the hydrologic response; that hydraulic conductivity strongly decreases with depth in the soil, and, not independently, that runoff occurs mostly owing to saturation excess. 
This last assumption, in turn, is  based upon the results of a long series of experiments from the late seventies on by American geomorfologists (Dunne, Black, Dietrich, Montgomery, Torres), and was supported by many others (among these: Moore, Grayson, Sivapalan, Wood). 
These above studies and research activities changed the belief spread by Horton that runoff was mostly due to a infiltration excess mechanism. Developing hydrological models based on saturation excess ideas (which involved further simplification) originated a series of models among which the already cited TOPMODEL [Beven and Kirkby,1979; Sivapalan et al, 1991; Franchini et al, 1996] is the most successful product in the rainfall-runoff area and SHALSTAB in landslide mapping. 
 There are many aspects that could be improved with respect to these modeling strategies. 
First of all:  in characterizing  the subsurface flow field more in terms of total head, even if in simplified form as in Iverson [2000], than in terms of the topographic gradient, as a first step toward the integration of the three dimensional Richards equation."

This step was implemented in the Master's thesis of Davide Tamanini [2003], which made GEOtop able to simulate transient subsurface flow, and both saturation-excess and infiltration-excess runoff. A first parameterization of the soil water retention curves was also implemented in the model. It is this code that was used in the first journal papers on GEOtop (Zanotti et al., 2004; Rigon et al., 2006; Bertoldi et al., 2006). Upon this code was based the work by Silvia Simoni, helped by Fabrizio Zanotti, which produced the GEOtop-SF post-processor, able to estimate statistics of the stability of a hillslope. This work was realised to complete Simoni's Ph.D. thesis and Simoni et al. [2008].

In GEOtop 0.875 the integration of the Richards equation followed a custom numerical scheme that was exceedingly complicated and non-standard. Moreover, the integration scheme was not fully 3D, but could better be defined as 2D + 1D, where "2D" stands for the lateral flow, obtained by using the Darcy-Buckingham law, and "1D" stands for the resolution of a one-dimensional Richards equation. Despite these limitations, we could obtain a reasonable reproduction of soil moisture distributions, good discharges at the outlet of basins, an excellent reproduction of summer soil temperatures, and what we considered a good reproduction of turbulent heat and evapotranspiration exchanges.
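For the record, the equation in question is the Richards equation, in its mixed form

\[
\frac{\partial \theta(\psi)}{\partial t} = \nabla \cdot \left[ K(\psi)\, \nabla (\psi + z) \right],
\]

where $\theta$ is the volumetric water content, $\psi$ the pressure head, $K(\psi)$ the hydraulic conductivity and $z$ the elevation head. The 2D + 1D scheme mentioned above can be read as a splitting of this operator: a 1D vertical Richards solver per soil column, plus a lateral Darcy-Buckingham flux $\mathbf{q} = -K(\psi)\,\nabla(\psi + z)$ exchanged between adjacent columns.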



GEOtop 0.9375 and subsequent versions till the first public release

(mainly financed by Projects with Servizio geologico PAT, progetto MORFEO by ASI, and EU IRASMOS and  AQUATERRA projects)

The new development started with the idea in mind that we had, sooner or later, to switch to a version of GEOtop with a full 3D integration of the Richards equation. Good ex-post results in the simulations we made to validate the code were not enough to really cope with the mainstream literature.
However, the first new improvement of GEOtop was in the direction of including a multi-layer modeling of snow and a first core of the freezing-soil subroutines. This task was mainly accomplished during the Ph.D. thesis of Stefano Endrizzi [2007], and greatly improved in his subsequent work at Saskatoon, working with Phil Marsh and Bill Quinton, and recently at the University of Zurich, collaborating with Stephan Gruber.

Why complicate even more an already complex model? Moreover, why get a new snow model, if the Utah Energy Balance seemed to work fine, as written in Zanotti et al. [2004]?

The rationale behind this choice was essentially that the snow water equivalent was not enough for comparing snow measurements in the field with model outcomes. Clearly, snow water equivalent (SWE) was enough just for those willing to cope with the total water volume generated after snow melting, but not sufficient for studying and understanding the processes behind snowpack evolution and ablation, nor even for having a reasonable estimate of the soil temperature under the snow, and of other interesting prognostic variables, like snow density.

However, Stefano's efforts were not limited to snow. He worked hard to get a consistent integrator of the Richards equation, which he based on a Newton-Krylov method (Kelley, 2003). Besides, he decided to change the surface water flow equation and numerics, by using the shallow water equations integrated with a robust but explicit method. This resulted in a very stable and reliable code that constitutes the core of the first public version of GEOtop. In fact, Stefano decided to move from the fractions of the geometric series of 1/2 to the integer 1.
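In rough terms (a schematic of the method named here, not of the actual GEOtop code), a Newton-Krylov integrator of the discretized Richards equation solves, at each time step, the nonlinear system $\mathbf{F}(\boldsymbol{\psi}^{n+1}) = \mathbf{0}$ by iterating

\[
\mathbf{J}(\boldsymbol{\psi}^{k})\, \delta\boldsymbol{\psi} = -\mathbf{F}(\boldsymbol{\psi}^{k}),
\qquad
\boldsymbol{\psi}^{k+1} = \boldsymbol{\psi}^{k} + \delta\boldsymbol{\psi},
\]

where the linear system for the Newton correction $\delta\boldsymbol{\psi}$ is solved only approximately with a Krylov method (e.g. conjugate gradients), so that the Jacobian $\mathbf{J}$ is needed only through matrix-vector products (Kelley, 2003).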

The collaboration of Stefano and Matteo Dall'Amico also produced a consistent integrator of the freezing soil moisture, after Matteo, in his Ph.D. thesis, disentangled, at least for ourselves, a lot of thermodynamics (and, together, we introduced a simple and "normal" thermodynamic notation). This work has been documented in Dall'Amico et al. [2011].



Various Directions 

Up to version 0.875, GEOtop was pretty much a home-made effort, mainly pursued by Master's and Ph.D. students of the University of Trento. However, since then, either because the Ph.D. students became doctors and spread around, or because others discovered the potential of our work, GEOtop started to become really internationally used. Han Xunjun in China used GEOtop in his data assimilation system, implementing an ensemble Kalman filter (in Python). The Lausanne group under the direction of Marc Parlange, with the collaboration of Silvia Simoni and the direct coding efforts of Thomas Egger, implemented a real-time version of the model that produces its results every day (http://lsir-hydrosys01.epfl.ch:22006/). John Albertson and his group implemented an erosion module to be coupled with GEOtop. In Trento, a version of GEOtop, called GEOtop-EO, implemented a prototype infrastructure that includes, besides the modeling core, a geographical database with a raster service (built upon RAMADDA) and a visualization system based on JGrass. This was done for the MORFEO project. At Bolzano, in EURAC, Giacomo Bertoldi and Stefano Dallachiesa endowed GEOtop with an external (so far) dynamic vegetation model (itself derived from the elaboration of previous work by Albertson and Montaldo). Worth mentioning in this section is also the work of Emanuele Cordano. He dedicated a lot of time to defining the "keyword schemes" that eventually, after some reworking, became the standard for the I/O of GEOtop. He also started a parallel work on Boussinesq's equation, which gave us a first, better idea of what Newton's method of integration is. Moreover, he studied and implemented the first NetCDF O (output) version of GEOtop. Last, but not least, Mountain-eering made GEOtop the center of its business plan, supported the completion of the freezing-soil module, and is going to develop a new NetCDF input/output system (helped by Emanuele) and some data assimilation scheme for snow measurements.

Credit should also be given to the work of the Lausanne group for having included in GEOtop the MeteoIO environment, which was the way to link the model to a real-time acquisition system. Stefano Endrizzi himself, when at Saskatoon, discovered the work of Liston and Elder [2006] with MicroMet, and included it in the GEOtop distribution with the permission of the authors.

Other players also came into the game. Arpa Val d'Aosta decided to make GEOtop the principal tool for doing analyses on snow and permafrost, and is going to support its improvement and usability. The Karlsruhe Institute of Technology in Garmisch-Partenkirchen (and particularly Harald Kunstmann and coworkers) adopted it for simulations in one of their TERENO experiments.

This is just to mention some partial developments, not all of which have yet flowed into the main version of the model. All of these efforts, in fact, could not yet really be unified into a single product.
Notes:

^1 - This eventually evolved into GEOtop-SF by Silvia Simoni, documented in Simoni et al. [2008]

Friday, February 3, 2012

GEOtop History up to its first public release: Part 1


``As scientists we are intrigued by the possibility of assembling our knowledge into a neat package to show that we do, after all, understand our science and its complex interrelated phenomena.'' (W.M. Kohler, 1969).

I stole this statement from Keith Beven's [2000] book, Rainfall-Runoff Modelling: The Primer, because it clearly represents one of the motivations that stand behind the development of GEOtop.
As some of you may know, the versioning at the beginning of GEOtop development followed the geometric series of two: the first version was 0.5, the second 0.75, the third 0.875, and so on. The message was that such a model solution is never definitive (the version 1); besides, I was always intrigued by Zeno's paradox, and by the versioning systems of TeX (asymptotically approaching Pi) and LaTeX (moving up to the number e).
Those who followed up the main development decided to change the versioning to a more normal sequence. That was the first public edition. Further details of this history can be found in the presentation I gave at CHAMDA II in Princeton in 2004, available here in a revised version.



GEOtop 0.5

The development of the first version of GEOtop (0.5) was mostly financed by the Autonomous Province of Trento through the Serraia project, and by the Italian Ministry of Research and University through the Cofin 1999 project.

The very first step was reading, and reading again, Dara Entekhabi's review of the modeling of the whole hydrological cycle [e.g. Marani and Rigon (eds), 1997]^1, and the development started with the Master's thesis of Paolo Verardo and his subsequent work in 1998. Around a year was spent implementing a decent model of evapotranspiration in a complex-terrain environment, according to the Penman-Monteith (PM) scheme (find the formula either in Dara's lecture above, or in my lecture here), together with all the necessary treatment of incoming radiation. All this work was not dedicated to the PM formula, which was obviously trivial to implement, but to finding a proper estimation of the radiation, which included the view angle and the shadowing routine.^2
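For completeness, the PM scheme referred to here is, in its standard textbook form,

\[
\lambda E \;=\; \frac{\Delta\,(R_n - G) \;+\; \rho_a c_p\,\dfrac{e_s - e_a}{r_a}}
                     {\Delta \;+\; \gamma\left(1 + \dfrac{r_s}{r_a}\right)},
\]

where $R_n$ is the net radiation, $G$ the ground heat flux, $e_s - e_a$ the vapour pressure deficit, $\Delta$ the slope of the saturation vapour pressure curve, $\gamma$ the psychrometric constant, $\rho_a$ and $c_p$ the air density and specific heat, and $r_a$, $r_s$ the aerodynamic and surface resistances. As said, the hard part was not this formula but feeding it with a credible $R_n$ on complex terrain.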

The problem of data regionalization (the only data we had were those coming from traditional hydro-meteorological stations at single points, and we did not have many of them) was faced for the first time.
Apart from the geomorphological data that we extract from DEMs (the "sine qua non" basis of all our work), we had to regionalize (make spatial): the air-surface temperature (varying with the elevation of the terrain), the net radiation (and eventually the evaluation of an atmospheric thickness, which has to be regionalized too), and the wind speed.

In sequence, it was decided to use: kriging techniques and the hypothesis of an adiabatic temperature profile (for air temperature)^3; the results of Brutsaert's [1983] paper (for atmospheric emissivity)^4; and a constant (or occasionally kriged) wind speed everywhere. This was performed through the use of external programs.
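The adiabatic-profile assumption for air temperature amounts, in practice, to something like

\[
T(z) \;=\; T(z_{\mathrm{ref}}) \;-\; \Gamma\,(z - z_{\mathrm{ref}}),
\qquad \Gamma_{\mathrm{dry}} = \frac{g}{c_p} \simeq 9.8\ \mathrm{K\,km^{-1}},
\]

i.e. station temperatures are typically first reduced to a common reference elevation, interpolated (by kriging) in the horizontal, and then brought back to the actual elevation of each grid cell with the lapse rate $\Gamma$.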

The heat conduction into the ground was parametrized as a linear combination of sinusoidal functions, as Entekhabi suggested [1997]. That work was also inspired by the routines of IPW (Image Processing Workbench) [Frew, 1990].
Actually, we tried to make IPW work concurrently with the nascent GEOtop: but its pervasive scripting basis (scripting is good, but there is a point after which it makes the code organization unclear), the discontinued support, and our ignorance of the IPW code internals (and just ignorance), led us to build a new system from scratch.
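The sinusoidal parametrization of ground heat conduction is presumably along the lines of the classical solution of the heat equation for a periodic surface forcing,

\[
T(z,t) \;=\; \bar{T} \;+\; \Delta T\, e^{-z/d}\, \sin\!\left(\omega t - \frac{z}{d}\right),
\qquad d = \sqrt{\frac{2\kappa}{\omega}},
\]

where $\bar{T}$ is the mean temperature, $\Delta T$ the amplitude of the surface oscillation, $\omega$ the (daily or yearly) angular frequency, $\kappa$ the soil thermal diffusivity and $d$ the damping depth; the ground heat flux then follows by differentiating in $z$.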

Despite the approximations introduced, the GEOtop 0.5 model worked fairly well in estimating the net longwave and shortwave radiation at any point across a basin and at any hour of the day, and could also give what appeared to us to be reliable estimates of the potential evapotranspiration on a daily basis. However, obtaining the real evapotranspiration was a different question (and, obviously, validating it an even different one).
Marco Pegoretti [1999] added to the evapotranspiration modules a rainfall-runoff model (temporarily called GEOMODEL). The approach we took to the rainfall-runoff problem was strongly influenced by the work of Rodriguez-Iturbe and Valdes [1979] and Gupta and Waymire [1980], and we were reluctant to abandon the simplicity of a GIUH-based model in favor of a fully distributed model (which we did later).

One of the many ideas behind the theory of the GIUH derives from the observation that the river basin is a complex system (an interplay of hillslopes and channels) but, at least for the forecasting of floods, simple models work fairly well (leading to the conclusion that dynamics simplify the complexity and wipe out part of the heterogeneity underneath)^5.

The big problem with the GIUH approach, however, was (and is) the determination of the effective rainfall, i.e. the correct separation of surface runoff (interpreted as the cause of the flood surge) from subsurface flow (which must actually be treated separately and routed to the channels in a slower way), especially in dependence on storm events of diverse intensity, duration and inter-arrival time.

A model more or less contemporary to the GIUH is the TOPMODEL by Beven and Kirkby [1979]. It is based on the paradigm that runoff production is due to saturated areas (according to Dunne and Black, 1970). Thus, once one knows which areas are saturated and describes their growth during an event, the problem of the runoff coefficient is almost solved, and the routing of water to the outlet can be accomplished by some simple mechanism (the Muskingum-Cunge model, at least in the original TOPMODEL papers).^6

Thus, an idea could have been to merge the best of the two formulations, the GIUH concept with the TOPMODEL. However, the hypotheses on which the TOPMODEL is based, mainly the stationarity of the hillslope subsurface fluxes, simplify the life of the modeler but introduce several limitations, and need several work-arounds, as described in Beven et al. [2002], to be applied with success. In fact, when the final goal of modeling is not simply the production of a well-fitted flood wave but, for instance, the estimation of local soil moisture contents, the TOPMODEL fails to be accurate enough [e.g., Grayson and Wilson, 2002].
Due to its limitations, the TOPMODEL's parameters become "effective" parameters and lose their original physical significance (for instance, the hydraulic conductivity cannot be validated by local field measurements), and need to be "calibrated" ex post. Other limitations will be mentioned below when talking about the GEOtop 0.875 version. In any case, the TOPMODEL concept has been demonstrated to be a good tool to model floods in small-to-medium catchments, and has been considered the reference hydrological model by many researchers during the '90s. The TOPMODEL's ability to forecast floods derives also from its account (trivial indeed) of the topology and geometry of small-catchment flow paths: it was shown, in fact, that in small watersheds (up to at least 1000 square kilometers), hillslope residence time dominates the characteristic time of flood formation, and that the topology and geometry of river basins are sufficient, with very minimalist dynamics, to explain the shape of floods [e.g. Rinaldo et al., 1991; Rigon et al., 1996; Rinaldo et al., 1995; D'Odorico, 1996; D'Odorico and Rigon, 2003].

Upon all the above arguments, it was decided to build a completely new subsurface and surface model, still driven by gravity (i.e. by the topographic slope, as in the TOPMODEL, and not by the total hydraulic head) but, as a first approximation to the final wish of having a better physically-based theory, introducing a buffer to cope with infiltration into the vadose zone. Subsurface flow was produced only by the saturated layer (including the capillary fringe), if present, and lateral surface runoff was routed as a kinematic wave (with the Manning/Gauckler-Strickler equation for velocities).
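Translated into formulas (the standard ones, not GEOtop-specific), the kinematic-wave routing with a Gauckler-Strickler/Manning closure reads

\[
\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = q_L,
\qquad
Q = A\,v,
\qquad
v = \frac{1}{n}\, R_h^{2/3}\, S_0^{1/2},
\]

where $A$ is the flow cross-section, $Q$ the discharge, $q_L$ the lateral inflow per unit length, $v$ the velocity, $n$ the Manning roughness, $R_h$ the hydraulic radius, and $S_0$ the topographic (bed) slope used in place of the friction slope, which is precisely the kinematic approximation.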

In doing this, we also tried a better characterization of flow paths, with reference to the bedrock instead of to the surface topography [see McDonnell et al., 1996]: this could be done by measuring soil depth or interpolating it, for instance as in Heimsath et al. [1997] or Roering et al. [1999]; please see the overlooked Bertoldi et al. [2006] for the details.

In this separation of surface and subsurface fluxes, GEOtop 0.5 was similar to the grid-based THALES [Grayson et al., 1994a, 1994b] or the more recent NEWTHALES.

At first, the Verardo-Pegoretti-Rigon version of GEOtop (GEOtop 0.5) could work without an explicit channel routing, by assigning a locally variable roughness and hydraulic radius; if channels were determined by accurate topographic analysis, they could be explicitly treated.

In this case, channel routing was performed with the GIUH theory as in Rinaldo et al. [1995], where channel celerity and hydrodynamic dispersion were considered spatially uniform and constant in time.

Instead of the effective rainfall, the spatially and temporally distributed input to the channels was produced by GEOtop, i.e. the GEOMODEL produced patterns of spatially distributed soil moisture, and these patterns were used to reduce the potential evapotranspiration to its real counterpart, as described in Bertoldi et al. [2002].

Second Part

Notes

^1 - MIT's group, on the other side, had a long tradition in this direction of modeling, which originated from Pete Eagleson's work in the late seventies and eighties. Dara's talk in fact inherits a lot from that work. Rafael Bras, Dara Entekhabi, Valeriy Ivanov, Enrique Vivoni and others themselves implemented a model similar, in concept, to GEOtop: tRibs.

^2 - Today GEOtop mostly uses Xavier Corripio's schemes.

^3 - We still use kriging, but also Glen Liston's MicroMet [Liston and Elder, 2006] and MeteoIO.

^4 - Various possibilities are available now, and they derive mainly from Stefano Endrizzi's Ph.D. thesis.

^5 - But this is in many respects debatable, since this statement refers to the closure of the basin, and going back from this to forecasting the internal distribution of discharge is impossible. Keith Beven wrote a lot on this.

^6 - Research subsequent to TOPMODEL, including ours (e.g. Lanni et al., 2011, 2012 for a review of problems and literature, and Cordano and Rigon, 2008 for a top-down derivation of the TOPMODEL-related equations from Richards' equation), clarified many of the limitations behind the TOPMODEL approach.