Journal article, Hydrology and Earth System Sciences, 2022

How can we benefit from regime information to make more effective use of long short-term memory (LSTM) runoff models?

Abstract

To date, long short-term memory (LSTM) networks have been successfully applied to a key problem in hydrology: the prediction of runoff. Unlike traditional conceptual models, LSTM models do not require our knowledge of hydrology to be formally encoded into the model. The question, then, is how we can still make use of our domain knowledge and traditional practices, not to build the LSTM models themselves, as we do for conceptual models, but to use them more effectively. In the present paper, we adopt this approach, investigating how information on the hydrologic characteristics of catchments can be used for LSTM runoff models. In this first application of LSTMs in a French context, we use 361 gauged catchments with very diverse hydrologic conditions from across France, each with a time series of at least 30 years. Our main directions of investigation are (a) the relationship between LSTM performance and the length of the LSTM input sequence within different hydrologic regimes, (b) the importance of the hydrologic homogeneity of catchments when training LSTMs on a group of catchments, and (c) the interconnected influence of the local tuning of two important LSTM hyperparameters, namely the length of the input sequence and the hidden unit size, on the performance of group-trained LSTMs. We present a classification built on three indices taken from the runoff, precipitation, and temperature regimes, and use it as our measure of homogeneity: catchments within the same regime are assumed to be hydrologically homogeneous. We train LSTMs on individual catchments (local-level training), on catchments within the same regime (regime-level training), and on the entire sample (national-level training). We benchmark the local LSTMs against the GR4J conceptual model, which is able to represent water gains/losses in a catchment.
We show that LSTM performance is most sensitive to the length of the input sequence in the Uniform and Nival regimes, where the dominant hydrologic process of the regime has clear long-term dynamics; long input sequences should therefore be chosen in these cases. Other regimes show less sensitivity, and some show almost none, so the input sequence in those regimes does not need to be long. Overall, our homogeneous regime-level training slightly outperforms our heterogeneous national-level training. This suggests that both levels of training provide adequate data relative to the complexity of the representation(s) to be learned. We do not, however, exclude a potential role of the regime-informed property of our national LSTMs, which use the classification variables as static attributes. Finally, we demonstrate that locally selecting the two important LSTM hyperparameters (the length of the input sequence and the hidden unit size), combined with national-level training, can lead to the best runoff prediction performance.
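To make the role of the input-sequence-length hyperparameter concrete, the sketch below shows how daily forcing data (e.g., precipitation and temperature) are typically windowed into fixed-length sequences for a many-to-one LSTM runoff setup: the seq_len days of forcings ending at day t are used to predict runoff at day t. This is a minimal, hypothetical illustration of the general practice, not the authors' code; all names and the toy data are assumptions.

```python
import numpy as np

def make_sequences(forcings, runoff, seq_len):
    """Build (sample, time, feature) windows of length seq_len.

    Each sample pairs the seq_len days of forcings ending at day t
    with the runoff observed at day t (many-to-one setup).
    """
    X, y = [], []
    for t in range(seq_len - 1, len(runoff)):
        X.append(forcings[t - seq_len + 1 : t + 1])
        y.append(runoff[t])
    return np.stack(X), np.array(y)

# Toy example: 100 days, 2 forcing variables (precipitation, temperature).
rng = np.random.default_rng(0)
forcings = rng.random((100, 2))
runoff = rng.random(100)

X, y = make_sequences(forcings, runoff, seq_len=30)
print(X.shape, y.shape)  # (71, 30, 2) (71,)
```

A longer seq_len lets the network see further back in time (useful where snow accumulation or groundwater storage matters, as in the Nival and Uniform regimes) at the cost of fewer training samples and more computation per sample.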

Dates and versions

insu-03869584, version 1 (24 Nov 2022)

License

Attribution (CC BY)

Cite

Reyhaneh Hashemi, Pierre Brigode, Pierre-André Garambois, Pierre Javelle. How can we benefit from regime information to make more effective use of long short-term memory (LSTM) runoff models? Hydrology and Earth System Sciences, 2022, 26, pp. 5793-5816. ⟨10.5194/hess-26-5793-2022⟩. ⟨insu-03869584⟩