Use of neural network as a support tool in water level forecasting and issuing flash floods early warnings to three small Brazilian urban watersheds
Type
Article
Abstract
One of the actions to mitigate the impacts of hydrological extremes is to issue warnings as far in advance as possible. This article reports the application of neural networks to water level forecasting in three small watersheds in Brazil that are susceptible to flash floods. First, the physical characteristics and the land use and land cover maps of the watersheds were surveyed. Next, Multilayer Perceptrons were trained on observed water level and rainfall data covering the period from 2014 to 2022 to forecast water levels 1, 2, and 3 h in advance. To design the neural networks, different combinations of activation functions in the hidden and output layers were tested, along with variations in the number of neurons in the hidden layer. The neural network forecasts on the test data of the three watersheds were quite good for all three forecast horizons; notably, the 3 h forecasts reached a Nash–Sutcliffe index greater than 0.9. In future work, the neural networks will be trained with rainfall estimates obtained from numerical weather forecast model data together with observed rainfall data, enabling their operational use in the situation room of the National Center for Monitoring and Early Warning of Natural Disasters. The operational neural models can semi-automate the flash flood warning process for the studied watersheds. As a result, the effectiveness of the warnings, with respect to the trade-off between lead time and assertiveness, is expected to improve.
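The abstract describes training Multilayer Perceptrons on past rainfall and water level observations to predict the level a few hours ahead, evaluated with the Nash–Sutcliffe index. A minimal sketch of the two generic ingredients — building lagged input/target pairs and computing the Nash–Sutcliffe efficiency — is shown below; the function names, lag count, and forecast horizon are illustrative assumptions, not the authors' exact setup:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    # NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def make_lagged_features(rain, level, n_lags=3, horizon=3):
    # Inputs: the last n_lags rainfall and level readings (hypothetical choice);
    # target: the water level `horizon` steps ahead.
    X, y = [], []
    for t in range(n_lags, len(level) - horizon):
        X.append(np.concatenate([rain[t - n_lags:t], level[t - n_lags:t]]))
        y.append(level[t + horizon])
    return np.array(X), np.array(y)

# Toy demo with synthetic data (not the watersheds' records).
rng = np.random.default_rng(0)
rain = rng.random(50)
level = np.cumsum(rain) * 0.1          # crude stand-in for a rising stage series
X, y = make_lagged_features(rain, level)
print(X.shape, y.shape)                # (44, 6) (44,)
print(nash_sutcliffe(level, level))    # 1.0 for a perfect forecast
```

The resulting `X`, `y` pairs could then be fed to any regressor (e.g. an MLP with the hidden-layer sizes and activation functions the article tests).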
Keywords
Empirical hydrological modeling, Flash floods early warnings, Neural networks, Water level forecast
Language
English
Citation
Earth Science Informatics, v. 16, n. 4, p. 4313-4326, 2023.