Rainfall Prediction Using LSTM Deep Learning Model

This project was created as the final project for the OR610 Deep Learning course. Its purpose is to build a Long Short-Term Memory (LSTM) deep learning model in PyTorch that predicts future rainfall events. The initial version was developed in the Google Colab environment and was later modified to run in a local environment. Details of the project are included in the report located in /Report.

Abstract

To bridge the gap between research on short-term rainfall prediction (up to 14 days) and long-term prediction without losing the temporal information of precipitation events, this paper proposes using LSTM models to predict day-by-day rainfall amounts in millimeters over a 30-day future period, using data gathered from 2007 to 2017 at 21 stations across Australia. Two model variants were created: the Daily Iterative model, which uses a prior sequence to predict one day ahead and can be applied iteratively to produce long-term forecasts, and the Single Prediction model, which outputs the full 30-day future rainfall sequence at once. Comparison against a zero-prediction baseline and a randomly weighted model showed that both variants capture some patterns linking the historical record to future rainfall amounts. The Daily Iterative model tends to overpredict rainfall amounts but underpredict the occurrence of rainfall events. In contrast, the Single Prediction model makes many small rainfall predictions, even on days when no rainfall occurred. Both models exhibit limitations that could be addressed with more advanced architectures such as modified LSTMs or Transformers.
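As a rough illustration of how the two variants could be set up in PyTorch, the sketch below defines a single-layer LSTM regressor whose output head is sized either for one day ahead (Daily Iterative) or for the full 30-day horizon (Single Prediction), plus an iterative rollout helper. The class and function names, hidden size, and the single-feature feedback in the rollout are illustrative assumptions, not the configuration used in the report.

```python
import torch
import torch.nn as nn

class RainfallLSTM(nn.Module):
    """Minimal sketch of an LSTM regressor for daily rainfall (mm).

    horizon=1 mirrors the Daily Iterative variant (one day ahead, rolled
    out iteratively); horizon=30 mirrors the Single Prediction variant
    (the full 30-day sequence in one forward pass). Layer sizes are
    illustrative assumptions, not the report's values.
    """

    def __init__(self, n_features: int, hidden_size: int = 64, horizon: int = 1):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) -> rainfall: (batch, horizon)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])


def iterative_forecast(model: RainfallLSTM, history: torch.Tensor, days: int = 30) -> torch.Tensor:
    """Roll a 1-day-ahead model forward `days` times (Daily Iterative idea).

    For illustration only: the predicted rainfall is fed back as the sole
    input feature, so `history` is assumed to be (batch, seq_len, 1).
    """
    window = history.clone()
    preds = []
    with torch.no_grad():
        for _ in range(days):
            next_day = model(window)                                        # (batch, 1)
            preds.append(next_day)
            # Slide the window: drop the oldest day, append the prediction.
            window = torch.cat([window[:, 1:, :], next_day.unsqueeze(-1)], dim=1)
    return torch.cat(preds, dim=1)                                          # (batch, days)
```

In this sketch, `RainfallLSTM(n_features=1, horizon=30)` would correspond to the Single Prediction setup, while `horizon=1` combined with `iterative_forecast` mirrors the Daily Iterative approach.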
