DLWP: Deep Learning Weather Prediction

Overview

DLWP is a Python project containing data-processing and model-building tools for predicting the gridded atmosphere using deep convolutional neural networks.

Reference

If you use this code or find it useful, please cite our publication!

Getting started

For now, DLWP is not a package that can be installed using pip or a setup.py file, so it works like most research code: download (or check out) the repository and run.

Required dependencies

It is assumed that the following dependencies are installed using Anaconda Python 3 (Python 2.7 is also supported).

  • TensorFlow (GPU capable version highly recommended). The conda package, while not the recommended installation method, is easy and also installs the required CUDA dependencies. For best performance, follow the instructions for installing from source.
    conda install tensorflow-gpu
  • Keras
    pip install keras
  • netCDF4
    conda install netCDF4
  • xarray
    conda install dask xarray

Optional dependencies

The following are required only for some of the DLWP features:

  • PyTorch: for torch-based deep learning models. Again, the GPU-ready version is recommended.
    pip install torch torchvision
  • scikit-learn: for machine learning pre-processing tools such as Scalers and Imputers
    conda install scikit-learn
  • scipy: for CFS data interpolation
  • pygrib: for raw CFS data processing
    pip install pygrib
  • cdsapi: for retrieval of ERA5 data
    pip install cdsapi
  • pyspharm: spherical harmonics transforms for the barotropic model
    conda install -c conda-forge pyspharm

Quick overview

General framework

DLWP is built as a weather forecasting model that can, should its performance improve sufficiently, "replace" an existing global weather or climate model. Essentially, this means that DLWP uses a deep convolutional neural network to map the state of the atmosphere at one time to the entire state of the atmosphere at the next available time. A continuous forecast can then be made by feeding the model's predicted state back in as input, producing forecasts of indefinite length.
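
To make the iterative forecasting idea concrete, the sketch below rolls a one-step model forward by repeatedly feeding its own output back in. It is a minimal conceptual illustration, not DLWP's predict_timeseries implementation; it only assumes a trained model with a Keras-style predict method whose input and output grids share the same shape.

    import numpy as np

    def iterative_forecast(model, initial_state, n_steps):
        # initial_state has shape (1, ...): a single sample of the current state.
        states = [initial_state]
        for _ in range(n_steps):
            # The predicted state at one time becomes the input for the next step.
            states.append(model.predict(states[-1]))
        # Drop the initial condition and stack the forecast steps along the time axis.
        return np.concatenate(states[1:], axis=0)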

Data processing

The classes in DLWP.data provide tools for retrieving and processing raw data from the CFS reanalysis and reforecast and the ERA5 reanalysis. Meanwhile, the DLWP.model.preprocessing module provides tools for formatting the data for ingestion into the deep learning models. The following examples retrieve and process data from the CFS reanalysis:

  • examples/write_cfs.py
  • examples/write_cfs_predictors.py

The resulting file of predictor data can be ingested into the data generators for the models.
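
The predictor file is an ordinary netCDF file, so it can be inspected with xarray before training. The file name below is only a placeholder for whatever examples/write_cfs_predictors.py produced.

    import xarray as xr

    # Placeholder path; use the file written by examples/write_cfs_predictors.py.
    ds = xr.open_dataset('cfs_predictors.nc')
    print(ds)                      # dimensions, coordinates, and data variables
    print(ds['predictors'].shape)  # the array the DLWP data generators will read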

Keras models

The DLWP.model module contains classes for building and training Keras and PyTorch models. The DLWPNeuralNet class is essentially a wrapper for the simple Keras Sequential model, adding optional run-time scaling and imputing of data. It implements a few key methods:

  • build_model: use a custom API to assemble layers in a Sequential model. Also implements models running on multiple GPUs.
  • fit: scale the data and fit the model
  • fit_generator: use the Keras fit_generator method along with a custom data generator (see section below)
  • predict: predict with the model
  • predict_timeseries: predict a continuous time series forecast, where the output of one prediction iteration is used as the input for the next

For an example of a model built and trained with the DLWP APIs using data generated by the DLWP processing methods, see examples/train.py.
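
For orientation, the kind of network that build_model assembles is a standard Keras Sequential CNN mapping one gridded atmospheric state to the next. The toy example below is self-contained Keras, not DLWP's actual configuration; the grid size, channel count, and layer choices are arbitrary placeholders (see examples/train.py for the real layer specification).

    from keras.models import Sequential
    from keras.layers import Conv2D

    n_lat, n_lon, n_channels = 73, 144, 2   # placeholder grid size and variable count

    model = Sequential([
        Conv2D(32, (3, 3), padding='same', activation='relu',
               input_shape=(n_lat, n_lon, n_channels)),
        Conv2D(32, (3, 3), padding='same', activation='relu'),
        # The output layer maps back to the full atmospheric state at the next step.
        Conv2D(n_channels, (3, 3), padding='same'),
    ])
    model.compile(optimizer='adam', loss='mse')
    model.summary()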

DLWP also implements a DLWPFunctional class which implements the same methods as the DLWPNeuralNet class but takes as input to build_model a model assembled using the Keras functional API. For an example of training a functional model, see examples/train_functional.py.

PyTorch models

Currently, due to a focus on TensorFlow/Keras models, the PyTorch implementation in DLWP is more limited, although still robust. Like the Keras models, it implements a convenient build_model method to assemble a sequential-like model using the same API parameters as those for DLWPNeuralNet. It also implements a fit method to automatically iterate through the data and optimizer, just like the Keras API.

The PyTorch example, train_torch.py, is somewhat outdated and uses the spherical convolution library s2cnn. This method has yet to produce good results.

Custom layers and functions

The DLWP.custom module contains many custom layers specifically for applying convolutional neural networks to the global weather prediction problem. For example, PeriodicPadding2D implements periodic boundary conditions for padding data in space prior to applying convolutions. These custom layers are worth a look.
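
As a conceptual analogue of what PeriodicPadding2D does (not its implementation), wrap-around padding in the longitude dimension can be illustrated with NumPy: the 0/360-degree seam is treated as continuous so convolutions do not see an artificial boundary there.

    import numpy as np

    # A tiny 3 x 4 "lat x lon" field.
    field = np.arange(12).reshape(3, 4)

    # Pad one cell on each side of the longitude axis by wrapping around the globe.
    padded = np.pad(field, pad_width=((0, 0), (1, 1)), mode='wrap')
    print(padded)  # first column equals the original last column, and vice versa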

Data generators

DLWP.model.generators contains several classes for generating data on-the-fly from a netCDF file produced by the DLWP preprocessing methods. These data generators can then be used in conjunction with a DLWP model instance's fit_generator method.

  • The DataGenerator class is the simplest generator class. It merely returns batches of data from a file containing "predictors" and "targets" variables already formatted for use in the DLWP model. Due to this simplicity, it is the optimal way to generate data directly from disk when system memory is not sufficient to load the entire dataset. However, this comes at the cost of very large files on disk containing redundant data, since the targets are merely a time-shifted copy of the predictors (see the time-shift sketch after this list).
  • The SeriesDataGenerator class is much more robust and memory efficient. It expects only a single "predictors" variable in the input file and generates predictor-target pairs on the fly for each batch of data. It also has the ability to prescribe external fields such as incoming solar radiation.
  • The SmartDataGenerator is deprecated in favor of SeriesDataGenerator.
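
The time-shift relationship that makes the series-style generators memory efficient is easy to see in plain NumPy. This is only an illustration of the idea, with made-up array sizes, not the generator code itself.

    import numpy as np

    # Consecutive atmospheric states, one per time step: (time, lat, lon, channels).
    predictors = np.random.rand(100, 73, 144, 2)   # made-up sizes

    time_shift = 1  # forecast one step ahead

    # What DataGenerator stores twice on disk, a series generator derives on the fly:
    # the targets are just the predictors shifted forward in time.
    inputs = predictors[:-time_shift]
    targets = predictors[time_shift:]
    assert inputs.shape == targets.shape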

Advanced forecast tools

The DLWP.model module also contains a TimeSeriesEstimator class. This class can be used to make robust forward forecasts where the data input does not necessarily match the data output of a model. An example usage of this class is in examples/validate.py, which performs basic routines to validate the forecast skill of DLWP models.

Other

The DLWP.util module contains useful utilities, including save_model and load_model for saving and loading DLWP models (and correctly dealing with multi-GPU models).
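
A brief sketch of the save/load round trip: the function names come from this README, while the exact call signatures shown here (a model instance plus a file path) are assumptions, so check DLWP.util for the actual arguments.

    # Function names are from this README; the signatures below are assumptions.
    from DLWP.util import save_model, load_model

    save_model(dlwp, 'my_dlwp_model')      # 'dlwp' is a trained DLWP model instance
    dlwp_reloaded = load_model('my_dlwp_model')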
