CasCast: Skillful High-resolution Precipitation Nowcasting via Cascaded Modelling (ICML 2024)
Official PyTorch Implementation

CasCast sample

This repo contains PyTorch model definitions, pre-trained weights, and training/sampling code for our paper, which decouples precipitation nowcasting into a deterministic part and a probabilistic part (a diffusion model).

CasCast: Skillful High-resolution Precipitation Nowcasting via Cascaded Modelling
Junchao Gong, Lei Bai, et al.
Shanghai Jiao Tong University, Shanghai AI Laboratory

Two key challenges of precipitation nowcasting are not yet well solved: (i) modeling the evolution of complex precipitation systems across different scales, and (ii) accurately forecasting extreme precipitation. We propose CasCast, a cascaded framework composed of a deterministic part and a probabilistic part that decouples the prediction of mesoscale precipitation distributions from that of small-scale patterns. We train a deterministic model in pixel space and a DiT-based probabilistic model in latent space to model precipitation system evolutions at these different scales.

Cascaded pipeline


Setup

First, download and set up the repo:

git clone https://github.com/OpenEarthLab/CasCast.git
cd CasCast

Then set up the environment given below:

Python 3.9.17
torch==2.0.1+cu118
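One way to reproduce this environment is a fresh conda env. This is a sketch, not from the repo: the env name `cascast` and the CUDA 11.8 wheel index URL are assumptions.

```shell
# Pin the versions stated above in a requirements file:
cat > requirements.txt <<'EOF'
torch==2.0.1+cu118
EOF
# Then (assumed workflow, run manually):
# conda create -n cascast python=3.9.17 -y && conda activate cascast
# pip install -r requirements.txt --extra-index-url https://download.pytorch.org/whl/cu118
cat requirements.txt
```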

Inferencing

You can run inference with our pretrained probabilistic model on our preprocessed deterministic predictions. First, download the pretrained probabilistic model to cascast/experiments/cascast_diffusion/world_size1-ckpt/checkpoint_best.pth. Then, download the preprocessed predictions (compressed into latent space) to latent_data and unzip them. Finally, download the autoencoder checkpoint to ckpts/autoencoder/ckpt.pth.
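The three downloads above should end up in the layout sketched below. The directory names come from the paths in this README; the .pth files themselves must still be downloaded into them.

```shell
# Create the directories the checkpoints are expected to live in:
mkdir -p cascast/experiments/cascast_diffusion/world_size1-ckpt
mkdir -p latent_data
mkdir -p ckpts/autoencoder
# checkpoint_best.pth -> cascast/experiments/cascast_diffusion/world_size1-ckpt/
# ckpt.pth            -> ckpts/autoencoder/
find cascast ckpts -type d
```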

bash ./scripts/eval_diffusion_infer.sh

Training

Step 1. Train the deterministic part

Preprocess the SEVIR dataset

Set data_dir in configs/sevir_used/EarthFormer.yaml to the SEVIR path you use, such as pixel_data/sevir. Train the deterministic model EarthFormer with
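The config edit can also be scripted, e.g. with GNU sed. Shown here on a stand-in file rather than the real config: the key name data_dir is from this README, the demo filename is ours.

```shell
# Demonstrate the one-line config edit on a stand-in YAML file:
printf 'data_dir: /path/to/change\n' > EarthFormer.demo.yaml
sed -i 's|^data_dir:.*|data_dir: pixel_data/sevir|' EarthFormer.demo.yaml
cat EarthFormer.demo.yaml   # prints: data_dir: pixel_data/sevir
```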

bash ./scripts/train_deterministic.sh

Evaluate the deterministic model EarthFormer with

bash ./scripts/eval_deterministic.sh

Step 2. Train the autoencoder part

We use an autoencoder to compress samples into latent space. Train the autoencoder with

bash ./scripts/train_autoencoder.sh

Step 3. Train the probabilistic part

Preprocess samples and predictions of the deterministic part

To speed up the training pipeline of the probabilistic part of CasCast, we compress the ground truth of the future radar echoes and the predictions of the deterministic part into latent space in advance.

Compress the ground truth with

bash ./scripts/compress_gt.sh

Compress the predictions of EarthFormer (the deterministic part) with

bash ./scripts/compress_earthformer.sh

The compressed data are saved in latent_data/sevir_latent/48x48x4.

Train the diffusion model in latent space

Set data_dir, latent_gt_dir, and latent_deterministic_dir in configs/sevir_used/cascast_diffusion.yaml. data_dir is the SEVIR path used when training the deterministic part. latent_gt_dir is the directory where the compressed ground truth is saved, and latent_deterministic_dir is the directory where the compressed predictions are saved.
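A stand-in illustration of the three keys expected in configs/sevir_used/cascast_diffusion.yaml. The key names are from this README; the values are examples only (in particular, latent_deterministic_dir depends on where your compressed EarthFormer predictions actually landed).

```shell
# Write an example config fragment with the three required keys
# (paths are illustrative, not authoritative):
cat > cascast_diffusion.demo.yaml <<'EOF'
data_dir: pixel_data/sevir
latent_gt_dir: latent_data/sevir_latent/48x48x4
latent_deterministic_dir: latent_data/sevir_latent/48x48x4
EOF
grep -c ':' cascast_diffusion.demo.yaml   # 3 keys
```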

bash ./scripts/train_diffusion.sh

Evaluation

Evaluating the deterministic model

bash ./scripts/eval_deterministic.sh

Evaluating the diffusion model

bash ./scripts/eval_diffusion.sh

BibTeX

@article{gong2024cascast,
  title={CasCast: Skillful High-resolution Precipitation Nowcasting via Cascaded Modelling},
  author={Gong, Junchao and Bai, Lei and Ye, Peng and Xu, Wanghan and Liu, Na and Dai, Jianhua and Yang, Xiaokang and Ouyang, Wanli},
  journal={arXiv preprint arXiv:2402.04290},
  year={2024}
}