
[Paper] GenCast Diffusion model for weather forecasting #80

Open
jacobbieker opened this issue Dec 28, 2023 · 9 comments
Labels
enhancement New feature or request help wanted Extra attention is needed

Comments

@jacobbieker
Member

Arxiv/Blog/Paper Link

https://arxiv.org/abs/2312.15796

Detailed Description

A diffusion-based approach to weather forecasting that is quite stable autoregressively. It probably fits in more with diffusion_weather, but it is filed here to be with the other weather papers. They also say the model is based on GraphCast, but with different graph connectivity and a sparse transformer replacing GraphCast's Processor GNN.
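As a rough sketch of how the diffusion side could work (all names are placeholders: the real GenCast denoiser is a graph-based sparse transformer, not the toy MLP below), sampling a forecast means starting from noise and iteratively denoising, conditioned on the two previous atmospheric states:

```python
import torch
from torch import nn


class ConditionalDenoiser(nn.Module):
    """Toy stand-in for GenCast's graph-transformer denoiser.

    Takes the noisy target state `z`, two conditioning states
    (previous and current atmosphere), and the noise level `sigma`.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 * channels + 1, 128), nn.SiLU(), nn.Linear(128, channels)
        )

    def forward(self, z, cond_prev, cond_curr, sigma):
        sigma_feat = sigma.expand(z.shape[0], 1)  # broadcast noise level per node
        return self.net(torch.cat([z, cond_prev, cond_curr, sigma_feat], dim=-1))


@torch.no_grad()
def sample_next_state(denoiser, cond_prev, cond_curr, sigmas):
    """Euler-style sampling over a decreasing noise schedule `sigmas`.

    Each call with fresh noise yields one ensemble member for the next
    (12 h-ahead) state; repeating the whole loop autoregressively rolls
    out a trajectory.
    """
    z = torch.randn_like(cond_curr) * sigmas[0]  # start from pure noise
    for hi, lo in zip(sigmas[:-1], sigmas[1:]):
        denoised = denoiser(z, cond_prev, cond_curr, hi)
        d = (z - denoised) / hi  # derivative estimate at noise level `hi`
        z = z + d * (lo - hi)    # Euler step toward the lower noise level
    return z
```

The per-member sampling loop is why inference is much slower than a single deterministic GraphCast forward pass, as discussed below.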

Context

A cool, scalable way of doing ensemble predictions, though only at 1 degree resolution and with 12-hour timesteps, so coarser than GraphCast and the like for some reason?

@jacobbieker jacobbieker added the enhancement New feature or request label Dec 28, 2023
@jacobbieker
Member Author

The encoder and decoder are the same as in GraphCast, but the latent grid is a 5-refined icosahedral mesh, not a multi-mesh, with 10242 nodes and 61440 edges.
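Those counts check out against the standard icosahedron refinement arithmetic (each refinement splits every triangular face into 4, and message-passing GNNs usually store each undirected edge in both directions):

```python
def refined_icosahedron_counts(r: int) -> tuple[int, int]:
    """Node and directed-edge counts of an r-times refined icosahedron.

    The base icosahedron has 30 edges and 20 faces; refinement multiplies
    both by 4 per level. Euler's formula V - E + F = 2 then gives
    V = 30*4**r - 20*4**r + 2 = 10*4**r + 2. Directed edges are 2x the
    undirected count, since each edge is stored in both directions.
    """
    undirected_edges = 30 * 4**r
    nodes = undirected_edges - 20 * 4**r + 2  # Euler: V = E - F + 2
    return nodes, 2 * undirected_edges


print(*refined_icosahedron_counts(5))  # prints: 10242 61440
```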


@jacobbieker
Member Author

I think this adds more support for modularizing graph_weather, as is being done in #76, so that it is easier to experiment with and replicate this kind of result.

@jacobbieker
Member Author

They train on the 12-hour timestep so that consecutive states fall in different data assimilation windows, as ERA5 only has two per day.

@jacobbieker
Member Author

Overall, really impressive results, I think, and an interesting combination of graph and diffusion models. It is a lot slower to run and train because of how diffusion models work, and it still requires an NWP analysis field for initialization. Compared to a few seconds for a 0.25 degree forecast with GraphCast, about 1 minute per forecast with GenCast at 1 degree is a lot slower. But the better results, and the fact that it is still much faster than traditional methods, make it quite interesting.

@jacobbieker jacobbieker added good first issue Good for newcomers help wanted Extra attention is needed labels Dec 28, 2023
@jacobbieker
Member Author

Would be really keen to implement this here.

@aavashsubedi
Contributor

Seems like an interesting project. Is this open as a GSoC project? And if so, what would the scope/length be? (I don't see GraphCast in the repo, so I imagine it would also need to be ported over from DM?)

@jacobbieker
Member Author

Yes, this could work as a GSoC project; it would be a large one (350h). GraphCast wouldn't need to be ported over: we already have the encoder/decoder graph networks implemented, although they can definitely be improved! It would mainly be the diffusion model that needs to be added, plus some changes to make the code more modular so that we can easily swap out encoders, processors, and decoders.
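One possible shape for that modularity (class and argument names here are hypothetical, not the current graph_weather API): keep the encode-process-decode split explicit so GraphCast's GNN processor and a GenCast-style sparse-transformer denoiser are drop-in alternatives behind the same interface:

```python
from typing import Protocol

import torch
from torch import nn


class Processor(Protocol):
    """Any latent-mesh module (message-passing GNN, sparse transformer, ...)
    mapping mesh-node features to mesh-node features of the same shape."""

    def __call__(self, mesh_feats: torch.Tensor) -> torch.Tensor: ...


class EncodeProcessDecode(nn.Module):
    """Composable forecast model: grid-to-mesh encoder, swappable
    processor, mesh-to-grid decoder."""

    def __init__(self, encoder: nn.Module, processor: nn.Module, decoder: nn.Module):
        super().__init__()
        self.encoder = encoder
        self.processor = processor
        self.decoder = decoder

    def forward(self, grid_feats: torch.Tensor) -> torch.Tensor:
        mesh = self.encoder(grid_feats)   # grid nodes -> latent mesh
        mesh = self.processor(mesh)       # the swappable part
        return self.decoder(mesh)         # latent mesh -> grid nodes
```

With this split, replicating GenCast becomes "reuse the existing encoder/decoder, plug in a new processor and a diffusion training/sampling loop" rather than a rewrite.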

@jacobbieker
Member Author

They have now released an updated paper, increasing the resolution to 0.25 degrees and adding more comparisons against ENS, which they claim to beat 97 percent of the time.

@gbruno16
Contributor

This is very cool! Also, it seems you don't need to restart training from scratch to go from 1 degree to 0.25: a few small modifications and some fine-tuning are enough, similar to graph_weather!

@Sukh-P Sukh-P removed the good first issue Good for newcomers label May 28, 2024