Copyright Notice:

The documents distributed by this server have been provided by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a noncommercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Publications of SPCL

L. Huang, T. Hoefler:

 Compressing multidimensional weather and climate data into neural networks

(In The Eleventh International Conference on Learning Representations, May 2023)
Notable Top 5% (Oral)

Abstract

Weather and climate simulations produce petabytes of high-resolution data that are later analyzed by researchers in order to understand climate change or severe weather. We propose a new method of compressing this multidimensional weather and climate data: a coordinate-based neural network is trained to overfit the data, and the resulting parameters are taken as a compact representation of the original grid-based data. While compression ratios range from 300x to more than 3,000x, our method outperforms the state-of-the-art compressor SZ3 in terms of weighted RMSE and MAE. It faithfully preserves important large-scale atmospheric structures and does not introduce artifacts. When the resulting neural network is used as a 790x compressed dataloader to train the WeatherBench forecasting model, its RMSE increases by less than 2%. This compression by three orders of magnitude democratizes access to high-resolution climate data and enables numerous new research directions.
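
The core idea lends itself to a compact illustration. The following is a minimal, hypothetical PyTorch sketch (not the authors' implementation): a small coordinate-based MLP with Fourier-feature inputs is deliberately overfit to a synthetic (time, lat, lon) field, its weights then serve as the lossy compressed representation, and decompression amounts to querying the network at arbitrary coordinates. The grid size, encoding, network width, and training settings are illustrative assumptions.

# Hypothetical sketch (not the authors' code): a coordinate-based MLP is
# deliberately overfit to a synthetic (time, lat, lon) field; its weights then
# act as the lossy compressed representation, and "decompression" is simply
# querying the network at arbitrary coordinates. Grid size, Fourier-feature
# encoding, network width, and training settings are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic stand-in for a gridded climate field on a (time, lat, lon) grid.
T, H, W = 64, 32, 64
t = torch.linspace(0, 1, T)
lat = torch.linspace(-1, 1, H)
lon = torch.linspace(-1, 1, W)
grid = torch.stack(torch.meshgrid(t, lat, lon, indexing="ij"), dim=-1)  # (T,H,W,3)
coords = grid.reshape(-1, 3)
values = (torch.sin(4 * coords[:, 1]) * torch.cos(6 * coords[:, 2])
          + 0.1 * coords[:, 0]).unsqueeze(-1)                 # (N, 1) target field

# Fourier-feature encoding of the coordinates (a common choice for such networks).
num_freqs = 8
freqs = 2.0 ** torch.arange(num_freqs)

def encode(x):                                                # (N,3) -> (N, 3*2*F)
    proj = x.unsqueeze(-1) * freqs                            # (N,3,F)
    return torch.cat([torch.sin(proj), torch.cos(proj)], dim=-1).flatten(1)

model = nn.Sequential(
    nn.Linear(3 * 2 * num_freqs, 64), nn.GELU(),
    nn.Linear(64, 64), nn.GELU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
feats = encode(coords)

# Deliberately overfit: the trained weights become the "compressed" dataset.
for step in range(1000):
    loss = torch.mean((model(feats) - values) ** 2)
    opt.zero_grad()
    loss.backward()
    opt.step()

n_params = sum(p.numel() for p in model.parameters())
print(f"MSE {loss.item():.2e}, ratio {values.numel() / n_params:.1f}x")

# Decompression: query the network at any coordinate, even between grid points.
with torch.no_grad():
    sample = model(encode(torch.tensor([[0.5, 0.0, 0.25]])))

In this toy setting the grid and network sizes yield only a modest parameter-count ratio; the 300x to 3,000x figures quoted in the abstract refer to the high-resolution datasets and model configurations reported in the paper.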

Documents

download article:
access preprint on arXiv:
download slides:


Recorded talk (best effort)


BibTeX

@inproceedings{huang2023compressing,
  author={Langwen Huang and Torsten Hoefler},
  title={{Compressing multidimensional weather and climate data into neural networks}},
  year={2023},
  month={05},
  booktitle={The Eleventh International Conference on Learning Representations},
  doi={10.48550/arXiv.2210.12538},
}