[Poster] Mathematical Properties of Continuous Ranked Probability Score Forecasting

Date:

Poster of a work conducted with Clément Dombry, Philippe Naveau and Maxime Taillardat. The article associated with this work can be found here.

Abstract: The theoretical advances on the properties of scoring rules over the past decades have broadened the use of scoring rules in probabilistic forecasting. In meteorological forecasting, statistical postprocessing techniques are essential to improve the forecasts made by deterministic physical models. Numerous state-of-the-art statistical postprocessing techniques are based on distributional regression evaluated with the Continuous Ranked Probability Score (CRPS). However, the theoretical properties of such CRPS minimization have mostly been studied in the unconditional framework (i.e., without covariates) and with infinite sample sizes. We circumvent these limitations and study the rate of convergence, in terms of CRPS, of distributional regression methods. We find the optimal minimax rate of convergence for a given class of distributions. Moreover, we show that the nearest neighbor method and the kernel method for distributional regression reach the optimal rate of convergence in dimension larger than 2 and in any dimension, respectively.
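To make the abstract concrete, here is a minimal sketch of the two ingredients it mentions: the CRPS of an empirical (ensemble) forecast, computed via its standard energy form CRPS(F, y) = E|X - y| - ½ E|X - X'|, and a nearest-neighbor distributional regression that forecasts the empirical distribution of the responses of the k closest covariates. The function names, the choice of k, and the synthetic data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def crps_ensemble(ens, y):
    """CRPS of an empirical forecast given by the ensemble `ens`, evaluated
    at the observation y, using the energy form:
    CRPS = mean_i |x_i - y| - 0.5 * mean_{i,j} |x_i - x_j|."""
    ens = np.asarray(ens, dtype=float)
    term1 = np.abs(ens - y).mean()
    term2 = 0.5 * np.abs(ens[:, None] - ens[None, :]).mean()
    return term1 - term2

def knn_distributional_forecast(X_train, y_train, x, k=5):
    """Nearest-neighbor distributional regression (illustrative sketch):
    the forecast at covariate x is the empirical distribution of the
    responses attached to the k nearest training covariates."""
    dists = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(dists)[:k]
    return y_train[idx]

# Illustrative use on synthetic data: Y | X ~ N(X1, 1).
rng = np.random.default_rng(0)
X_train = rng.uniform(size=(500, 2))
y_train = X_train[:, 0] + rng.normal(size=500)
x_new = np.array([0.5, 0.5])
forecast = knn_distributional_forecast(X_train, y_train, x_new, k=20)
score = crps_ensemble(forecast, y_obs := 0.5 + rng.normal())
```

A point forecast equal to the observation has CRPS 0, and for a degenerate (single-member) ensemble the CRPS reduces to the absolute error, which is a quick sanity check for the implementation.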

This poster was presented in Session NP5.1 at EGU 2023.

Presentation slides: Download
Associated article: Distributional regression and its evaluation with the CRPS: Bounds and convergence of the minimax risk, Pic et al. (2022) link