53e Journées de Statistique

Date:

Talk in French at the 53e Journées de Statistique de la SFdS, held in Lyon, France.

Title. Minimax Rate of Convergence for Distributional Regression using the Continuous Ranked Probability Score

Abstract. Distributional regression fulfills a fundamental need of statistical analysis: making forecasts while quantifying their uncertainty. It overcomes the limits of classical regression, which estimates only the conditional mean, by estimating the whole conditional distribution. This methodology, called probabilistic forecasting, is widely used in fields such as meteorology and energy production, but its theoretical aspects have received little study. By analogy with the classical theory of statistical learning, we define a framework in which the predictor is a probability distribution, called the predictive distribution, and in which the loss function is given by a strictly proper scoring rule in the sense of Gneiting and Raftery (2007). The Bayes predictor is then the conditional distribution. In the case of the Continuous Ranked Probability Score (CRPS), we study the minimax rate of convergence and show that the k-nearest-neighbor algorithm attains the optimal rate in dimension greater than or equal to 2, and that kernel methods attain this optimal rate in any dimension.
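As a rough illustration of the setting described above, the sketch below builds a k-nearest-neighbor predictive distribution (the empirical distribution of the responses of the k nearest training points) and evaluates it with the CRPS in its energy form, CRPS(F, y) = E|X − y| − ½ E|X − X′|. This is a toy sketch with hypothetical function names and data, not the code from the associated article.

```python
import numpy as np

def crps_empirical(samples, y):
    """CRPS of the empirical distribution of `samples` at observation `y`,
    computed via the energy form CRPS(F, y) = E|X - y| - 0.5 * E|X - X'|."""
    samples = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(samples - y))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2

def knn_predictive_samples(X_train, y_train, x, k):
    """k-NN predictive distribution at x: the empirical distribution of the
    responses of the k nearest neighbors (illustrative implementation)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(dists)[:k]
    return y_train[idx]

# Toy example on synthetic data.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 2))
y_train = X_train[:, 0] + rng.normal(scale=0.1, size=200)
x_new = np.array([0.5, -0.2])
pred = knn_predictive_samples(X_train, y_train, x_new, k=10)
score = crps_empirical(pred, y=0.5)
```

For a point-mass predictive distribution the CRPS reduces to the absolute error, which gives a quick sanity check on the implementation.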

Long summary in French: here

Associated article: Distributional regression and its evaluation with the CRPS: Bounds and convergence of the minimax risk, Pic et al. (2022) link