TY - GEN
T1 - PROBABILISTIC CONFORMAL PREDICTION WITH APPROXIMATE CONDITIONAL VALIDITY
AU - Plassier, Vincent
AU - Fishkov, Alexander
AU - Guizani, Mohsen
AU - Panov, Maxim
AU - Moulines, Eric
N1 - Publisher Copyright:
© 2025 13th International Conference on Learning Representations, ICLR 2025. All rights reserved.
PY - 2025/1/1
Y1 - 2025/1/1
N2 - We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution P_{Y|X}. Existing methods, such as conformalized quantile regression and probabilistic conformal prediction, usually provide only a marginal coverage guarantee. In contrast, our approach extends these frameworks to achieve approximate conditional coverage, which is crucial for many practical applications. Our prediction sets adapt to the behavior of the predictive distribution, making them effective even under high heteroscedasticity. While exact conditional guarantees are infeasible without assumptions on the underlying data distribution, we derive non-asymptotic bounds that depend on the total variation distance between the conditional distribution and its estimate. Using extensive simulations, we show that our method consistently outperforms existing approaches in terms of conditional coverage, leading to more reliable statistical inference in a variety of applications.
AB - We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution P_{Y|X}. Existing methods, such as conformalized quantile regression and probabilistic conformal prediction, usually provide only a marginal coverage guarantee. In contrast, our approach extends these frameworks to achieve approximate conditional coverage, which is crucial for many practical applications. Our prediction sets adapt to the behavior of the predictive distribution, making them effective even under high heteroscedasticity. While exact conditional guarantees are infeasible without assumptions on the underlying data distribution, we derive non-asymptotic bounds that depend on the total variation distance between the conditional distribution and its estimate. Using extensive simulations, we show that our method consistently outperforms existing approaches in terms of conditional coverage, leading to more reliable statistical inference in a variety of applications.
UR - https://www.scopus.com/pages/publications/105010182533
M3 - Conference contribution
AN - SCOPUS:105010182533
T3 - 13th International Conference on Learning Representations, ICLR 2025
SP - 61272
EP - 61304
BT - 13th International Conference on Learning Representations, ICLR 2025
PB - International Conference on Learning Representations, ICLR
T2 - 13th International Conference on Learning Representations, ICLR 2025
Y2 - 24 April 2025 through 28 April 2025
ER -