field | value
---|---
title | Uncertainty Estimation with Recursive Feature Machines
abstract | In conventional regression analysis, predictions are typically represented as point estimates derived from covariates. The Gaussian Process (GP) offers a kernel-based framework that predicts and quantifies associated uncertainties. However, kernel-based methods often underperform ensemble-based decision tree approaches in regression tasks involving tabular and categorical data. Recently, Recursive Feature Machines (RFMs) were proposed as a novel feature-learning kernel that strengthens the capabilities of kernel machines. In this study, we harness the power of these RFMs in a probabilistic GP-based approach to enhance uncertainty estimation through feature extraction within kernel methods. We employ this learned kernel for in-depth uncertainty analysis. On tabular datasets, our RFM-based method surpasses other leading uncertainty estimation techniques, including NGBoost and CatBoost-ensemble. Additionally, when assessing out-of-distribution performance, we find that our RFM-based approach surpasses boosting-based methods.
openreview | TBKLXswKnO
software |
section | Papers
layout | inproceedings
series | Proceedings of Machine Learning Research
publisher | PMLR
issn | 2640-3498
id | gedon24a
month | 0
tex_title | Uncertainty Estimation with Recursive Feature Machines
firstpage | 1408
lastpage | 1437
page | 1408-1437
order | 1408
cycles | false
bibtex_author | Gedon, Daniel and Abedsoltan, Amirhesam and Sch\"on, Thomas B. and Belkin, Mikhail
author |
date | 2024-09-12
address |
container-title | Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence
volume | 244
genre | inproceedings
issued |
extras |