Extending Temperature Scaling with Homogenizing Maps

Christopher Qian, Feng Liang, Jason Adams; 26(161):1−46, 2025.

Abstract

As machine learning models continue to grow more complex, poor calibration significantly limits the reliability of their predictions. Temperature scaling learns a single temperature parameter to scale the output logits and, despite its simplicity, remains one of the most effective post-hoc recalibration methods. We identify one of temperature scaling's defining attributes, that it increases the uncertainty of the predictions in a manner that we term homogenization, and propose to learn the optimal recalibration mapping from a larger class of functions that satisfy this property. We demonstrate the advantage of our method over temperature scaling in both calibration and out-of-distribution detection. Additionally, we extend our methodology and experimental evaluation to recalibration in the Bayesian setting.
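As context for the method the abstract extends, here is a minimal sketch of standard temperature scaling: a single scalar T is fit on held-out validation logits by minimizing the negative log-likelihood, and inference-time logits are divided by T before the softmax. The function names and the grid-search fitting procedure are illustrative, not taken from the paper's code.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(T, logits, labels):
    # Negative log-likelihood of the labels under temperature-scaled logits.
    probs = softmax(logits / T)
    return -np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean()

def fit_temperature(logits, labels, grid=np.linspace(0.1, 10.0, 200)):
    # Fit T by a simple 1-D grid search on validation data
    # (a 1-D optimizer such as scipy's minimize_scalar works equally well).
    losses = [nll(T, logits, labels) for T in grid]
    return float(grid[int(np.argmin(losses))])
```

A fitted T > 1 softens (homogenizes) an overconfident model's predictive distribution toward uniform; the paper's contribution is to replace this single-parameter map with a richer class of maps that preserve the same homogenizing behavior.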

© JMLR 2025.
