This repository implements our ICML 2021 paper: *Meta-Cal: Well-controlled Post-hoc Calibration by Ranking*.
```shell
git clone https://github.com/maxc01/metacal
cd metacal
poetry install
```
To run the tests:

```shell
cd metacal/tests/
python test_metacal.py
```
An example on the MNIST dataset can be found in the `examples` folder. To generate the logits, run the MNIST training script (copied and adapted from the official PyTorch repo):

```shell
cd metacal/examples
python main.py
```
I have already uploaded the generated logits to the same folder. To run Meta-Cal:

```shell
python test_metacal.py --target_acc 0.999
```
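As a rough illustration of what a target accuracy means here, the sketch below shows the general idea behind coverage-accuracy control: rank predictions by confidence and accept the largest prefix whose held-out accuracy meets the target, then calibrate only the accepted predictions with a base calibrator (e.g. temperature scaling). The function `fit_threshold` and the synthetic data are illustrative assumptions, not this repository's API; see the paper for the actual method and its guarantees.

```python
import numpy as np

def fit_threshold(probs, labels, target_acc):
    """Pick a confidence threshold so that the accepted (most confident)
    predictions reach at least `target_acc` accuracy on held-out data.

    probs: (n_samples, n_classes) predicted probabilities
    labels: (n_samples,) true class indices
    """
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    order = np.argsort(-conf)  # most confident first
    correct = (pred[order] == labels[order]).astype(float)
    running_acc = np.cumsum(correct) / np.arange(1, len(correct) + 1)
    # largest prefix whose running accuracy still meets the target
    ok = np.where(running_acc >= target_acc)[0]
    k = ok.max() + 1 if len(ok) else 1
    return conf[order][k - 1]
```

Predictions whose confidence falls below the returned threshold would be rejected; the rest would then be passed to a base calibrator.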
Another use of Meta-Cal is removing ambiguous examples. To show this, I made two plots comparing the t-SNE embedding of the original logits with that of the logits returned by Meta-Cal. From the following figure, we can see that Meta-Cal produces a very clean separation among the 10 classes, while the t-SNE of the original logits has many wrongly located points.
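A comparison like the one in the figure can be reproduced with scikit-learn's t-SNE; this is a minimal sketch, assuming the logits are available as a NumPy array (the helper name `tsne_embed` and its parameters are illustrative, not part of this repository):

```python
import numpy as np
from sklearn.manifold import TSNE

def tsne_embed(logits, perplexity=30, seed=0):
    """2-D t-SNE embedding of a logit matrix of shape (n_samples, n_classes).

    Embed both the original logits and the logits Meta-Cal keeps, then
    scatter-plot each embedding colored by true label to compare them.
    """
    return TSNE(n_components=2, perplexity=perplexity,
                init="pca", random_state=seed).fit_transform(logits)
```

Note that `perplexity` must be smaller than the number of samples, and that t-SNE is sensitive to this parameter, so both embeddings should use identical settings for a fair visual comparison.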
As we have mentioned in the main paper:

> To ensure the independence assumption, the training data of Meta-Cal should be different from the data set used to train the multi-class classifier.
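In practice this means holding out a disjoint calibration split before the classifier is trained; a minimal sketch (the helper `disjoint_split` and the 80/20 ratio are illustrative assumptions, not a prescription from the paper):

```python
import numpy as np

def disjoint_split(n, frac_clf=0.8, seed=0):
    """Shuffle indices 0..n-1 and split them into a classifier-training
    part and a disjoint part reserved for training Meta-Cal."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    cut = int(frac_clf * n)
    return idx[:cut], idx[cut:]
```

The classifier never sees the second split, so the calibration data remains independent of it.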
```bibtex
@inproceedings{Ma2021a,
  title     = {Meta-Cal: Well-controlled Post-hoc Calibration by Ranking},
  author    = {Ma, Xingchen and Blaschko, Matthew B.},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
}
```