
Estimating Barycenters of Distributions with Neural Optimal Transport

This is the official Python implementation of the ICML 2024 paper Estimating Barycenters of Distributions with Neural Optimal Transport by Alexander Kolesov, Petr Mokrov, Igor Udovichenko, Milena Gazdieva, Gudmund Pammer, Evgeny Burnaev and Alexander Korotin.

Citation

If you find this repository or the ideas presented in our paper useful, please consider citing our paper.

@inproceedings{
    kolesov2024estimating,
    title={Estimating Barycenters of Distributions with Neural Optimal Transport},
    author={Alexander Kolesov and Petr Mokrov and Igor Udovichenko and Milena Gazdieva and Gudmund Pammer and Evgeny Burnaev and Alexander Korotin},
    booktitle={Forty-first International Conference on Machine Learning},
    year={2024},
    url={https://openreview.net/forum?id=ymgcTqrZLT}
}

Pre-requisites

The implementation is GPU-based; a single GTX 1080 Ti is enough to run each experiment. We tested the code with torch==2.1.1+cu121; it might not run as intended with older or newer torch versions. Versions of other libraries are specified in requirements.txt. Pre-trained models for the maps and potentials are located here.
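Before running the notebooks, it may help to check that the environment matches the tested setup. A minimal sketch (the pin torch==2.1.1+cu121 comes from the paragraph above; the `base_version` helper is hypothetical, not part of this repository):

```python
import re

TESTED_TORCH = "2.1.1+cu121"  # version the authors tested against

def base_version(v: str) -> tuple:
    """Strip the local tag (e.g. '+cu121') and return a comparable tuple."""
    core = v.split("+")[0]
    return tuple(int(p) for p in re.findall(r"\d+", core))

try:
    import torch
    if base_version(torch.__version__) != base_version(TESTED_TORCH):
        print(f"Warning: tested with torch=={TESTED_TORCH}, "
              f"found {torch.__version__}")
    if not torch.cuda.is_available():
        print("Warning: no CUDA device found; the experiments are GPU-based.")
except ImportError:
    print("torch is not installed; see requirements.txt")
```

The check only warns rather than raising, since nearby versions may still work.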

Repository structure

All the experiments are provided as self-explanatory Jupyter notebooks (stylegan2/notebooks/).

  • src/ - auxiliary source code for the experiments: training, plotting, logging, etc.
  • stylegan2/ - folder with auxiliary code for using StyleGAN2.
  • stylegan2/notebooks - Jupyter notebooks evaluating barycenters on 2D and image datasets.
  • data/ - folder with datasets.
  • SG2_ckpt/ - folder with checkpoints for StyleGAN2 models.
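The structure above corresponds to a layout roughly like the following (a sketch; only the folders named above are shown):

```
NOTBarycenters/
├── src/            # training, plotting, logging utilities
├── stylegan2/      # auxiliary StyleGAN2 code
│   └── notebooks/  # evaluation notebooks (2D and image datasets)
├── data/           # datasets
└── SG2_ckpt/       # StyleGAN2 checkpoints
```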

Estimating barycenters in 2D

  • stylegan2/notebooks/twister2D.ipynb -- toy experiments on the 2D Twister dataset;
  • stylegan2/notebooks/Gauss2D.ipynb -- evaluating the metrics of our method in the Gaussian case.

Estimating high-dimensional barycenters of the Ave, Celeba! dataset

  • stylegan2/notebooks/AVE_CELEBA_L2.ipynb -- estimating barycenters of the Ave, Celeba! dataset in image space;
  • stylegan2/notebooks/AVE_CELEBA_LATENT.ipynb -- estimating barycenters in latent space with the classical cost;
  • stylegan2/notebooks/AVE_CELEBA_ENTROPY.ipynb -- estimating barycenters in latent space with the $\epsilon$-KL cost;
  • stylegan2/notebooks/AVE_CELEBA_KERNEL.ipynb -- estimating barycenters in latent space with the $\gamma$-Energy cost;
  • stylegan2/notebooks/AVE_CELEBA_KERNEL_GAUSS.ipynb -- estimating barycenters in latent space with the $\gamma$-Energy cost (Gaussian reparametrization).

Estimating high-dimensional barycenters of the Colored MNIST dataset

  • stylegan2/notebooks/SHAPE_COLOR_EXPERIMENT_ENTROPIC.ipynb -- estimating barycenters in latent space with $\epsilon$-KL cost.

Estimating barycenters of the MNIST dataset (digits 0 and 1)

  • stylegan2/notebooks/MNIST_01_L2_DATA.ipynb -- estimating barycenters in data space (L2 cost);
  • stylegan2/notebooks/MNIST_01_KERNEL_DATA.ipynb -- estimating barycenters in data space with $\gamma$-Energy cost.

How to Use

  • Clone the repository:
git clone https://github.com/justkolesov/NOTBarycenters.git
  • Create a virtual environment and install the dependencies:
pip install -r requirements.txt
  • Download either the MNIST or the Ave, Celeba! 64x64 dataset.

  • Place the downloaded dataset in the appropriate subfolder of data/.

  • If you run a high-dimensional image experiment, download the appropriate StyleGAN2 model from here (folder StyleGan2/).

  • Place the downloaded StyleGAN2 model in the appropriate subfolder of SG2_ckpt/.

  • Run a notebook for training, or take the appropriate checkpoint from here and load it.
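Taken together, the steps above amount to something like the following shell session (a sketch: the virtual-environment name and example notebook path are assumptions, and datasets/checkpoints must still be downloaded manually from the links above):

```shell
# Clone the repository and enter it
git clone https://github.com/justkolesov/NOTBarycenters.git
cd NOTBarycenters

# Create and activate a virtual environment (the name "venv" is arbitrary),
# then install the pinned dependencies
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt

# Place datasets under data/ and StyleGAN2 checkpoints under SG2_ckpt/
# (downloaded manually, as described above), then open a notebook, e.g.:
jupyter notebook stylegan2/notebooks/twister2D.ipynb
```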

Credits
