BigGAN Tensorflow TPU

Simple Tensorflow TPU implementation of "Large Scale GAN Training for High Fidelity Natural Image Synthesis" (BigGAN)

I (David Mack) have been modifying this network to make its self-attention configurable, in order to experiment with the effectiveness of different self-attention architectures.


Implementation notes/issues

  • TODO: Implement the BigGAN-deep architecture (simpler class embedding, deeper residual blocks)
  • TODO: Explore whether orthogonal initialization (the paper's method) should replace the random normal initialization currently used
  • TODO: Implement exponential-moving-average parameter/batch-norm sampling during prediction and evaluation (see the sketch after this list)
  • TODO: Find the bug in the Inception score calculation and implement FID
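
The BigGAN paper samples with an exponential moving average (EMA) of the generator's weights, which is what the EMA TODO above refers to. A minimal TF1-style sketch of the idea; the `generator` variable scope, the decay value, and the stand-in update op are all assumptions for illustration, not this repo's code:

```python
import tensorflow as tf

# Toy stand-in for the generator's variables and its training step; the real
# repo would use its own generator scope and optimizer op here.
with tf.variable_scope('generator'):
    w = tf.get_variable('w', shape=[4], initializer=tf.zeros_initializer())
g_vars = tf.trainable_variables(scope='generator')
g_train_op = tf.assign_add(w, tf.ones([4]))

# Maintain shadow (averaged) copies of the generator weights during training.
ema = tf.train.ExponentialMovingAverage(decay=0.9999)
with tf.control_dependencies([g_train_op]):
    train_op = ema.apply(g_vars)  # run this as the per-step training op

# For prediction/evaluation, restore the shadow values in place of the live
# weights so samples come from the averaged generator.
ema_saver = tf.train.Saver(ema.variables_to_restore())
# ema_saver.restore(sess, checkpoint_path)
```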

Usage

Building the data

For ImageNet, use TensorFlow's ImageNet build scripts to create TFRecord files at your chosen image size (e.g. 128x128), then train with --tfr-format inception.

Alternatively, use the data build script from NVIDIA's Progressive Growing of GANs and train with --tfr-format progan.
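
For reference, the Inception-style TFRecords written by TensorFlow's build script store a JPEG under `image/encoded` and the class under `image/class/label`. A minimal TF1 parsing sketch; the bucket path, image size, and batch size are assumptions, not this repo's exact pipeline:

```python
import tensorflow as tf

def parse_inception_tfrecord(serialized_example, image_size=128):
    """Decode one Inception-format example into an image tensor and a label."""
    features = tf.parse_single_example(serialized_example, {
        'image/encoded': tf.FixedLenFeature([], tf.string),
        'image/class/label': tf.FixedLenFeature([], tf.int64),
    })
    image = tf.image.decode_jpeg(features['image/encoded'], channels=3)
    image = tf.image.resize_images(image, [image_size, image_size])
    image = image / 127.5 - 1.0  # scale to [-1, 1]
    return image, features['image/class/label']

# Hypothetical input pipeline; TPUs require fixed batch shapes, hence
# drop_remainder=True.
files = tf.data.Dataset.list_files('gs://your-bucket/imagenet/train-*')
dataset = (files.interleave(tf.data.TFRecordDataset, cycle_length=4)
                .map(parse_inception_tfrecord)
                .batch(64, drop_remainder=True))
```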

Training

You can train on a Google Cloud TPU by running one of the training scripts with your TPU's name. For example,

  • ./launch_train_tpu_sagan.sh --tpu-name node-1

Your training data must be stored in a Google Cloud Storage bucket, since TPUs read training data from GCS.
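
Under the hood, TF1-era TPU training typically wires a `TPUClusterResolver` into a `TPUEstimator`. A minimal sketch of that setup; the toy `model_fn`, `train_input_fn`, paths, and batch sizes are illustrative assumptions rather than this repo's actual code:

```python
import tensorflow as tf

def model_fn(features, labels, mode, params):
    # Placeholder model_fn: the real repo builds BigGAN's generator and
    # discriminator here. CrossShardOptimizer averages gradients across cores.
    loss = tf.reduce_mean(tf.square(features))
    optimizer = tf.contrib.tpu.CrossShardOptimizer(tf.train.AdamOptimizer(1e-4))
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
    return tf.contrib.tpu.TPUEstimatorSpec(mode=mode, loss=loss, train_op=train_op)

def train_input_fn(params):
    # TPUEstimator supplies the per-core batch size via params['batch_size'].
    data = tf.data.Dataset.from_tensor_slices(tf.zeros([1024, 128, 128, 3]))
    return data.repeat().batch(params['batch_size'], drop_remainder=True)

# Resolve the TPU named on the command line (e.g. --tpu-name node-1).
resolver = tf.contrib.cluster_resolver.TPUClusterResolver(tpu='node-1')
run_config = tf.contrib.tpu.RunConfig(
    cluster=resolver,
    model_dir='gs://your-bucket/biggan/model',  # placeholder GCS path
    tpu_config=tf.contrib.tpu.TPUConfig(iterations_per_loop=100),
)
estimator = tf.contrib.tpu.TPUEstimator(
    model_fn=model_fn,
    config=run_config,
    use_tpu=True,
    train_batch_size=256,  # total batch; must divide evenly across TPU cores
)
estimator.train(input_fn=train_input_fn, max_steps=100000)
```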

Architecture

[Architecture diagrams: 128x128, 256x256, and 512x512 variants]

Contributing

You're very welcome to! Submit a pull request or contact the authors.

Authors

Junho Kim, David Mack
