This is an implementation of the paper "Painting Outside the Box: Image Outpainting" from Stanford University. Some changes have been made so it works with 256×256 images:
- Added an identity loss, i.e. from the generated image to the original image (see the sketch after this list).
- Removed patches from the training data (training pipeline).
- Replaced masking with cropping (training pipeline).
- Added convolution layers.
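
The following is a minimal sketch of two of the changes above: building a training pair by cropping (instead of masking) a 256×256 image, and an identity loss taken as the mean absolute error between the generated image and the original. The function names and the 50% central-band crop are illustrative assumptions, not the repo's exact code.

```python
import tensorflow as tf

def make_training_pair(full_image, keep_ratio=0.5):
    """Crop the central horizontal band of a 256x256 image; the crop is the
    generator input and the untouched full image is the ground truth."""
    _, w, _ = full_image.shape
    keep = int(w * keep_ratio)
    start = (w - keep) // 2
    cropped = full_image[:, start:start + keep, :]   # e.g. 256x128x3
    return cropped, full_image

def identity_loss(y_true, y_pred):
    """Mean absolute error between the generated image and the original,
    used alongside the usual adversarial loss."""
    return tf.reduce_mean(tf.abs(y_true - y_pred))
```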
The model was trained on 3500 scraped beach images, augmented to a total of 10500 images, for 25 epochs.
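The exact augmentations are not specified here; as one hedged example, a horizontal flip plus a mild brightness shift is a simple way to triple a dataset (3500 → 10500 images):

```python
import numpy as np

def augment_3x(image):
    """Return the original image plus two simple variants (horizontal flip and
    a mild brightness shift), tripling the dataset size."""
    flipped = image[:, ::-1, :]
    brighter = np.clip(image.astype(np.float32) * 1.15, 0, 255).astype(image.dtype)
    return [image, flipped, brighter]
```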
- Install Requirements:
```bash
sudo apt-get install curl
sudo pip3 install -r requirements.txt
```
- Prepare Data:
```bash
# Downloads the beach data and converts it to numpy batch data
# Saves the numpy batch data to 'data/prepared_data/'
sh prepare_data.sh
```
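
For reference, the conversion step amounts to resizing the downloaded images to 256×256 and saving them as NumPy batch files under 'data/prepared_data/'. The sketch below is a rough Python equivalent; the function name, batch size, and `.jpg` glob pattern are assumptions, not the script's actual contents.

```python
import glob
import os

import numpy as np
from PIL import Image

def images_to_numpy_batches(image_dir, out_dir="data/prepared_data/", batch_size=500):
    """Resize each image to 256x256 and save the set as .npy batch files."""
    os.makedirs(out_dir, exist_ok=True)
    paths = sorted(glob.glob(os.path.join(image_dir, "*.jpg")))
    for i in range(0, len(paths), batch_size):
        batch = [np.asarray(Image.open(p).convert("RGB").resize((256, 256)))
                 for p in paths[i:i + batch_size]]
        np.save(os.path.join(out_dir, f"batch_{i // batch_size}.npy"), np.stack(batch))
```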
- Build Model:
  - To build the model from scratch, you can directly run 'outpaint.ipynb'.
  OR
  - You can download my trained model, move it to 'checkpoint/', and run it (see the inference sketch below).
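
A minimal inference sketch for the pre-trained model. The checkpoint filename, the input image name, the crop coordinates, and the 0-1 normalization are all assumptions; adjust them to match the downloaded files and the notebook.

```python
import numpy as np
from PIL import Image
from tensorflow import keras

# The checkpoint filename below is an assumption; use whatever file the
# downloaded archive actually places inside 'checkpoint/'.
generator = keras.models.load_model("checkpoint/outpaint_generator.h5", compile=False)

img = np.asarray(Image.open("beach.jpg").convert("RGB").resize((256, 256)))
cropped = img[:, 64:192, :]                          # central band, as in training
inp = cropped[np.newaxis].astype("float32") / 255.0  # normalization is assumed
outpainted = generator.predict(inp)[0]
Image.fromarray((np.clip(outpainted, 0, 1) * 255).astype("uint8")).save("outpainted.jpg")
```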