Fun with OpenAI's GPT-2
Note: You may need Colab Pro to run these notebooks due to the large amount of memory needed to run each cell without error. Decreasing variables such as `batch_size`, `vocab_size`, `n_embd`, `max_length`, `n_layer`, etc. isn't really recommended, but it will reduce memory consumption, allowing you to run these notebooks on Colab's Standard GPU runtime without needing Colab Pro for the High Mem GPU runtime. A sketch of this is shown below.
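As one illustration, here is a minimal sketch of reducing those settings with aitextgen (the training library these notebooks are based on). The specific values, the `dataset.txt` path, and the `aitextgen.tokenizer.json` tokenizer file are assumptions for illustration, not the notebooks' defaults, and this assumes a recent aitextgen version that provides `build_gpt2_config` and the `tokenizer_file` argument.

```python
from aitextgen import aitextgen
from aitextgen.utils import build_gpt2_config

# Illustrative, reduced-memory hyperparameters (assumed values, not the
# notebooks' defaults): shrinking vocab_size, max_length, n_embd, and
# n_layer trades model capacity for a smaller GPU-memory footprint.
config = build_gpt2_config(
    vocab_size=5000,   # smaller tokenizer vocabulary
    max_length=64,     # shorter context window
    n_embd=256,        # narrower embeddings
    n_layer=6,         # fewer transformer blocks
    n_head=8,          # n_embd must be divisible by n_head
)

# "aitextgen.tokenizer.json" is a placeholder; use the tokenizer trained
# earlier in the notebook.
ai = aitextgen(config=config, tokenizer_file="aitextgen.tokenizer.json")

# A smaller batch_size further reduces peak memory, at the cost of slower
# and noisier training. "dataset.txt" is a placeholder dataset path.
ai.train("dataset.txt", batch_size=8, num_steps=2000)
```

If you have access to the High Mem GPU runtime, you can leave the notebooks' original values as-is.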
- Kaggle 1M Headlines (Colaboratory): https://colab.research.google.com/github/brianlechthaler/GPT-2/blob/origin/GPT_2_Kaggle1MHeadlines.ipynb
- arXiv (Colaboratory): https://colab.research.google.com/github/brianlechthaler/GPT-2/blob/origin/GPT_2_arXiv.ipynb
- Kickstarter Project Names (Colaboratory): https://colab.research.google.com/github/brianlechthaler/GPT-2/blob/origin/GPT_2_Kickstarter_Project_Names.ipynb
- Anime (Colaboratory): https://colab.research.google.com/github/brianlechthaler/GPT-2/blob/origin/GPT_2_Anime.ipynb
- Medium Titles (Colaboratory): https://colab.research.google.com/github/brianlechthaler/GPT-2/blob/origin/GPT_2_MediumTitles.ipynb
This project was made possible by the cumulative efforts of the following parties:
- Brian Lechthaler, author of these notebooks
- Max Woolf, author of aitextgen, the training code these notebooks are based on
- OpenAI, creators of the GPT-2 model