Hello, I'm currently trying to get the CoQA example running in Google Colab. Unfortunately I get an OOM at "train_data = bert_data_helper.convert(train_data, data='coqa')". The Colab machines only have 12.7 GB of RAM, and when I run the toolkit on my local machine I can see that this step uses up to 14 GB of RAM.
My question is: is it possible to reduce the memory usage of the BERT data helper (BertWrapper)? And if so, could you tell me where exactly?
Thank you in advance
My3vilMe changed the title from "CoQA in Google Colab" to "CoQA in Google Colab - OOM (BertWrapper?)" on Jul 4, 2019
If it is not a GPU memory issue, one suggestion would be not to extract redundant fields of the instances that the model will not use. I'm not sure whether every extracted field is actually used in the later steps, so it would be worth checking that.
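As a rough illustration, the filtering could look something like the sketch below. The field names are hypothetical, so check which fields your model's input pipeline actually reads before deleting anything:

```python
# A rough sketch, assuming each instance is a plain dict (check your reader's
# actual output). The field names below are made up for illustration only.
UNUSED_FIELDS = ["context_tokens", "char_spans", "raw_text"]  # hypothetical

def strip_unused_fields(instances, unused=UNUSED_FIELDS):
    # Drop fields the model does not consume, so each instance holds
    # less data in memory before the BERT conversion step.
    for instance in instances:
        for field in unused:
            instance.pop(field, None)  # remove the field if present
    return instances

train_data = strip_unused_fields(train_data)
train_data = bert_data_helper.convert(train_data, data='coqa')
```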
A little update: I found out how to remove fields from the instances, but fortunately that isn't important anymore. After another OOM in Colab I got a notification asking whether I wanted to upgrade to a machine with 26 GB of RAM, yay! That pretty much solved the problem, and I was able to complete a training run. Unfortunately, the 12 hours are not enough to also run the evaluation.
There is another thing I would like to ask, though, concerning answer prediction.
In the SQuAD example of BERT you get a prediction.json at the end; is there an easy way to let the toolkit generate a prediction file too?
I also looked into the "get_best_answer" method and couldn't really get it to work (I am rather inexperienced, sorry). Can you give me some advice there?
Thank you in advance, and thanks for making it easy to get into this complex subject! :)
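For reference, a minimal sketch of writing a SQuAD-style prediction file, assuming you can assemble a dict mapping question id to predicted answer text (the exact return format of "get_best_answer" may differ, so adapt the assembly step accordingly):

```python
import json

# A minimal sketch. `predictions` is a placeholder: fill it with question-id ->
# answer-text pairs gathered from the model's output (e.g. via get_best_answer,
# whose exact return format should be checked against the toolkit's code).
predictions = {"example_story_id_turn_1": "an answer string"}  # placeholder

# Dump the mapping in the same shape as BERT's SQuAD predictions.json.
with open("predictions.json", "w", encoding="utf-8") as f:
    json.dump(predictions, f, ensure_ascii=False, indent=2)
```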