Hi, not a bug, but I tried to run this on AWS on a Tesla V100-SXM2-16GB GPU and couldn't load the model: it ran out of memory. In Colab, also with 16 GB of GPU memory, loading the model worked fine. Any idea what might be going on? Thanks!
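When a model that fits in Colab fails to load on an otherwise identical 16 GB GPU, a common cause is other processes or a stale CUDA context holding part of the card, or the model being materialized in full fp32 precision. A minimal diagnostic sketch, assuming PyTorch; `build_model` is a hypothetical stand-in for however this repo constructs its model:

```python
import torch
import torch.nn as nn


def report_gpu_memory():
    # Show how much of the card is actually free before loading anything;
    # on a shared AWS box another process may already hold memory.
    if torch.cuda.is_available():
        free, total = torch.cuda.mem_get_info()
        print(f"GPU memory: {free / 1e9:.1f} GB free of {total / 1e9:.1f} GB")
    else:
        print("No CUDA device visible")


def load_model_half_precision(build_model):
    # Build the model on CPU first, cast weights to fp16 (halving the
    # footprint), then move it to the GPU if one is available.
    model = build_model().half()
    if torch.cuda.is_available():
        model = model.cuda()
    return model


report_gpu_memory()
# Tiny placeholder model; substitute the repo's real constructor here.
model = load_model_half_precision(lambda: nn.Linear(1024, 1024))
print(next(model.parameters()).dtype)  # torch.float16
```

Running `nvidia-smi` before loading will likewise show whether another process is occupying the V100; if the card is clean, casting to half precision as above is a reasonable first workaround.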