Fantastic Things has Fantastic Errors #387
Comments
Thanks for labeling. As a reminder, I should add that Kaggle has a direct model loading section, which means we can load the model, fix and edit it, save it into our working directory, and re-upload it for further use on Kaggle itself.
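For anyone following the thread, here is a minimal sketch of that load / edit / save round-trip in a Kaggle notebook. It assumes the model is a Hugging Face checkpoint attached as a Kaggle input; the paths and the use of `transformers` are assumptions, not something stated above.

```python
# Sketch of the round-trip described above: load from /kaggle/input, edit, save to /kaggle/working.
# Assumption: the model is a Hugging Face checkpoint attached as a Kaggle model/dataset.
from transformers import AutoModelForCausalLM, AutoTokenizer

SRC = "/kaggle/input/my-model"           # hypothetical attached input path
DST = "/kaggle/working/my-model-edited"  # /kaggle/working persists as notebook output

model = AutoModelForCausalLM.from_pretrained(SRC)
tokenizer = AutoTokenizer.from_pretrained(SRC)

# ... apply the fixes/edits to the weights or config here ...

model.save_pretrained(DST)
tokenizer.save_pretrained(DST)
# The saved folder can then be published as a new Kaggle dataset/model
# and re-used in later notebooks.
```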
As for Kaggle's TPU, this will take some time, as we haven't tested our code in a TPU environment.
Hi buddy, have you solved your issue yet?
Hi, actually not. I'm still trying to find a memory-efficient way to use it on Kaggle's TPU. Do you have any good news?
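As a rough sketch of one memory-saving direction on a TPU VM (not something tested in this thread, and assuming a Hugging Face checkpoint plus a working `torch_xla` install on Kaggle): load the weights once in bf16 with `low_cpu_mem_usage=True` and move them to the XLA device.

```python
# Sketch only: assumes torch_xla is installed and the checkpoint path exists on Kaggle.
import torch
import torch_xla.core.xla_model as xm
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "/kaggle/input/my-model"  # hypothetical attached model path

device = xm.xla_device()  # the TPU device exposed through XLA

# bf16 + low_cpu_mem_usage keeps the host-RAM peak close to one copy of the weights.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
).to(device)
tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)

inputs = tokenizer("Hello from the TPU", return_tensors="pt").to(device)
with torch.no_grad():
    logits = model(**inputs).logits
xm.mark_step()  # force the lazy XLA graph to execute
print(logits.shape)
```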
Hi guys, any update?
Not much. It's also my first time trying a TPU.
I have some findings and need to discuss them with you, but I searched and couldn't find any contact information. Here is my email:
Sorry for being late; you can find me via [email protected]
Hi, your framework was exactly what I needed, but I can't run it on Colab. Many errors, too many library and package mismatches, and so on... (a version-pinning sketch follows this message).
If there were a Colab version that could run correctly and edit Mistral, that would be nice.
But a question:
Why couldn't we use Kaggle's TPU, with over 300 GB of RAM, which is much faster?
Is there any solution to run it on Kaggle at all?
Best wishes, and thanks again
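On the library and package mismatches mentioned above, the usual workaround is to pin the versions the project was developed against before installing it. A minimal sketch follows; the version numbers and repo URL are placeholders, not the project's actual requirements, and in a Colab cell you would typically just run the equivalent `!pip install` commands.

```python
# Colab/portable sketch: pin dependency versions before installing the framework.
# The pins and URL below are placeholders, not the project's actual requirements.
import subprocess
import sys

PINS = ["transformers==4.38.2", "accelerate==0.27.2", "sentencepiece==0.1.99"]
subprocess.check_call([sys.executable, "-m", "pip", "install", "--quiet", *PINS])

# Then install the framework itself (hypothetical URL):
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "git+https://github.com/OWNER/REPO.git",
])
```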