
Change model saving/loading in Classifier #106

Open
jason-fries opened this issue Dec 1, 2018 · 1 comment

Comments

jason-fries (Collaborator) commented Dec 1, 2018

Currently, load and save operate directly on pickles. This causes issues when trying to load models across devices (GPU → CPU). These calls should instead wrap torch.load and load_state_dict, configured according to the use_cuda flag provided to the model. See:
https://pytorch.org/tutorials/beginner/saving_loading_models.html#saving-loading-model-across-devices
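
For reference, a minimal sketch of what the wrapped calls could look like, assuming the Classifier holds its network as `self.model` and exposes a `use_cuda` flag (names are illustrative, not the current API):

```python
import torch
import torch.nn as nn

class Classifier:
    # Hypothetical minimal shape of the class; the real Classifier differs.
    def __init__(self, model: nn.Module, use_cuda: bool = False):
        self.model = model
        self.use_cuda = use_cuda

    def save(self, path):
        # Persist only the state_dict rather than pickling the whole object.
        torch.save(self.model.state_dict(), path)

    def load(self, path):
        # map_location lets a GPU-trained checkpoint load on a CPU-only machine.
        device = "cuda" if self.use_cuda else "cpu"
        state_dict = torch.load(path, map_location=device)
        self.model.load_state_dict(state_dict)
        self.model.to(device)
```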

jason-fries added the bug (Something isn't working) and enhancement (New feature or request) labels Dec 1, 2018
@scottfleming

Bumping this, as currently saving/logging models with pickle will fail for any model larger than 4 GB: https://stackoverflow.com/questions/29704139/pickle-in-python3-doesnt-work-for-large-data-saving. This is especially problematic for models with high-dimensional outputs (e.g. label models that emit ~1000 different label types).
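
One possible stopgap until saving moves to state_dict-based serialization, assuming the current code calls pickle.dump directly, is to pass protocol=4, which (unlike protocol 3, the default before Python 3.8) supports objects larger than 4 GB. A minimal sketch, where `clf` stands in for the trained Classifier instance:

```python
import pickle

def save_pickle(clf, path):
    # Protocol 4 (Python 3.4+) lifts the 4 GB object-size limit of protocol 3,
    # which remained the default pickle protocol until Python 3.8.
    with open(path, "wb") as f:
        pickle.dump(clf, f, protocol=4)
```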
