ONNX Export #30
A demo script that allows an ONNX export would be helpful. That may be dependent on #7, though.
Rather, that's without
ONNX export is currently deprioritized by the lack of Python 3.9 support, which has become increasingly common (it will likely be added in the next version).
The latest version of Transformers now explicitly supports ONNX exporting, which removes one of the annoying parts: https://huggingface.co/transformers/serialization.html#onnx-onnxruntime However, a bunch of the generation code is tied explicitly to PyTorch, so I may have to reevaluate that.
Here is a repo where they combine an ONNX model with generation strategies implemented in Transformers: https://github.com/rayhern/convert-gpt2-xl-to-onnx
ONNX has good `transformers` export support and is a point of focus for the ONNX team. ONNX exporting may be more useful/flexible than TorchScript (I have a specific secret use case in mind...)