As the title suggests, I tried instantiating an SaT model from an already downloaded ONNX model, since I'm interested in using it in offline mode. Instantiation works, but calling the split method then raises an error:
File "myvenv/lib/python3.11/site-packages/wtpsplit/__init__.py", line 804, in _split
default_threshold = get_default_threshold(self.model_name_or_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "myvenv/lib/python3.11/site-packages/wtpsplit/__init__.py", line 796, in get_default_threshold
if"sm"in model_str:
^^^^^^^^^^^^^^^^^
TypeError: argument of type'InferenceSession' is not iterable
I presume the get_default_threshold function expects the model name as a string, so passing an already loaded InferenceSession is currently not supported.
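For context, here is roughly what I was doing (a minimal sketch; the ONNX file path is a placeholder, and passing the session directly as the first argument is just what I tried, not necessarily the intended API):

```python
import onnxruntime as ort
from wtpsplit import SaT

# Load a previously downloaded ONNX model (placeholder path)
session = ort.InferenceSession("/path/to/sat-3l-sm/model_optimized.onnx")

sat = SaT(session)  # instantiation succeeds
sat.split("This is a test This is another test.")  # TypeError in get_default_threshold
```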
I've managed to get it working by cloning the model repository from the Hugging Face Hub and pointing SaT at the local directory. If this is the intended way of using models in offline mode, please describe it briefly in the documentation.
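In case it's useful to others, a sketch of that workaround is below. The repo id, local path, and ort_providers usage reflect my setup and are only illustrative; I'm also assuming the cloned directory keeps the original model name, so the default-threshold lookup still finds the "sm" suffix in the path string.

```python
# Clone the model repository once while online, e.g.:
#   git clone https://huggingface.co/segment-any-text/sat-3l-sm /path/to/sat-3l-sm
from wtpsplit import SaT

# Pointing SaT at the local clone keeps model_name_or_model a string,
# so get_default_threshold can inspect it, and no network access is needed.
sat = SaT("/path/to/sat-3l-sm", ort_providers=["CPUExecutionProvider"])

print(sat.split("This is a test This is another test."))
```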
Thanks for the great work!