Hello,
I have a binary classifier in TensorFlow and I converted it to ONNX using tf2onnx with the following command:
```
python -m tf2onnx.convert --saved-model C:\example_path\pb_format --output C:\example_path\model.onnx --opset 17
```
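For reference, a rough sketch of the presumably equivalent conversion through the tf2onnx Python API (the paths are the placeholder paths from the command above):

```python
# Sketch: same conversion as the CLI command above, via the tf2onnx Python API.
# The paths are placeholders, matching the command above.
import tf2onnx

model_proto, _ = tf2onnx.convert.from_saved_model(
    r"C:\example_path\pb_format",
    opset=17,
    output_path=r"C:\example_path\model.onnx",
)
```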
When I run inference on the converted model.onnx with correctly formatted inputs, the result for the same input differs from the TensorFlow model's. This is my system information:
TensorFlow version: 2.9.0
Keras version: 2.9.0
Python version: 3.9.0
tf2onnx version: 1.16.1
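For anyone trying to reproduce, here is a minimal sketch of the comparison (the input shape, signature name, and tolerances are assumptions, not my exact setup):

```python
# Sketch: feed the same random input to the SavedModel and to model.onnx,
# then compare the outputs. Input shape and tolerances are assumptions.
import numpy as np
import tensorflow as tf
import onnxruntime as ort

x = np.random.rand(1, 224, 224, 3).astype(np.float32)  # hypothetical input shape

# TensorFlow SavedModel output
tf_model = tf.saved_model.load(r"C:\example_path\pb_format")
infer = tf_model.signatures["serving_default"]
tf_out = list(infer(tf.constant(x)).values())[0].numpy()

# ONNX Runtime output
sess = ort.InferenceSession(r"C:\example_path\model.onnx",
                            providers=["CPUExecutionProvider"])
onnx_out = sess.run(None, {sess.get_inputs()[0].name: x})[0]

# This assertion fails with the discrepancy described above
np.testing.assert_allclose(tf_out, onnx_out, rtol=1e-3, atol=1e-4)
```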
I have also tried converting the model to ONNX using this other method. Its inference results are different from those of the other two models as well...
Have you found the reason or any solution?
Thanks,
Martin