What is the proper way to convert ONNX models to StableHLO?
I'm encountering issues converting ONNX models from the model zoo to StableHLO using onnx-mlir. While some models convert only partially, others fail completely. I have tried the following commands, but I get a partial StableHLO conversion: onnx.Conv, onnx.QuantizeLinear, and onnx.DequantizeLinear remain in the StableHLO output. After checking the list of supported ops and the opsets used by the model zoo for these models, it seems the operations in question should be supported.

Example outputs I get for resnet models:

resnet18-v1-7.onnx:
wget https://github.com/onnx/models/raw/refs/heads/main/validated/vision/classification/resnet/model/resnet18-v1-7.onnx
onnx-mlir --EmitONNXIR resnet18-v1-7.onnx
onnx-mlir-opt --shape-inference --convert-onnx-to-stablehlo resnet18-v1-7.onnx.mlir > shlo.mlir
onnx.Conv is still present in the StableHLO output.

resnet50-v1-12-qdq.onnx:
wget https://github.com/onnx/models/raw/refs/heads/main/validated/vision/classification/resnet/model/resnet50-v1-12-qdq.onnx
onnx-mlir --EmitONNXIR resnet50-v1-12-qdq.onnx
onnx-mlir-opt --shape-inference --convert-onnx-to-stablehlo resnet50-v1-12-qdq.onnx.mlir > shlo.mlir
onnx.QuantizeLinear and onnx.DequantizeLinear are still present in the StableHLO output.

resnet50-v1-12-int8.onnx:
wget https://github.com/onnx/models/raw/refs/heads/main/validated/vision/classification/resnet/model/resnet50-v1-12-int8.onnx
onnx-mlir --EmitONNXIR resnet50-v1-12-int8.onnx
onnx-mlir-opt --shape-inference --convert-onnx-to-stablehlo resnet50-v1-12-int8.onnx.mlir > shlo.mlir
This model fails to convert to StableHLO at all; onnx-mlir-opt exits with an error.
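A quick way to list which ONNX-dialect ops survive conversion is to grep the generated output; this is a minimal sketch using standard shell tools on the shlo.mlir files produced above:

# Count every ONNX-dialect op left behind by --convert-onnx-to-stablehlo
grep -o 'onnx\.[A-Za-z0-9]*' shlo.mlir | sort | uniq -c | sort -rn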
@brataTT Sorry, we don't use this path much at IBM. However, we would welcome adding a target like -EmitStableHLOIR that would act as a standard driver. If you can work out the proper sequence of onnx-mlir-opt passes, we can do that work.
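Until such a target exists, a minimal wrapper over the two-step flow shown in this thread could look like the sketch below. The script name, the output file name, and the choice of passes are assumptions taken from the commands earlier in the thread, not a confirmed recipe for a future -EmitStableHLOIR driver:

#!/bin/sh
# onnx-to-stablehlo.sh <model.onnx>: hypothetical helper wrapping the manual steps above
# Step 1: lower the ONNX protobuf to the ONNX dialect in MLIR (produces <model>.onnx.mlir).
onnx-mlir --EmitONNXIR "$1"
# Step 2: run shape inference, then convert the ONNX dialect to StableHLO.
onnx-mlir-opt --shape-inference --convert-onnx-to-stablehlo "$1.mlir" > "${1%.onnx}.stablehlo.mlir"

Usage would then be ./onnx-to-stablehlo.sh resnet18-v1-7.onnx, mirroring the manual invocations above.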
Thank you for the response, @AlexandreEichenberger.
Are there any docs I can check to help me understand what is currently possible/supported?
I'm checking SupportedONNXOps-cpu.md to figure out which ops I can and cannot expect to be supported, but that hasn't helped me.
My best suggestion would be to look at recent StableHLO PRs and ping the authors, as I have no visibility into how they use onnx-mlir with drivers outside of this repo.
> I'm checking SupportedONNXOps-cpu.md to figure out which ops I can and cannot expect to be supported, but that hasn't helped me.
That file lists the ops that we support when going all the way down to binary. StableHLO is a different path, and the support provided by the conversion of ONNX ops to StableHLO does not correlate with what is lowered to binary.
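In the absence of a StableHLO counterpart to that file, one rough way to see which ops have conversion patterns is to inspect the onnx-mlir source tree. The sketch below assumes the patterns live under src/Conversion/ONNXToStablehlo; the directory name may differ across versions:

# List the conversion sources (file names generally follow the ONNX op names).
find src/Conversion/ONNXToStablehlo -name '*.cpp'
# Check whether a specific op, e.g. QuantizeLinear, appears in any conversion pattern.
grep -rl 'QuantizeLinear' src/Conversion/ONNXToStablehlo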