[QNN EP] Re-enable several disabled QNN-EP UTs #23799
base: main
Conversation
### Description
1. Re-enable UTs that pass with QNN SDK 2.30.
2. Fix the Conv and Resize UTs:
   - Make Conv's weight an initializer so that `graph.NumberOfNodes()` matches `ep_nodes`, which should be 1 (see the sketch below).
   - Update the Resize UT because "round_prefer_floor" is no longer supported in the QNN SDK since 2.21.

### Motivation and Context
1. Make as many QNN EP UTs pass as possible to improve test coverage.
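A minimal sketch of the Conv change at the test-definition level, assuming (from its usage in this test file) that `TestInputDef`'s second constructor argument is the is-initializer flag:

```cpp
// Sketch only: TestInputDef and GetFloatDataInRange are the helpers already used
// in the QNN EP unit tests; the second constructor argument is assumed to mark
// the tensor as a static initializer rather than a dynamic graph input.
TestInputDef<float> input_def({1, 2, 5, 5}, /*is_initializer=*/false,
                              GetFloatDataInRange(-10.0f, 10.0f, 50));

// Dynamic weight: the Conv weight arrives as a graph input at runtime.
TestInputDef<float> dynamic_weight_def({1, 2, 3, 3}, /*is_initializer=*/false,
                                       GetFloatDataInRange(-1.0f, 5.0f, 18));

// Static weight: the Conv weight is baked into the model as an initializer, so the
// test graph collapses to a single node that can be assigned entirely to QNN EP.
TestInputDef<float> static_weight_def({1, 2, 3, 3}, /*is_initializer=*/true,
                                      GetFloatDataInRange(-1.0f, 5.0f, 18));
```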
@microsoft-github-policy-service agree company="Qualcomm"
  TestInputDef<float> input_def({1, 2, 5, 5}, false, GetFloatDataInRange(-10.0f, 10.0f, 50));
- TestInputDef<float> weight_def({1, 2, 3, 3}, false, GetFloatDataInRange(-1.0f, 5.0f, 18));
+ TestInputDef<float> weight_def({1, 2, 3, 3}, true, GetFloatDataInRange(-1.0f, 5.0f, 18));
Does that mean that for some Conv cases the weight has to be static? Should we identify such cases and do the validation in ConvOpBuilder in QNN EP so that these nodes are placed on the CPU EP instead of QNN EP?
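A purely illustrative sketch of the kind of check being suggested; the method names (`IsOpSupported`, `IsInitializerInput`, `RequiresStaticWeight`) and surrounding structure are assumptions, not the actual QNN EP API:

```cpp
// Hypothetical capability check (illustrative only, not the real ConvOpBuilder code).
// If a Conv configuration requires a static weight on QNN, reject it here so that
// ONNX Runtime falls back to the CPU EP for that node instead of failing at runtime.
Status ConvOpBuilder::IsOpSupported(QnnModelWrapper& qnn_model_wrapper,
                                    const NodeUnit& node_unit) const {
  const std::string& weight_name = node_unit.Inputs()[1].node_arg.Name();
  const bool weight_is_static = qnn_model_wrapper.IsInitializerInput(weight_name);  // assumed helper

  if (RequiresStaticWeight(node_unit) && !weight_is_static) {  // RequiresStaticWeight is hypothetical
    return ORT_MAKE_STATUS(ONNXRUNTIME, NOT_IMPLEMENTED,
                           "This Conv configuration needs a static (initializer) weight on QNN.");
  }
  return Status::OK();
}
```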
Yes, the previous change made the weight of Conv static. But since this UT is meant to test a dynamic weight, I removed that change in the new commit. We'll try to find another way to fix this Conv UT in a follow-up PR. Thanks.
Yes, I believe dynamic weights are supported. There are some cases with dynamic weights that are passing.
/azp run Linux QNN CI Pipeline,Windows ARM64 QNN CI Pipeline,Windows x64 QNN CI Pipeline,Linux Android Emulator QNN CI Pipeline
### Description
1. Re-enable UTs that pass with QNN SDK 2.30.
2. Update the Resize UT because "round_prefer_floor" is no longer supported in the QNN SDK since 2.21.

### Motivation and Context
1. Make as many QNN EP UTs pass as possible to improve test coverage.
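For context on what changes when moving away from "round_prefer_floor": ONNX Resize's nearest_mode rounding rules only disagree on exact half indices. A small standalone sketch (not ORT or QNN code) of the two rules:

```cpp
#include <cmath>
#include <cstdio>

// Nearest-neighbor index rounding as described by ONNX Resize's nearest_mode:
// "round_prefer_floor" breaks ties toward the smaller index,
// "round_prefer_ceil" breaks ties toward the larger one; they agree elsewhere.
static long RoundPreferFloor(float x) { return static_cast<long>(std::ceil(x - 0.5f)); }
static long RoundPreferCeil(float x) { return static_cast<long>(std::floor(x + 0.5f)); }

int main() {
  for (float x : {1.3f, 1.5f, 2.5f, 2.7f}) {
    std::printf("x=%.1f  prefer_floor=%ld  prefer_ceil=%ld\n",
                x, RoundPreferFloor(x), RoundPreferCeil(x));
  }
  // Only the half values differ: 1.5 -> 1 vs 2, and 2.5 -> 2 vs 3.
  return 0;
}
```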
…nnxruntime into dev/kuanyul/enable_qnnep_ut
/azp run Linux QNN CI Pipeline,Windows ARM64 QNN CI Pipeline,Windows x64 QNN CI Pipeline,Linux Android Emulator QNN CI Pipeline
Azure Pipelines successfully started running 4 pipeline(s).
/azp run Big Models,Win_TRT_Minimal_CUDA_Test_CI,Windows CPU CI Pipeline,Windows GPU CUDA CI Pipeline,Windows GPU DML CI Pipeline,Windows GPU Doc Gen CI Pipeline,Windows GPU TensorRT CI Pipeline
/azp run Linux CPU CI Pipeline, Linux CPU Minimal Build E2E CI Pipeline, Linux GPU CI Pipeline, Linux GPU TensorRT CI Pipeline, MacOS CI Pipeline, ONNX Runtime Web CI Pipeline, onnxruntime-binary-size-checks-ci-pipeline
Azure Pipelines successfully started running 7 pipeline(s).