[DevX] Support filtering benchmark configs in on-demand workflow #7639
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/7639
Note: Links to docs will display an error until the docs builds have been completed.
⏳ No Failures, 26 Pending as of commit b7390c2 with merge base 74aace6.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@huydhn Could you help review this PR? Do you have any idea about the permission issue?
Write to stdout in unit tests due to permission to access
@guangy10 has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Let me take a look at the PR. For those unit test failures, let's try unsetting
Yeah, just updated the code to write to stdout instead of raising a permission exception. Rerunning the test now.
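For context, here is a minimal sketch of that fallback, assuming a small helper of this shape (the name `set_output` and the exact error handling are illustrative, not taken from the actual script):

```python
import os


def set_output(name: str, value: str) -> None:
    """Write a name=value pair to GITHUB_OUTPUT, falling back to stdout.

    The fallback keeps unit tests (and local runs) working when the
    GITHUB_OUTPUT file is missing or not writable.
    """
    github_output = os.getenv("GITHUB_OUTPUT")
    if github_output:
        try:
            with open(github_output, "a") as f:
                print(f"{name}={value}", file=f)
            return
        except PermissionError:
            # No permission to write the file, e.g. in a sandboxed test run;
            # fall through to stdout instead of raising.
            pass
    print(f"{name}={value}")
```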
The tests pass on Linux but fail on the macOS runner. It seems like "::set-output" is not set on the macOS runner. Expected to have:
but only got
And I can't reproduce it on my Mac.
I think I get it now. The macOS job doesn't run inside a Docker container, but on the host runner itself, so the script has access to the GITHUB_OUTPUT environment variable. It passes locally because there is no such variable. I think the test needs to take this into account and check both stdout and the content of GITHUB_OUTPUT (if it exists).
I see. Or I can simply make it a Linux-only test, which seems better since in CI we run this step on Linux only.
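A minimal sketch of what that Linux-only test could look like, assuming the `set_output` helper sketched above; the module name in the import is hypothetical:

```python
import io
import os
import platform
import unittest
from contextlib import redirect_stdout
from unittest import mock

# Hypothetical import of the set_output helper sketched above.
from gather_benchmark_configs import set_output


class TestSetOutput(unittest.TestCase):
    @unittest.skipUnless(platform.system() == "Linux", "CI runs this step on Linux only")
    def test_falls_back_to_stdout_without_github_output(self):
        buf = io.StringIO()
        # Clear GITHUB_OUTPUT so the helper takes the stdout path, mirroring
        # the Docker-based Linux runner where the variable is not visible.
        with mock.patch.dict(os.environ, {}, clear=True), redirect_stdout(buf):
            set_output("benchmark_configs", "qnn_q8")
        self.assertIn("benchmark_configs=qnn_q8", buf.getvalue())


if __name__ == "__main__":
    unittest.main()
```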
@kimishpatel @digantdesai Just FYI, subscribed you to DevX-related work.
Support config filtering in ondemand benchmark flow Co-authored-by: Github Executorch <[email protected]>
#7349 introduced benchmark configs that automatically detect all supported delegates/quants/dtypes for a given model. This works well for the automated nightly benchmark jobs; however, for on-demand jobs, it will run every supported config for a model even if the user only wants a specific target config. This PR improves the DevX by supporting filtering of benchmark configs in the on-demand workflow.
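The core of the change can be thought of as intersecting the auto-detected configs with a user-supplied filter. Below is a rough sketch of that idea; the function name, input format, and example config names are illustrative assumptions, not necessarily the exact implementation.

```python
from typing import List


def filter_benchmark_configs(supported: List[str], requested: str) -> List[str]:
    """Keep only the user-requested configs for an on-demand run.

    supported: configs auto-detected for the model, e.g. ["xnnpack_q8", "qnn_q8"]
    requested: comma-separated filter from the workflow input, e.g. "qnn_q8";
               an empty filter keeps every supported config (nightly behavior).
    """
    if not requested.strip():
        return supported
    wanted = {c.strip() for c in requested.split(",") if c.strip()}
    return [c for c in supported if c in wanted]


# Only qnn_q8 ends up in the generated benchmark matrix.
print(filter_benchmark_configs(["xnnpack_q8", "qnn_q8"], "qnn_q8"))  # ['qnn_q8']
```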
For example, filtering to qnn_q8 will generate the matrix for qnn_q8 even if the model supports other configs such as xnnpack_q8:
Trigger an on-demand benchmark via GitHub UI:
Unit Tests:
Test jobs: