[Torch][WeightCompression] Add Scale Estimation data-aware support #3179
base: develop
Conversation
@pytest.mark.parametrize("axes", [None, (0,), (0, 1), (0, 1, 2)])
Comments regarding the test_reducers_and_aggregators tests:
- Please join these three tests into one
- Please use reference inputs/outputs for the reducers, as is done in all the other tests (instead of a random input checked against a reference numpy implementation)
- Please move the joined test above the aggregator tests
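A possible shape for the joined test, as a sketch only — the test name `test_variance_reducer`, the shared reference input, and the mean-of-variance computation standing in for the real reducer are assumptions for illustration, not the repository's actual code:

```python
import numpy as np
import pytest

# Hypothetical shared reference input (assumption, mirroring the arrays in this review).
REF_INPUT = np.array([[[1, 2, 0], [1, -3, 10]], [[-1, 2, -3], [4, 5, -6]]])


# One parametrized test replacing the three separate ones; each case carries a
# hardcoded reference output instead of recomputing it with a numpy implementation.
@pytest.mark.parametrize(
    "axes,ref_output",
    [
        (None, 16.1666),
        ((0,), 14.25),
        ((0, 1), 15.875),
    ],
)
def test_variance_reducer(axes, ref_output):
    # Stand-in for the reducer under test: mean of the variances along `axes`.
    result = float(np.mean(np.var(REF_INPUT, axis=axes)))
    assert np.allclose(result, ref_output, atol=1e-3)
```

The hardcoded expected values make a regression in the reducer fail loudly instead of being masked by recomputing the reference with the same numpy call.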
weight compression build - 291
The proposed example can be added in a follow-up PR; it will be excluded from this PR.
[None, np.array([[[1, 2, 0], [1, -3, 10]], [[-1, 2, -3], [4, 5, -6]]]), 16.1666],
[(0,), np.array([[[1, 2, 0], [1, -3, 10]], [[-1, 2, -3], [4, 5, -6]]]), 14.25],
[(0, 1), np.array([[[1, 2, 0], [1, -3, 10]], [[-1, 2, -3], [4, 5, -6]]]), 15.875],
Since every reference input is identical, I believe we can create a global variable like DEFAULT_VARIANCE_REF_INPUT for it 😀
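Applying the suggestion could look like the sketch below; `DEFAULT_VARIANCE_REF_INPUT` is the name proposed in the review, and the surrounding case list is illustrative rather than the actual test code:

```python
import numpy as np

# Module-level constant replacing the repeated inline arrays (name proposed in review).
DEFAULT_VARIANCE_REF_INPUT = np.array([[[1, 2, 0], [1, -3, 10]], [[-1, 2, -3], [4, 5, -6]]])

# Each parametrized case then only spells out the axes and the expected value.
VARIANCE_TEST_CASES = [
    [None, DEFAULT_VARIANCE_REF_INPUT, 16.1666],
    [(0,), DEFAULT_VARIANCE_REF_INPUT, 14.25],
    [(0, 1), DEFAULT_VARIANCE_REF_INPUT, 15.875],
]
```

Besides removing the duplication, sharing one constant makes it obvious at a glance that only the reduction axes differ between the cases.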
Changes
Add data-aware support for the Torch backend of WeightCompression, including Scale Estimation
Reason for changes
Add support for the Torch backend
Related tickets
158974
Tests
Add unit tests
Add use cases for Torch to the WeightCompression job