Support scalar replacement of large structs #6019

Open
daniel-story wants to merge 2 commits into main from improve-default-scalar-replacement

Conversation

daniel-story
Contributor

This PR fixes an issue where structs with more than 100 member variables are not fully optimized by the default performance recipe. The cause is that the default performance recipe initializes the scalar replacement pass with a size limit of 100 struct/array members, whereas the size and legalization recipes explicitly request no size limit.

Failure to fully apply scalar replacement prevents effective optimization of shaders that access small portions of large structs: code around those accesses that could otherwise be optimized is left unmodified, resulting in large SPIR-V binaries and poor performance on some GPUs. Missing these optimizations was found to have a substantial performance impact in a number of production shaders from real-world game titles, so it makes sense to follow the lead of the legalization and size recipes and remove the limit from the default performance recipe as well.

With this PR, the default size limit used by the default performance pass recipe, the ScalarReplacementPass constructor, and CreateScalarReplacementPass is changed from 100 to zero, which removes the limit.
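
For illustration, here is a minimal sketch (not code from this PR) of how a caller of the C++ Optimizer API picks up the new default; the target environment and the surrounding function are assumptions:

```cpp
#include <vector>

#include "spirv-tools/optimizer.hpp"

// Runs the default performance recipe (the same pass set as spirv-opt -O).
// With this PR, the scalar-replacement pass registered by
// RegisterPerformancePasses() no longer caps the number of struct/array
// members it will replace.
bool RunPerformanceRecipe(const std::vector<uint32_t>& in,
                          std::vector<uint32_t>* out) {
  spvtools::Optimizer opt(SPV_ENV_UNIVERSAL_1_6);
  opt.RegisterPerformancePasses();
  return opt.Run(in.data(), in.size(), out);
}
```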

This PR includes a test based on a simplified version of the real-world production shader in which this issue was first identified; the shader uses a struct with ~180 members. The test ensures that running the default performance recipe on such a shader produces SPIR-V that would not benefit from a further scalar replacement pass run with no limit.

daniel-story force-pushed the improve-default-scalar-replacement branch from 02c1d4b to 5055e3e on February 28, 2025 at 18:52
@s-perron (Collaborator) left a comment

That limit was arbitrary, but it was put in because compile time can get very long if SROA creates too many variables. If we remove it, there may be some follow-up issues if compile time increases for someone.

With that said, we could tell the users of spirv-opt to set their own limit based on their use cases. I think this should be fine. There is a workaround if there is a problem.
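
A sketch of that workaround, assuming the user drives the optimizer through the C++ API rather than the -O recipe; the limit of 100 shown here simply restores the old cap, and the right value is workload-dependent:

```cpp
#include "spirv-tools/optimizer.hpp"

// Sketch: instead of relying on the -O recipe's now-unlimited scalar
// replacement, a user worried about compile time can build their own pass
// list and give scalar replacement an explicit cap (100 restores the old
// behavior).
void RegisterCappedScalarReplacement(spvtools::Optimizer* opt) {
  opt->RegisterPass(spvtools::CreateScalarReplacementPass(/*size_limit=*/100));
  // ...register whatever other passes the use case needs...
}
```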

@@ -148,7 +148,7 @@ class TestPerformanceOptimizationPasses(expect.ValidObjectFile1_6,
'eliminate-local-single-block',
'eliminate-local-single-store',
'eliminate-dead-code-aggressive',
'scalar-replacement=100',
@s-perron (Collaborator) commented:

You need to change line 116 as well.

daniel-story force-pushed the improve-default-scalar-replacement branch from 5055e3e to 4248b4e on February 28, 2025 at 20:50
daniel-story force-pushed the improve-default-scalar-replacement branch from 4248b4e to 617b91f on February 28, 2025 at 21:52