Automatically use thread-sharded for core-shard-preferring tools on core-sharded-on-disk drmemtraces #7045

Open
derekbruening opened this issue Oct 17, 2024 · 1 comment


@derekbruening
Contributor

Core-sharded-on-disk (CSOD) files are generally analyzed with the thread-sharded scheduler mode, which still presents core shards to analysis tools: it just doesn't try to re-schedule the already-scheduled file.

PR #7042 is making it a fatal error to re-schedule a CSOD file, as that almost always indicates user error. We'd like to improve this in two ways:

  1. Detect this in the scheduler itself, for uses beyond analyzers.
  2. Automatically run thread-sharded mode for a core-shard-preferring tool on a CSOD trace, instead of the user hitting the fatal error and having to re-run with -no_core_sharded (see the sketch after this list).
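
For concreteness, here is a minimal sketch of the two scheduler configurations involved, using the public drmemtrace scheduler API (clients/drcachesim/scheduler/scheduler.h) and assuming thread-sharded analysis corresponds to MAP_TO_CONSISTENT_OUTPUT and core-sharded to MAP_TO_ANY_OUTPUT. The init_for_trace() wrapper and the input_is_core_sharded_on_disk flag are hypothetical, standing in for whatever detection mechanism is chosen:

```cpp
#include <string>
#include <utility>
#include <vector>

#include "scheduler.h"

using namespace dynamorio::drmemtrace;

// Initializes "scheduler" for the trace in "trace_dir".
scheduler_t::scheduler_status_t
init_for_trace(scheduler_t &scheduler, const std::string &trace_dir,
               int output_count, bool input_is_core_sharded_on_disk)
{
    std::vector<scheduler_t::input_workload_t> inputs;
    inputs.emplace_back(trace_dir);
    // A CSOD trace was already scheduled when it was recorded: each on-disk
    // file holds one core's interleaving. Replaying it 1:1 (thread-sharded
    // mode) still hands each core to the tool as a shard; asking for dynamic
    // re-scheduling (MAP_TO_ANY_OUTPUT) would schedule it a second time.
    scheduler_t::scheduler_options_t ops(
        input_is_core_sharded_on_disk ? scheduler_t::MAP_TO_CONSISTENT_OUTPUT
                                      : scheduler_t::MAP_TO_ANY_OUTPUT,
        scheduler_t::DEPENDENCY_TIMESTAMPS);
    return scheduler.init(inputs, output_count, std::move(ops));
}
```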

Both improvements are tricky because of the code structure and what is known at which point: the analyzer_multi layer that sets the scheduler parameters doesn't know whether the input is a CSOD file, since it doesn't open any trace files and there is no metadata.

I tried having the scheduler return a special error code and having analyzer.cpp detect it and re-initialize a brand-new scheduler, but this is messy: the analyzer needs to know things known only to analyzer_multi, and the point where the scheduler reports the error is not clear-cut; code refactoring would be required to surface it early.
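
A sketch of that rejected retry approach, for reference. The distinguishing "input is CSOD" status and the make_options() helper are hypothetical; the latter stands in for option-building logic that in reality lives only in analyzer_multi:

```cpp
#include <memory>
#include <vector>

#include "scheduler.h"

using namespace dynamorio::drmemtrace;

// Hypothetical stand-in for analyzer_multi's option-building knowledge.
static scheduler_t::scheduler_options_t
make_options(scheduler_t::mapping_t mapping)
{
    return scheduler_t::scheduler_options_t(mapping,
                                            scheduler_t::DEPENDENCY_TIMESTAMPS);
}

static bool
init_with_csod_fallback(std::unique_ptr<scheduler_t> &scheduler,
                        std::vector<scheduler_t::input_workload_t> &inputs,
                        int output_count)
{
    scheduler = std::make_unique<scheduler_t>();
    auto status = scheduler->init(inputs, output_count,
                                  make_options(scheduler_t::MAP_TO_ANY_OUTPUT));
    // Hypothetical: suppose init() reported a distinguishing "input is
    // CSOD" status (no such status code exists as of this writing).
    bool input_is_csod = false;
    if (input_is_csod) {
        // Re-initialize a brand-new scheduler in thread-sharded mode.
        // The messy parts: analyzer.cpp lacks analyzer_multi's knowledge
        // of how the options were built, and the scheduler may report the
        // error at an unclear point, after work has already been done.
        scheduler = std::make_unique<scheduler_t>();
        status = scheduler->init(
            inputs, output_count,
            make_options(scheduler_t::MAP_TO_CONSISTENT_OUTPUT));
    }
    return status == scheduler_t::STATUS_SUCCESS;
}
```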

One solution might be to add a metadata file; that could help with other tasks too.
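
A minimal sketch of the metadata-file idea; the filename, format, and key are all hypothetical and not an existing drmemtrace format. The point is that analyzer_multi could consult such a file cheaply, without opening any trace files, before choosing scheduler parameters:

```cpp
#include <fstream>
#include <string>

// Hypothetical sidecar file written alongside the trace at recording time.
static constexpr const char *kMetadataFile = "metadata.txt";

// Returns whether the trace in "trace_dir" was core-sharded on disk,
// per the (hypothetical) metadata file.
static bool
trace_is_core_sharded_on_disk(const std::string &trace_dir)
{
    std::ifstream meta(trace_dir + "/" + kMetadataFile);
    std::string line;
    while (std::getline(meta, line)) {
        if (line == "sharding=core")
            return true;
    }
    // No metadata (e.g., a legacy trace): assume thread-sharded.
    return false;
}
```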

@derekbruening
Contributor Author

Note that item 2 above applies only when no mode is specified: if the user explicitly requests core-sharding, there could be configuration differences, so we should keep the fatal error rather than trying to continue.
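
A sketch of that policy; the user_specified_core_sharded flag is a hypothetical stand-in for "the user passed a core-sharding option explicitly, rather than the tool merely preferring core shards":

```cpp
// Decides what to do when a core-shard-preferring tool meets a CSOD trace.
enum class csod_action_t { AUTO_THREAD_SHARDED, FATAL_ERROR };

static csod_action_t
action_for_csod_input(bool user_specified_core_sharded)
{
    // Explicitly requested core-sharding could differ from the recorded
    // schedule's configuration (core count, quanta, etc.), so do not
    // silently continue.
    if (user_specified_core_sharded)
        return csod_action_t::FATAL_ERROR;
    // The tool merely prefers core shards and the on-disk schedule already
    // provides them: fall back to thread-sharded replay automatically.
    return csod_action_t::AUTO_THREAD_SHARDED;
}
```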

derekbruening added a commit that referenced this issue Oct 17, 2024