
Improve test assertions #135

Closed
jake-arkinstall opened this issue Jul 4, 2024 · 1 comment
Assignees: jake-arkinstall
Labels: meta (Enhancements that are not directly related to the core library, but to surrounding tooling etc.)

Comments

@jake-arkinstall (Collaborator)

At present, some tests make `np.isclose` assertions on fidelity outcomes.

This means that an update that improves fidelity can cause tests to fail. I am seeing exactly that on my machine, for instance.

I believe a good course of action is as follows:

  • Use one-sided bounds in assertions on metrics that we are happy to see improve (see the sketch below).
  • Alongside the one-sided bounds, use a benchmarking approach to track those metrics and report when there is an improvement or a deterioration.
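
For concreteness, a minimal sketch of the difference, with `computed_fidelity()` as a hypothetical stand-in for the real library call:

```python
import numpy as np

def computed_fidelity() -> float:
    # Placeholder for the library call under test (hypothetical).
    return 0.97

def test_fidelity_isclose():
    # Current style: a two-sided check, so an *improvement* past the
    # tolerance window fails the test just as a regression would.
    assert np.isclose(computed_fidelity(), 0.95, atol=1e-2)

def test_fidelity_lower_bound():
    # Proposed style: a one-sided bound, so only a genuine regression fails.
    assert computed_fidelity() >= 0.95
```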
jake-arkinstall self-assigned this Jul 4, 2024
jake-arkinstall added the meta label Jul 4, 2024
jake-arkinstall added a commit that referenced this issue Oct 23, 2024
@jake-arkinstall (Collaborator, Author)

Half done.

Now using one-sided upper bounds. It's clear we want benchmarks, though, so we can see when there's a performance degradation.
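
For the benchmarking half, one option might be pytest-benchmark. A rough sketch (an assumption on my part, not something already in the repo), with `run_circuit()` as a hypothetical stand-in for whatever produces the fidelity:

```python
def run_circuit() -> float:
    # Hypothetical stand-in for the library call whose metric we track.
    return 0.97

def test_fidelity_benchmark(benchmark):
    # pytest-benchmark times the call and saves the run; saved runs can be
    # diffed with `pytest-benchmark compare` to flag improvements or
    # deteriorations between revisions.
    fidelity = benchmark(run_circuit)
    # Attach the fidelity itself to the saved record so the metric is
    # tracked alongside the timings.
    benchmark.extra_info["fidelity"] = fidelity
    # Keep the one-sided guard so a real regression still fails outright.
    assert fidelity >= 0.95
```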

jcqc closed this as completed Nov 6, 2024