
Update constants to use values from CODATA2022 #5661

Open
wants to merge 7 commits into base: development
Conversation

@dpgrote (Member) commented Feb 12, 2025

The values of the physical constants were taken from CODATA 2018. This PR updates them to the currently accepted values, as specified in CODATA 2022.

This breaks many CI benchmarks, since the checks compare at a precision tighter than the relative changes in the constants.

Note that scipy recently updated its constants to CODATA 2022 (in version 1.15.0, released January 3). This may cause problems in the CI tests. However, we run the tests on Ubuntu 20.04, where the scipy version is 1.3.3, which still uses constants from CODATA 2014!
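Since the CODATA revision in play depends on the installed scipy version, a CI check could branch on it. A minimal sketch: the 1.15 threshold is stated in this thread; the 1.4 threshold for the 2014 → 2018 switch is an assumption and should be verified against the scipy release notes.

```python
# Sketch (not part of this PR): map a scipy version string to the
# CODATA revision assumed for its physical-constants tables.
def codata_revision(scipy_version: str) -> int:
    """Return the CODATA year for scipy's constants tables.

    Thresholds: 1.15.0 -> CODATA 2022 (from this discussion);
    1.4.0 -> CODATA 2018 (assumption, check the release notes);
    anything older -> CODATA 2014.
    """
    major, minor = (int(p) for p in scipy_version.split(".")[:2])
    if (major, minor) >= (1, 15):
        return 2022
    if (major, minor) >= (1, 4):  # assumption: 1.4.0 moved to CODATA 2018
        return 2018
    return 2014

print(codata_revision("1.15.0"))  # 2022 (current scipy)
print(codata_revision("1.3.3"))   # 2014 (Ubuntu 20.04's scipy)
```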

The following CI test needed an additional update:

  • test_rz_galilean_psatd_current_correction_psb needed its charge-conservation tolerance increased. This seems to be by chance: running the test locally (on Mac), it passes the tolerance check.

@EZoni (Member) commented Feb 12, 2025

Thanks, Dave.

If there is a large number of benchmarks that need to be reset, this could be a good opportunity to test our tool Tools/DevUtils/update_benchmarks_from_azure_output.py, and the instructions for it in our documentation, again.

In theory, I updated and tested the tool manually in #5372; however, it is not yet tested automatically.

@dpgrote (Member, Author) commented Feb 12, 2025

> Thanks, Dave.
>
> If there is a large number of benchmarks that need to be reset, this could be a good opportunity to test again our tool Tools/DevUtils/update_benchmarks_from_azure_output.py and its instructions in our documentation.
>
> In theory, I updated and tested the tool manually in #5372. However, it is not tested automatically yet.

Thanks @EZoni! It worked and was easy to do. BTW, to download the raw log file, I copied the URL from the location bar and pasted it into a curl command, `curl https://dev.azure.com/ECP-WarpX/... > raw_log`.

Note that almost all of the changes in the benchmarks are small, as expected: ~1.e-9 or smaller. One exception is the test_3d_beam_beam_collision test, where errors of order 10% are seen, presumably because it runs for a long time, allowing the differences to grow.
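The ~1.e-9 scale mentioned above can be sanity-checked against the relative change of a single constant across CODATA revisions. A sketch using the electron mass: the numeric values below are quoted from memory of the CODATA 2018/2022 tables and should be verified against the official NIST listings before being relied on.

```python
# Sketch: estimate the expected scale of benchmark drift from the
# relative change of a constant between CODATA revisions.
# Electron mass in kg (values quoted from memory -- verify them!):
m_e_2018 = 9.1093837015e-31  # CODATA 2018 (assumed)
m_e_2022 = 9.1093837139e-31  # CODATA 2022 (assumed)

rel_change = abs(m_e_2022 - m_e_2018) / m_e_2018
print(f"relative change in m_e: {rel_change:.2e}")  # on the order of 1e-9
```

With these values the relative change comes out a bit above 1e-9, consistent with the size of most benchmark resets in this PR.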

@EZoni (Member) commented Feb 13, 2025

> Note that almost all of the changes in the benchmarks are small as expected, ~1.e-9 or smaller. One exception is the test_3d_beam_beam_collision test with errors of order 10% are seen, presumably because it runs a long time allowing the differences to grow.

I agree. I think that test has relatively large tolerances anyway, if I remember correctly. @aeriforme, what do you think?

@EZoni (Member) commented Feb 13, 2025

> Thanks @EZoni ! It worked and was easy to do. BTW, to download the raw log file, I copied the URL from the location bar and pasted it into the curl command, "curl https://dev.azure.com/ECP-WarpX/... > raw_log", making it easy to download it.

Thanks for pointing this out! I added this hint to our documentation in #5663.

@ax3l added the `component: core` (Core WarpX functionality) label Feb 19, 2025
@ax3l mentioned this pull request Feb 19, 2025