
clear AND reload analysis config on tests’ tearDown #1003

Merged — 2 commits into e-mission:master on Dec 21, 2024

Conversation

JGreenlee
Contributor

There are several tests that call `etc.set_analysis_config` in their `setUp()` to set config options for certain testing scenarios. This creates a config file, and in the `tearDown()` of each test the config file is deleted via `os.remove`.

However, just deleting the file is not enough!

When `emission.analysis.config` is initialized, we read the config and cache it in a variable `config_data`: https://github.com/e-mission/e-mission-server/blob/4034533bfc95dbd7c976146341acd3527f9bc7c9/emission/analysis/config.py#L19

When `get_config()` is called while the pipeline is running, it returns the value of `config_data`, which can be out of sync with the contents of the config file if the config file was added/removed/modified and `reload_config()` was not called. So instead of just calling `os.remove()`, these tests also need to call `reload_config()` in their `tearDown()`.

Added this by way of a new function in `emission.tests.common`.

Also, I made the config file pathnames into constants, for tidiness and to eliminate the risk of typo-related bugs.
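The caching pitfall described above can be sketched as follows. This is a minimal, self-contained illustration — the file path, the default values, and the setUp/tearDown flow are invented for the example, not the actual `emission.analysis.config` module:

```python
import json
import os
import tempfile

# Hypothetical stand-in for the analysis config path; the real server reads
# its config from a file under conf/. Everything here is illustrative.
CONF_PATH = os.path.join(tempfile.gettempdir(), "analysis_conf.json")

def get_config_data():
    """Read the config file if present, else fall back to defaults."""
    if os.path.exists(CONF_PATH):
        with open(CONF_PATH) as f:
            return json.load(f)
    return {"intake.cleaning.filter_accuracy.enable": False}

# Cached once at module init, mirroring the pattern in emission.analysis.config
config_data = get_config_data()

def get_config():
    # Returns the cached value -- NOT the current file contents
    return config_data

def reload_config():
    global config_data
    config_data = get_config_data()

# setUp(): write a test override and reload so the cache picks it up
with open(CONF_PATH, "w") as f:
    json.dump({"intake.cleaning.filter_accuracy.enable": True}, f)
reload_config()
assert get_config()["intake.cleaning.filter_accuracy.enable"] is True

# tearDown(): deleting the file alone leaves the cached value stale...
os.remove(CONF_PATH)
assert get_config()["intake.cleaning.filter_accuracy.enable"] is True  # stale!

# ...so reload_config() must be called as well, restoring the defaults
reload_config()
assert get_config()["intake.cleaning.filter_accuracy.enable"] is False
```

The final two assertions are the whole point of the PR: after `os.remove()` the cache still holds the test override, and only `reload_config()` brings `get_config()` back in sync with the (now absent) file.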
@JGreenlee JGreenlee marked this pull request as ready for review December 19, 2024 15:22
@shankari shankari changed the base branch from master to random-forest-mode-detection December 21, 2024 20:22
@shankari shankari changed the base branch from random-forest-mode-detection to master December 21, 2024 20:22
@shankari shankari merged commit af72d04 into e-mission:master Dec 21, 2024
5 checks passed
@shankari
Contributor

@JGreenlee I have merged this since it is clearly the right thing to do, but I don't understand how the analysis config would affect the pipeline timestamps, which is what you reported in

#993 (comment)

We do not understand why the pipeline start ts comes up as 1440688707.867 during runAllTests.sh.

Is the pipeline not run properly because of the configs? But in that case, the pipeline range should not be set at all (it should be like a new user), right? Can you explain this further?

@JGreenlee
Contributor Author

JGreenlee commented Dec 23, 2024

Is the pipeline not run properly because of the configs? But in that case, the pipeline range should not be set at all (it should be like a new user), right? Can you explain this further?

The pipeline runs properly, but behaves differently depending on `"intake.cleaning.filter_accuracy.enable"` in the analysis config.
When `true`, the first trip on Aug 27 starts at 1440688707.867.
When `false`, that trip starts a bit later, at 1440688739.672.

When the test was run in isolation, `"intake.cleaning.filter_accuracy.enable"` was `false`.
When the test was run during `runAllTests.sh`, it was `true`.
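Purely to illustrate the mechanism (the sample points, the threshold, and the helper below are all invented; the real trip segmentation logic is much more involved), accuracy filtering changes the set of location points the segmenter ever sees, so the same raw data can yield different trip boundaries and start timestamps:

```python
# Invented sample fixes: timestamp plus reported accuracy in metres.
points = [
    {"ts": 1440688707.867, "accuracy": 50.0},
    {"ts": 1440688720.000, "accuracy": 500.0},  # low-accuracy (noisy) fix
    {"ts": 1440688739.672, "accuracy": 30.0},
]

ACCURACY_THRESHOLD = 200.0  # hypothetical cutoff, in metres

def points_for_segmentation(points, filter_enabled):
    """When filtering is enabled, noisy fixes are dropped before segmentation."""
    if not filter_enabled:
        return points
    return [p for p in points if p["accuracy"] <= ACCURACY_THRESHOLD]

# Different input sets -> the segmenter can place trip boundaries differently,
# which is why the flag's value changed the observed pipeline start ts.
assert len(points_for_segmentation(points, True)) == 2
assert len(points_for_segmentation(points, False)) == 3
```

So a test that asserts on exact timestamps is implicitly asserting on the value of this flag, which is exactly why a stale cached config (the bug fixed by this PR) made the test pass in isolation but fail under `runAllTests.sh`.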
