cover tests #273

Open · wants to merge 11 commits into master
Conversation

CJ-Wright (Member)

No description provided.

codecov bot commented Sep 9, 2019

Codecov Report

Merging #273 into master will decrease coverage by 9.28%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master     #273      +/-   ##
==========================================
- Coverage   95.42%   86.14%   -9.29%     
==========================================
  Files          13       22       +9     
  Lines        1661     3781    +2120     
==========================================
+ Hits         1585     3257    +1672     
- Misses         76      524     +448     
Impacted Files Coverage Δ
streamz/dataframe/tests/test_cudf_dataframes.py 2.97% <ø> (ø)
streamz/dataframe/tests/test_dataframe_utils.py 100.00% <ø> (ø)
streamz/utils_test.py 93.75% <0.00%> (-0.44%) ⬇️
streamz/sources.py 96.88% <0.00%> (-0.35%) ⬇️
streamz/dataframe/core.py 91.60% <0.00%> (-0.01%) ⬇️
streamz/dask.py 100.00% <0.00%> (ø)
streamz/tests/test_kafka.py 89.03% <0.00%> (ø)
streamz/dataframe/tests/test_dataframes.py 96.86% <0.00%> (ø)
streamz/tests/test_graph.py 100.00% <0.00%> (ø)
streamz/tests/test_core.py 97.13% <0.00%> (ø)
... and 5 more

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9c8d3bb...8735ad8.

CJ-Wright (Member, Author)

@martindurant I think it would be good to have coverage of the tests. We can mark the things we won't cover with # pragma: no cover.
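A minimal sketch of what that pragma looks like under coverage.py's default exclusion pattern; the function below is hypothetical, not taken from streamz:

```python
# Hypothetical example: coverage.py excludes any line carrying the
# "pragma: no cover" comment, and excludes a whole block when the comment
# sits on the block's defining line.
def _gpu_only_path():  # pragma: no cover
    import cudf  # only importable on machines with a GPU stack

    return cudf.DataFrame({"x": [1, 2, 3]})
```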

martindurant (Member)

Is it normal to consider tests as part of coverage? We would want to know about tests that didn't run, but it doesn't seem to be a useful metric for the package as a whole. Actually, we have two separate numbers in the report, so we probably don't need a limit on both.

I don't really mind, though, but we had better indeed mark some things we don't use. I don't know why the coverage appears to have gone down so significantly.
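Keeping a hard limit on only one of those two numbers is something a codecov.yml can express; the sketch below uses illustrative targets, not this repository's actual settings:

```yaml
# Illustrative codecov.yml: enforce a target on the overall project number
# and leave the patch (diff) number informational so it never fails the check.
coverage:
  status:
    project:
      default:
        target: 90%
        threshold: 1%
    patch:
      default:
        informational: true
```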

CJ-Wright (Member, Author)

I've been trying to exclude the test chunk from the library chunk.
I think the coverage is low because we don't run many of the dataframe tests.

The coverage of the tests helps make certain that we are running the tests, and shows if we miss some chunks of the test suite for one reason or another. (I stole the target from matplotlib.)
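One way to report the test chunk separately from the library chunk is codecov flags; the configuration below is an assumed sketch, not the setup used in this PR:

```yaml
# Illustrative codecov.yml flags: give the test modules their own status
# check so the library number is not diluted by the test files.
flags:
  tests:
    paths:
      - streamz/tests/
      - streamz/dataframe/tests/

coverage:
  status:
    project:
      tests:
        flags:
          - tests
        target: 90%
```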

martindurant (Member)

Whilst I agree:

  1. So why are we not running those DF tests?
  2. Why is the coverage diff negative?

CJ-Wright (Member, Author)

  1. Many of the tests are cuDF tests, which we can't really run on Travis. We can leave them in, but we'll need to list them as no cover.
  2. When the uncovered tests are averaged in with the covered code, it drops the percentage (I'm still working on test exclusion).

martindurant (Member)

OK, so you can blanket ignore cuDF tests. It would be nice if coverage had a mechanism like pytest marks to set this automatically.
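The usual pytest-side guard does something similar at collection time; this sketch uses a hypothetical test, not one from the streamz suite:

```python
# Hypothetical cuDF-only test module: pytest.importorskip skips the whole
# module when cuDF is not importable, e.g. on a CPU-only CI worker.
import pytest

cudf = pytest.importorskip("cudf")


def test_from_pandas_roundtrip():
    import pandas as pd

    pdf = pd.DataFrame({"x": [1, 2, 3]})
    assert cudf.from_pandas(pdf).to_pandas().equals(pdf)
```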

martindurant (Member)

Please fix the merge conflict, if you have the time.

chinmaychandak (Contributor)

I guess we can close this now, since we have # pragma: no cover and also flaky markers for tests? Or are there places where we can improve?
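The flaky markers mentioned here typically look like the following; this assumes the pytest-rerunfailures plugin's @pytest.mark.flaky marker (the flaky package offers a similar decorator), and the test body is purely illustrative:

```python
# Hypothetical timing-sensitive test retried up to three times before
# being reported as a failure (pytest-rerunfailures' @pytest.mark.flaky).
import random

import pytest


@pytest.mark.flaky(reruns=3)
def test_timing_sensitive_source():
    # stand-in for an assertion that occasionally misses a timing window
    assert random.random() > 0.01
```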

CJ-Wright (Member, Author)

Do those cover the purpose of this PR? This PR was supposed to add an additional check to see the coverage of the test code itself. That way we could be certain that the entire test suite was being run (in case we missed anything and pieces of the tests were actually not being run).

chinmaychandak (Contributor)

Oh, I understand now. I misunderstood the purpose of this PR then. Apologies!

CJ-Wright (Member, Author) commented Sep 2, 2020

No problem, I should really merge master into this.

chinmaychandak (Contributor)

We no longer have any cuDF tests in streamz, though, so the coverage is back on track. And branches of execution that are not tested because of cuDF/cuStreamz-specific functionality have been marked with # pragma: no cover. Maybe that should be enough? How can we check if we still need to cover tests?
