New failure on 3.12 while updating hypothesis to 6.111.0 #39

Open

sobolevn opened this issue Aug 12, 2024 · 1 comment
Output:

 =================================== FAILURES ===================================
______________________________ test_no_exceptions ______________________________

default_options = options(min_name_length=2, max_name_length=45, i_control_code=True, i_dont_control_code=True, max_noqa_comments=10, ne...level=3, max_annotation_complexity=3, max_import_from_members=8, max_tuple_unpack_length=4, show_violation_links=False)
parse_ast_tree = <function parse_ast_tree.<locals>.factory at 0x7f74194c0360>

    @given(source_code=hypothesmith.from_grammar().map(_fixup))
>   @settings(print_blob=True)

tests/test_checker/test_hypothesis.py:40: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/test_checker/test_hypothesis.py:59: in test_no_exceptions
    tokens = list(tokenize.generate_tokens(lambda: next(lines)))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

source = <function test_no_exceptions.<locals>.<lambda> at 0x7f74157b5bc0>
encoding = None, extra_tokens = True

    def _generate_tokens_from_c_tokenizer(source, encoding=None, extra_tokens=False):
        """Tokenize a source reading Python code as unicode strings using the internal C tokenizer"""
        if encoding is None:
            it = _tokenize.TokenizerIter(source, extra_tokens=extra_tokens)
        else:
            it = _tokenize.TokenizerIter(source, encoding=encoding, extra_tokens=extra_tokens)
        try:
            for info in it:
                yield TokenInfo._make(info)
        except SyntaxError as e:
            if type(e) != SyntaxError:
                raise e from None
            msg = _transform_msg(e.msg)
>           raise TokenError(msg, (e.lineno, e.offset)) from None
E           tokenize.TokenError: ('unexpected character after line continuation character', (1, 20))
E           Falsifying example: test_no_exceptions(
E               default_options=options(min_name_length=2, max_name_length=45, i_control_code=True, i_dont_control_code=True, max_noqa_comments=10, nested_classes_whitelist=('Meta', 'Params', 'Config'), allowed_domain_names=(), forbidden_domain_names=(), forbidden_inline_ignore=(), exps_for_one_empty_line=2, max_returns=5, max_local_variables=5, max_expressions=9, max_arguments=5, max_module_members=7, max_methods=7, max_line_complexity=14, max_jones_score=12, max_imports=12, max_imported_names=50, max_base_classes=3, max_decorators=5, max_string_usages=3, max_awaits=5, max_try_body_length=1, max_module_expressions=7, max_function_expressions=4, max_asserts=5, max_access_level=4, max_attributes=6, max_raises=3, max_cognitive_score=12, max_cognitive_average=8, max_call_level=3, max_annotation_complexity=3, max_import_from_members=8, max_tuple_unpack_length=4, show_violation_links=False),
E               parse_ast_tree=factory,
E               source_code='pass;globalmatch\\;#\n',
E           )
E           Explanation:
E               These lines were always and only run by failing examples:
E                   /opt/hostedtoolcache/Python/3.12.4/x64/lib/python3.12/dis.py:474
E                   /opt/hostedtoolcache/Python/3.12.4/x64/lib/python3.12/dis.py:643
E                   /opt/hostedtoolcache/Python/3.12.4/x64/lib/python3.12/tokenize.py:578
E           
E           You can reproduce this example by temporarily adding @reproduce_failure('6.111.0', b'AXicY2RkYGZmZAABNgYGMIOREUYAAQAB3QAV') as a decorator on your test case
E           Highest target scores:
E                          4  (label='(hypothesmith) instructions in bytecode')
E                          5  (label='(hypothesmith) number of unique ast node types')
E                          5  (label='(hypothesmith) total number of ast nodes')

/opt/hostedtoolcache/Python/3.12.4/x64/lib/python3.12/tokenize.py:582: TokenError
=============================== warnings summary ===============================
tests/test_visitors/test_tokenize/test_primitives/test_string_tokens/test_unicode_escape.py::test_wrong_unicode_escape[b'\\Ua']
  <filename>:1: SyntaxWarning: invalid escape sequence '\U'

tests/test_visitors/test_tokenize/test_primitives/test_string_tokens/test_unicode_escape.py::test_wrong_unicode_escape[b'\\u1']
tests/test_visitors/test_tokenize/test_primitives/test_string_tokens/test_unicode_escape.py::test_wrong_unicode_escape[b'\\ua']
  <filename>:1: SyntaxWarning: invalid escape sequence '\u'

tests/test_visitors/test_tokenize/test_primitives/test_string_tokens/test_unicode_escape.py::test_wrong_unicode_escape[b'\\N{GREEK SMALL LETTER ALPHA}']
  <filename>:1: SyntaxWarning: invalid escape sequence '\N'

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.12.4-final-0 -----------
Name    Stmts   Miss Branch BrPart  Cover   Missing
---------------------------------------------------
TOTAL    6616      0   3143      0   100%

144 files skipped due to complete coverage.
Coverage HTML written to dir htmlcov
Coverage XML written to file coverage.xml

Required test coverage of 100% reached. Total coverage: 100.00%
--------------------------- snapshot report summary ----------------------------
36 snapshots passed.
================================== Hypothesis ==================================
`git apply .hypothesis/patches/2024-08-12--eb0349ac.patch` to add failing examples to your code.
=========================== short test summary info ============================
FAILED tests/test_checker/test_hypothesis.py::test_no_exceptions - tokenize.TokenError: ('unexpected character after line continuation character', (1, 20))
Falsifying example: test_no_exceptions(
    default_options=options(min_name_length=2, max_name_length=45, i_control_code=True, i_dont_control_code=True, max_noqa_comments=10, nested_classes_whitelist=('Meta', 'Params', 'Config'), allowed_domain_names=(), forbidden_domain_names=(), forbidden_inline_ignore=(), exps_for_one_empty_line=2, max_returns=5, max_local_variables=5, max_expressions=9, max_arguments=5, max_module_members=7, max_methods=7, max_line_complexity=14, max_jones_score=12, max_imports=12, max_imported_names=50, max_base_classes=3, max_decorators=5, max_string_usages=3, max_awaits=5, max_try_body_length=1, max_module_expressions=7, max_function_expressions=4, max_asserts=5, max_access_level=4, max_attributes=6, max_raises=3, max_cognitive_score=12, max_cognitive_average=8, max_call_level=3, max_annotation_complexity=3, max_import_from_members=8, max_tuple_unpack_length=4, show_violation_links=False),
    parse_ast_tree=factory,
    source_code='pass;globalmatch\\;#\n',
)
Explanation:
    These lines were always and only run by failing examples:
        /opt/hostedtoolcache/Python/3.12.4/x64/lib/python3.12/dis.py:474
        /opt/hostedtoolcache/Python/3.12.4/x64/lib/python3.12/dis.py:643
        /opt/hostedtoolcache/Python/3.12.4/x64/lib/python3.12/tokenize.py:578

You can reproduce this example by temporarily adding @reproduce_failure('6.111.0', b'AXicY2RkYGZmZAABNgYGMIOREUYAAQAB3QAV') as a decorator on your test case
Highest target scores:
               4  (label='(hypothesmith) instructions in bytecode')
               5  (label='(hypothesmith) number of unique ast node types')
               5  (label='(hypothesmith) total number of ast nodes')
===== 1 failed, 34776 passed, 13 skipped, 4 warnings in 288.11s (0:04:48) ======

Log: https://github.com/wemake-services/wemake-python-styleguide/actions/runs/10345104431/job/28631643926?pr=3036
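The failing input can be checked directly, outside the test suite. Below is a minimal sketch (the `try_tokenize` helper and the version gate are mine, not from the project); it feeds the exact falsifying string from the report above into `tokenize.generate_tokens` and into `compile`:

```python
import sys
import tokenize
from io import StringIO

# Exact falsifying input from the Hypothesis report above.
SOURCE = 'pass;globalmatch\\;#\n'


def try_tokenize(source):
    """Return the exception tokenize raises for `source`, or None."""
    try:
        list(tokenize.generate_tokens(StringIO(source).readline))
    except (tokenize.TokenError, SyntaxError) as exc:
        return exc
    return None


err = try_tokenize(SOURCE)
print(sys.version_info[:2], type(err).__name__ if err else None, err)

# The same string is also rejected by compile() with a plain SyntaxError
# (backslash line continuation must be followed by a newline), suggesting
# hypothesmith generated source that is not valid Python here.
try:
    compile(SOURCE, '<example>', 'exec')
    compiles = True
except SyntaxError:
    compiles = False
print('compiles:', compiles)
```

On Python 3.12.4 the tokenize call surfaces as `tokenize.TokenError: ('unexpected character after line continuation character', (1, 20))`, matching the log above; on 3.12 the C tokenizer wraps such syntax problems in `TokenError`, whereas earlier interpreters tended to emit `ERRORTOKEN` tokens instead of raising, which is presumably why the failure only appeared on 3.12.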

@sobolevn (Author)

It happened again:

Explanation:
    These lines were always and only run by failing examples:
        /opt/hostedtoolcache/Python/3.12.5/x64/lib/python3.12/tokenize.py:578

Log: https://github.com/wemake-services/wemake-python-styleguide/actions/runs/10552646231/job/29231772630?pr=3040
