
Add support for Python 3.12 (and adjust code for tokenizer changes). #197

Merged: 3 commits, Nov 30, 2023
14 changes: 9 additions & 5 deletions .github/workflows/tox.yml
@@ -11,20 +11,24 @@ jobs:
     strategy:
       max-parallel: 4
       matrix:
-        python-version: ['2.7', '3.6', '3.7', '3.8', '3.9', '3.10', '3.11-dev']
+        python-version: ['2.7', '3.6', '3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
         pandas-presence: ['with_pandas', 'without_pandas']
     env:
       PYTHON_VERSION: ${{ matrix.python-version }}
       PANDAS_PRESENCE: ${{ matrix.pandas-presence }}
     steps:
     - uses: actions/checkout@v2
-    - name: Set up Python ${{ matrix.python-version }}
-      uses: actions/setup-python@v2
+    - uses: gabrielfalcao/pyenv-action@v17
       with:
-        python-version: ${{ matrix.python-version }}
+        default: "${{ matrix.python-version }}"
+    - name: Install Python ${{ matrix.python-version }}
+      run: |
+        pyenv install "${{ matrix.python-version }}"
+        pyenv local "${{ matrix.python-version }}"
+        pyenv versions
     - name: Install dependencies
       run: |
-        python -m pip install --upgrade pip
+        pip install -U pip
         pip install tox tox-gh-actions
     - name: Test with tox
       run: |
8 changes: 4 additions & 4 deletions patsy/parse_formula.py
@@ -84,15 +84,15 @@ def _tokenize_formula(code, operator_strings):
     # "magic" token does:
     end_tokens = set(magic_token_types)
     end_tokens.remove("(")

     it = PushbackAdapter(python_tokenize(code))
     for pytype, token_string, origin in it:
         if token_string in magic_token_types:
             yield Token(magic_token_types[token_string], origin)
         else:
             it.push_back((pytype, token_string, origin))
             yield _read_python_expr(it, end_tokens)

 def test__tokenize_formula():
     code = "y ~ a + (foo(b,c + 2)) + -1 + 0 + 10"
     tokens = list(_tokenize_formula(code, ["+", "-", "~"]))
@@ -274,8 +274,8 @@ def _parsing_error_test(parse_fn, error_descs): # pragma: no cover
         except PatsyError as e:
             print(e)
             assert e.origin.code == bad_code
-            assert e.origin.start == start
-            assert e.origin.end == end
+            assert e.origin.start in (0, start)
+            assert e.origin.end in (end, len(bad_code))
         else:
             assert False, "parser failed to report an error!"

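The relaxed assertions above accept two error spans: the precise token span that pre-3.12 tokenizers let patsy compute, and the whole-input span `(0, len(bad_code))` that the Python 3.12 error path produces. A minimal sketch of the accepted combinations, using a hypothetical helper that is not patsy code:

```python
# Hypothetical illustration (not from the PR) of what the relaxed test
# assertions accept: either a precise error origin, or a whole-input origin.
def origin_is_acceptable(origin_start, origin_end, start, end, bad_code):
    """Mirror the relaxed test conditions for an error origin span."""
    return origin_start in (0, start) and origin_end in (end, len(bad_code))

bad_code = 'y ~ "open'  # unclosed string starting at offset 4, len 9

# Precise span pointing at the unclosed string token (Python <= 3.11 path):
assert origin_is_acceptable(4, 9, 4, 9, bad_code)
# Whole-input span produced by the Python 3.12 fallback:
assert origin_is_acceptable(0, len(bad_code), 4, 9, bad_code)
```

Both calls pass, which is exactly the looseness the test now needs to run on old and new interpreters alike.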
13 changes: 9 additions & 4 deletions patsy/tokens.py
@@ -31,11 +31,10 @@ def python_tokenize(code):
     for (pytype, string, (_, start), (_, end), code) in it:
         if pytype == tokenize.ENDMARKER:
             break
-        origin = Origin(code, start, end)
-        assert pytype != tokenize.NL
-        if pytype == tokenize.NEWLINE:
+        if pytype in (tokenize.NL, tokenize.NEWLINE):
             assert string == ""
             continue
+        origin = Origin(code, start, end)
         if pytype == tokenize.ERRORTOKEN:
             raise PatsyError("error tokenizing input "
                              "(maybe an unclosed string?)",
@@ -53,8 +52,14 @@ def python_tokenize(code):
         # end of the source text. We have our own error handling for
         # such cases, so just treat this as an end-of-stream.
         #
+        if "unterminated string literal" in e.args[0]:
+            raise PatsyError(
+                "error tokenizing input ({})".format(e.args[0]),
+                Origin(code, 0, len(code)),
+            )
+
         # Just in case someone adds some other error case:
-        assert e.args[0].startswith("EOF in multi-line")
+        assert "EOF in multi-line" in e.args[0]
         return

def test_python_tokenize():
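The `tokens.py` changes above track two CPython 3.12 tokenizer changes: `NL` tokens can now reach this loop (so they are skipped alongside `NEWLINE`), and an unclosed string literal raises `tokenize.TokenError` instead of yielding an `ERRORTOKEN`. A standalone sketch (not part of the PR) showing the version difference the patch accommodates:

```python
# Sketch of the CPython tokenizer behavior difference the patch handles.
# On Python <= 3.11 an unclosed string usually surfaces as an ERRORTOKEN;
# on Python >= 3.12 the rewritten tokenizer raises tokenize.TokenError
# with an "unterminated string literal" message instead.
import tokenize
from io import StringIO

def classify_unclosed_string(code):
    """Report how this interpreter's tokenizer flags an unclosed string."""
    try:
        for tok in tokenize.generate_tokens(StringIO(code).readline):
            if tok.type == tokenize.ERRORTOKEN:
                return "ERRORTOKEN"  # typical on Python <= 3.11
    except tokenize.TokenError as e:
        return "TokenError: {}".format(e.args[0])  # typical on Python >= 3.12
    return "tokenized cleanly"

print(classify_unclosed_string('a + "unclosed'))
```

Because the exception replaces the in-stream error token, patsy's old `ERRORTOKEN` branch never fires on 3.12, which is why the PR adds the `TokenError`-side check for "unterminated string literal" and re-raises it as a `PatsyError` spanning the whole input.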
1 change: 1 addition & 0 deletions setup.py
@@ -46,6 +46,7 @@
         "Programming Language :: Python :: 3.9",
         "Programming Language :: Python :: 3.10",
         "Programming Language :: Python :: 3.11",
+        "Programming Language :: Python :: 3.12",
         "Topic :: Scientific/Engineering",
     ],
 )
5 changes: 4 additions & 1 deletion tox.ini
@@ -1,5 +1,5 @@
 [tox]
-envlist = {py27,py36,py37,py38,py39,py310,py311}-{with_pandas,without_pandas}
+envlist = {py27,py36,py37,py38,py39,py310,py311,py312}-{with_pandas,without_pandas}

[gh-actions]
python =
@@ -10,6 +10,7 @@ python =
     3.9: py39
     3.10: py310
     3.11: py311
+    3.12: py312

[testenv]
deps=
@@ -25,6 +26,8 @@ setenv=
     OMP_NUM_THREADS=1
     MKL_NUM_THREADS=1
     VML_NUM_THREADS=1
+allowlist_externals=
+    env
 commands=
     pytest -vv --cov=patsy --cov-config={toxinidir}/.coveragerc --cov-report=term-missing --cov-report=xml --cov-report=html:{toxworkdir}/coverage/{envname} {posargs:}
     env PATSY_AVOID_OPTIONAL_DEPENDENCIES=1 pytest -vv --cov=patsy --cov-config={toxinidir}/.coveragerc --cov-report=term-missing --cov-report=xml --cov-report=html:{toxworkdir}/coverage/{envname} {posargs:}