Merge pull request #314 from t20100/update-doc
Thanks @payno for the reviews!
t20100 authored Jul 24, 2024
2 parents 193a0ac + 9944c6a commit ad7622d
Showing 7 changed files with 36 additions and 47 deletions.
52 changes: 26 additions & 26 deletions .github/workflows/release.yml
@@ -8,46 +8,46 @@ on:

jobs:
build_sdist:
name: Build source distribution
name: Build and test source distribution
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: '3.11'
python-version: '3.12'
cache: 'pip'
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install build twine
- name: Build sdist
run: python -m build --sdist
- name: Check the package
run: |
python -m twine check dist/*
- run: python -m pip install --upgrade pip build twine
- run: python -m build --sdist
- run: python -m twine check dist/*
- run: pip install --pre "$(ls dist/hdf5plugin*.tar.gz)[test]"
- run: python test/test.py
- uses: actions/upload-artifact@v4
with:
name: cibw-sdist
path: dist/*.tar.gz

test_sdist:
needs: [build_sdist]
name: Test source distribution
build_doc:
name: Build documentation
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: '3.11'
cache: 'pip'
- uses: actions/download-artifact@v4
python-version: "3.12"
cache: "pip"
- run: sudo apt-get install pandoc
- run: pip install .[doc]
env:
HDF5PLUGIN_STRIP: all # Do not build the filters
- name: Build doc
run: |
export OUTPUT_NAME="hdf5plugin-$(python -c 'import hdf5plugin; print(hdf5plugin.version)')_documentation"
sphinx-build doc/ "${OUTPUT_NAME}/"
zip -r "${OUTPUT_NAME}.zip" "${OUTPUT_NAME}/"
- uses: actions/upload-artifact@v4
with:
name: cibw-sdist
path: dist
- name: Install sdist
run: pip install --pre "$(ls dist/hdf5plugin*.tar.gz)[test]"
- name: Run tests
run: python test/test.py
name: documentation
path: hdf5plugin-*_documentation.zip

build_wheels:
name: Build wheels on ${{ matrix.os }}-${{ matrix.cibw_archs }}
@@ -94,7 +94,7 @@ jobs:
HDF5PLUGIN_CPP20: "True"
MACOSX_DEPLOYMENT_TARGET: "10.13"

CIBW_ENVIRONMENT_PASS_LINUX: HDF5PLUGIN_OPENMP HDF5PLUGIN_NATIVE HDF5PLUGIN_SSE2 HDF5PLUGIN_SSSE3 HDF5PLUGIN_AVX2 HDF5PLUGIN_AVX512 HDF5PLUGIN_BMI2 HDF5PLUGIN_CPP11 HDF5PLUGIN_CPP14
CIBW_ENVIRONMENT_PASS_LINUX: HDF5PLUGIN_OPENMP HDF5PLUGIN_NATIVE HDF5PLUGIN_SSE2 HDF5PLUGIN_SSSE3 HDF5PLUGIN_AVX2 HDF5PLUGIN_AVX512 HDF5PLUGIN_BMI2 HDF5PLUGIN_CPP11 HDF5PLUGIN_CPP14 HDF5PLUGIN_CPP20

CIBW_BUILD_VERBOSITY: 1
# Use Python3.11 to build wheels that are compatible with all supported version of Python
@@ -143,7 +143,7 @@ jobs:
# First select the right wheel from dist/ with pip download, then install it
run: |
pip download --no-index --no-cache --no-deps --find-links=./dist --only-binary :all: hdf5plugin
pip install "$(ls ./hdf5plugin-*.whl)[test]" --only-binary blosc2 || pip install "$(ls ./hdf5plugin-*.whl)"
pip install "$(ls ./hdf5plugin-*.whl)[test]"
- name: Run test with latest h5py
run: python test/test.py
- name: Run test with oldest h5py
@@ -152,7 +152,7 @@
python test/test.py
pypi-publish:
needs: [build_wheels, build_sdist, test_wheels, test_sdist]
needs: [build_wheels, build_sdist, build_doc, test_wheels]
name: Upload release to PyPI
runs-on: ubuntu-latest
environment:
4 changes: 2 additions & 2 deletions doc/contribute.rst
@@ -2,7 +2,7 @@
Contribute
============

This project follows the standard open-source project github workflow, which is described in other projects like `matplotlib <https://matplotlib.org/devel/contributing.html#contributing-code>`_ or `scikit-image <https://scikit-image.org/docs/dev/contribute.html>`_.
This project follows the standard open-source project github workflow, which is described in other projects like `scikit-image <https://scikit-image.org/docs/stable/development/contribute.html>`_.

Testing
=======
@@ -53,7 +53,7 @@ This briefly describes the steps to add a HDF5 compression filter to the zoo.
* In case of import errors related to HDF5-related undefined symbols, add eventual missing functions under ``src/hdf5_dl.c``.

* Add a "CONSTANT" in ``src/hdf5plugin/_filters.py`` named with the ``FILTER_NAME_ID`` which value is the HDF5 filter ID
(See `HDF5 registered filters <https://portal.hdfgroup.org/documentation/hdf5-docs/registered_filter_plugins.html>`_).
(See `HDF5 registered filters <https://github.com/HDFGroup/hdf5_plugins/blob/master/docs/RegisteredFilterPlugins.md#list-of-filters-registered-with-the-hdf-group>`_).

* Add a compression options helper class named ``FilterName`` in ``hdf5plugins/_filters.py`` which should inherit from ``_FilterRefClass``.
This is intended to ease the usage of ``h5py.Group.create_dataset`` ``compression_opts`` argument.
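To make the helper-class step from the contribute.rst excerpt above more concrete, here is a minimal sketch. The filter name, ID value, and ``level`` option are hypothetical placeholders; the class attributes follow the pattern visible in the ``src/hdf5plugin/_filters.py`` diff further below, and the public ``h5py`` base class is used so the sketch stands alone (inside hdf5plugin the helper would derive from the module's own ``_FilterRefClass``).

```python
# Hypothetical sketch of the contribute.rst steps above, for a made-up "newfilter".
# NEW_FILTER_ID, NewFilter and its `level` option are illustrative, not part of hdf5plugin.
import h5py

NEW_FILTER_ID = 32999  # HDF5 filter ID registered with The HDF Group (placeholder value)


class NewFilter(h5py.filters.FilterRefBase):
    """Helper building the ``compression`` argument of ``h5py.Group.create_dataset``.

    :param int level: Compression level (illustrative option).
    """
    filter_name = "newfilter"
    filter_id = NEW_FILTER_ID

    def __init__(self, level=3):
        # Options are forwarded to the HDF5 filter as a tuple of unsigned integers
        self.filter_options = (int(level),)
```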
7 changes: 1 addition & 6 deletions doc/index.rst
@@ -34,9 +34,4 @@ Alternatives to install HDF5 compression filters are: system-wide installation o
contribute.rst
changelog.rst

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
:ref:`genindex`
8 changes: 1 addition & 7 deletions doc/information.rst
@@ -18,14 +18,8 @@ Project resources

- `Source repository <https://github.com/silx-kit/hdf5plugin>`_
- `Issue tracker <https://github.com/silx-kit/hdf5plugin/issues>`_
- Continuous integration: *hdf5plugin* is continuously tested on all three major
operating systems:

- Linux, MacOS, Windows: `GitHub Actions <https://github.com/silx-kit/hdf5plugin/actions>`_
- Windows: `AppVeyor <https://ci.appveyor.com/project/ESRF/hdf5plugin>`_
- `Weekly builds <https://silx.gitlab-pages.esrf.fr/bob/hdf5plugin/>`_

`hdf5plugin` can be cited with its `Zenodo DOI <https://doi.org/10.5281/zenodo.7257761>`_.
`hdf5plugin` can be cited with its DOI: `10.5281/zenodo.7257761 <https://doi.org/10.5281/zenodo.7257761>`_.

Presentations
-------------
2 changes: 1 addition & 1 deletion doc/usage.rst
@@ -30,7 +30,7 @@ As for reading compressed datasets, ``import hdf5plugin`` is required to enable

To create a compressed dataset use `h5py.Group.create_dataset`_ and set the ``compression`` and ``compression_opts`` arguments.

``hdf5plugin`` provides helpers to prepare those compression options: `Bitshuffle`_, `Blosc`_, `BZip2`_, `FciDecomp`_, `LZ4`_, `SZ`_, `SZ3`_, `Zfp`_, `Zstd`_.
``hdf5plugin`` provides helpers to prepare those compression options: `Bitshuffle`_, `Blosc`_, `Blosc2`_, `BZip2`_, `FciDecomp`_, `LZ4`_, `Sperr`_, `SZ`_, `SZ3`_, `Zfp`_, `Zstd`_.

Sample code:

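To illustrate the usage.rst paragraph above, here is a minimal write/read sketch using one of the listed helpers. Blosc2 with default options is assumed; each helper exposes its own constructor parameters, which are documented in the hdf5plugin API reference.

```python
# Minimal sketch: create and read back a compressed dataset using an hdf5plugin helper.
import h5py
import numpy
import hdf5plugin  # importing hdf5plugin registers the filters with HDF5

with h5py.File("example.h5", "w") as h5file:
    h5file.create_dataset(
        "data",
        data=numpy.random.random((100, 100)),
        compression=hdf5plugin.Blosc2(),  # default options; tune via the helper's parameters
    )

with h5py.File("example.h5", "r") as h5file:
    data = h5file["data"][()]  # reading only requires `import hdf5plugin` beforehand
```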
4 changes: 2 additions & 2 deletions src/hdf5plugin/_filters.py
@@ -596,7 +596,7 @@ class SZ(h5py.filters.FilterRefBase):
data=numpy.random.random(100),
compression=hdf5plugin.SZ(pointwise_relative=0.01))
For more details about the compressor `SZ <https://szcompressor.org/>`_.
For more details about the compressor, see `SZ compressor <https://github.com/szcompressor/SZ>`_.
"""
filter_name = "sz"
filter_id = SZ_ID
@@ -649,7 +649,7 @@ class SZ3(h5py.filters.FilterRefBase):
data=numpy.random.random(100),
compression=hdf5plugin.SZ3(absolute=0.1))
For more details about the compressor, see `SZ3 <https://szcompressor.org/>`_.
For more details about the compressor, see `SZ3 compressor <https://github.com/szcompressor/SZ3>`_.
.. warning::
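Since SZ and SZ3 are lossy compressors, data read back only matches the original to within the requested error bound. A small round-trip sketch, reusing the ``pointwise_relative`` option shown in the SZ docstring above:

```python
# Sketch: SZ is lossy, so values round-trip only within the requested error bound.
import h5py
import numpy
import hdf5plugin

data = numpy.random.random(100)

with h5py.File("sz_example.h5", "w") as h5file:
    h5file.create_dataset(
        "data",
        data=data,
        compression=hdf5plugin.SZ(pointwise_relative=0.01),  # 1% point-wise relative error bound
    )

with h5py.File("sz_example.h5", "r") as h5file:
    restored = h5file["data"][()]

# Bit-exact equality is NOT guaranteed, but each value stays within ~1% of the original.
assert numpy.allclose(restored, data, rtol=0.01)
```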
6 changes: 3 additions & 3 deletions src/hdf5plugin/_version.py
@@ -38,8 +38,8 @@ class _VersionInfo(NamedTuple):
releaselevel: str = "final"
serial: int = 0

@staticmethod
def from_string(version: str) -> "_VersionInfo":
@classmethod
def from_string(cls, version: str) -> "_VersionInfo":
pattern = r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<micro>\d+)((?P<prerelease>a|b|rc)(?P<serial>\d+))?"
match = re.fullmatch(pattern, version, re.ASCII)
fields = {k: v for k, v in match.groupdict().items() if v is not None}
@@ -50,7 +50,7 @@ def from_string(version: str) -> "_VersionInfo":
]
version_fields = {k: int(v) for k, v in fields.items()}

return _VersionInfo(releaselevel=releaselevel, **version_fields)
return cls(releaselevel=releaselevel, **version_fields)


version_info = _VersionInfo.from_string(version)
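The staticmethod-to-classmethod change above lets the parser build instances through ``cls`` instead of a hard-coded class name. A simplified, self-contained illustration of why that matters (not hdf5plugin's actual code):

```python
# Simplified sketch: with a classmethod, `cls(...)` constructs the caller's class,
# so a subclass reusing the parser gets instances of its own type.
from typing import NamedTuple


class Version(NamedTuple):
    major: int
    minor: int

    @classmethod
    def from_string(cls, version: str) -> "Version":
        major, minor = version.split(".")[:2]
        return cls(int(major), int(minor))  # not hard-coded to Version


class NightlyVersion(Version):  # hypothetical subclass
    pass


assert isinstance(NightlyVersion.from_string("1.13"), NightlyVersion)
```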
