Instructions on how to contribute to and release osidb-bindings
The following dependencies are required for development and deployment:
- make
- python3
- python3 dependencies (see pip-tools)
- tox
OSIDB bindings has adopted pip-tools as its tool of choice for python dependency management. In this section we'll go over the basics, the similarities and differences between pip-tools and pip, and how to use it effectively.
With pip, adding a dependency is as simple as adding it to requirements.txt, optionally choosing which version(s) to use.
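For example (package name and version specifier are illustrative):
# requirements.txt
requests>=2.28,<3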
With pip-tools, the dependency (versioned or not) is added to either requirements.in, devel-requirements.in or local-requirements.in, and then the pip-compile command must be executed in order to generate the corresponding *-requirements.txt file.
$ pip-compile --generate-hashes --allow-unsafe # this will compile requirements.in -> requirements.txt
$ pip-compile --generate-hashes --allow-unsafe devel-requirements.in # be explicit for alternate requirements files
Instead of typing these commands manually you can simply do
$ make compile-deps
and all the necessary requirements.txt files will be compiled correctly.
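As an illustration (package names and versions are made up for the example), a requirements.in containing a single unpinned dependency:
# requirements.in
requests
might compile to a requirements.txt excerpt along these lines (hashes omitted for brevity):
# requirements.txt (excerpt)
certifi==2024.2.2
charset-normalizer==3.3.2
idna==3.6
requests==2.31.0
urllib3==2.2.1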
So far the differences between the two are minimal: both use a requirements.txt file to express dependencies. However, the dependency tree generated by pip-compile is more thorough: it includes all implicit dependencies of the ones explicitly defined in the *.in files and pins them to a very specific version. Not only does this make it easier to reproduce prod/dev environments, but it can also be helpful for later security vulnerability scanning.
Note that if any dependencies are added to the *.in files and pip-compile is then run, the versions of the existing pinned dependencies will not change; only the new dependencies will be added to the requirements.txt files.
Updating dependencies with pip and pip-tools is largely the same; the command for doing so with pip-tools is the following:
$ pip-compile --generate-hashes --allow-unsafe --upgrade-package django --upgrade-package requests==2.0.0
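To upgrade every pinned package at once rather than specific ones, pip-compile also accepts a bare --upgrade flag:
$ pip-compile --generate-hashes --allow-unsafe --upgrade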
To install the dependencies with pip, you simply pass the requirements file(s) to the -r option and all the requirements in the file will be installed, even if the file was generated by pip-compile!
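For example:
$ pip install -r requirements.txt -r devel-requirements.txt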
With pip-tools, the command for installing dependencies is pip-sync requirements.txt (or any other file generated by pip-compile). However, pip-sync will not only install the requirements, it will also uninstall any packages or versions that do not match the ones defined in the requirements file. If installing multiple requirements files, they can simply be passed as additional positional arguments to pip-sync:
$ pip-sync requirements.txt devel-requirements.txt local-requirements.txt
Instead of running this command manually, you can also use
$ make sync-deps
⚠️ Make sure to run pip-sync within a virtual environment, otherwise you risk having system-wide packages that are not in the requirements.txt be uninstalled.
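For example, a safe workflow using python's built-in venv module might look like:
$ python3 -m venv venv
$ source venv/bin/activate
$ make sync-deps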
As for what each requirements file holds, here's a quick explanation for each:
- requirements.txt: dependencies necessary for running OSIDB bindings
- devel-requirements.txt: dependencies necessary to develop OSIDB bindings
- local-requirements.txt: dependencies specific to your workflow (e.g. ipython or any other custom shell/debugger)
local-requirements.txt is a special case: it is listed in the .gitignore because it's specific to every developer. Without it, every time pip-sync is run, any packages specific to your workflow would be uninstalled and would have to be manually reinstalled.
When synchronizing multiple requirements files, it is important that every subsequent requirements file "includes" the previous one, e.g.:
# requirements.in
requests
# devel-requirements.in
-c requirements.txt
openapi-python-client
# local-requirements.in
-c devel-requirements.txt
ipython
This is so pip-sync can properly synchronize the versions of dependencies that appear in all the requirements files.
For more information on pip-tools and its usage, check the official documentation.
OSIDB bindings are built as a wrapper around the autogenerated python client.
We use the openapi-python-client python package, which is capable of generating a python client from an OpenAPI schema. We generate the python client (the osidb_bindings/bindings folder) from the schema (osidb_bindings/openapi_schema.yml) and then use the generated client in the bindings wrapper to hide all unnecessary technical details from the user.
Since the generated client is not perfect as-is, we use a handy template mechanism offered by the package, based on Jinja templating. This way we are able to define our own Jinja templates in osidb_bindings/templates which replace the original templates from the openapi-python-client repository. There is a drawback to using these templates: the templating API is in beta and not yet stable, so the version of openapi-python-client is precisely locked via pip-tools and should not be updated unless the templates are adjusted to the new version.
Currently the osidb-bindings package uses openapi-python-client version 0.22.0.
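Under the hood, the Makefile targets below wrap an openapi-python-client invocation roughly like the following (a sketch only; the authoritative flags live in the Makefile):
$ openapi-python-client generate --path osidb_bindings/openapi_schema.yml --custom-template-path=osidb_bindings/templates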
- To generate a fresh new bindings python client you can use:
$ make create
Note: this won't work when the osidb_bindings/bindings folder exists; the folder needs to be deleted first.
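For example, to regenerate from scratch (destructive - this throws away any local changes under osidb_bindings/bindings):
$ rm -rf osidb_bindings/bindings
$ make create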
- To update the current bindings python client from an updated schema file use:
$ make update
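For example, assuming you have a refreshed schema saved locally (the source path is illustrative):
$ cp /path/to/new/openapi_schema.yml osidb_bindings/openapi_schema.yml
$ make update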
To contribute a change, the workflow is as follows:
- Create a feature branch
- Check the commit is clean by running
$ tox -e ruff
- Run the tests locally
$ tox -e unit-tests
- Update CHANGELOG.md in case of breaking changes which affect the user
- Push to the branch
- Confirm the branch passes checks
- Raise a PR against master, ensuring a good title/description, and bullet point all significant commits
There are two main cases when this package needs to be released.
OSIDB bindings, as well as OSIDB itself, use semantic versioning (e.g. MAJOR.MINOR.PATCH, 1.2.3); however, there are some extra rules for the bindings versioning to keep the bindings and OSIDB in sync.
- MAJOR and MINOR versions - these are only incremented based on the OSIDB version and shouldn't be incremented when changes or fixes happen only on the bindings side. Whenever a MAJOR or MINOR version of OSIDB gets released, the bindings should be regenerated, accommodated for that version and then released with the same version number. E.g. OSIDB is on version 1.2.0 and the bindings are on version 1.2.0 as well; OSIDB 1.3.0 gets released, the necessary changes are made on the bindings side and bindings 1.3.0 get released as well.
- PATCH version - this version part is separate from OSIDB and is used mainly for introducing fixes and changes on the bindings side only. Whenever there is a fix or change in the code that needs to be released but has nothing to do with OSIDB itself, the PATCH version gets incremented. E.g. the bindings are on version 1.2.0, a bug is found in the bindings code, the bug gets fixed and bindings 1.2.1 are released.
Based on these rules, OSIDB and OSIDB bindings are always compatible as long as the MAJOR and MINOR versions match.
- OSIDB 1.2.0, bindings 1.2.0 - compatible
- OSIDB 1.2.0, bindings 1.2.1 - compatible
- OSIDB 1.2.2, bindings 1.2.1 - compatible
- OSIDB 1.3.0, bindings 1.2.1 - feature incomplete
- OSIDB 2.0.0, bindings 1.9.9 - incompatible
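To check which bindings version is installed locally (for comparing its MAJOR.MINOR against your OSIDB instance), you can for example run:
$ pip show osidb-bindings | grep Version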
When minor changes need to be made on the osidb-bindings side without an actual release of OSIDB itself, a patch release is preferred.
- Make your changes (see how to feature commit)
- Clone/update the master branch
$ git checkout master
$ git pull
- Start the release script and follow the instructions
$ make patch-release
This will:
- create a new branch
- increment the patch part of the version in all necessary places (e.g. x.x.1 -> x.x.2)
- commit and push the changes
- open pull request creation in browser
- Confirm the PR creation opened by the release script
- Confirm the checks pass
- Merge the PR
- Create a new release and tag via the GitHub WebUI - this will also trigger the build and upload to PyPI
- The tag and release need to be in the format x.x.x to comply with semantic versioning
- The tag needs to point to the latest commit
- The release description should include the newest section of the CHANGELOG.md
When a new major/minor OSIDB version is released, a major/minor release of the osidb-bindings needs to be performed.
- Clone/update the master branch
$ git checkout master
$ git pull
- Start the release script and follow the instructions
$ make release
This will:
- compare the latest versions of OSIDB and the bindings (a safety check before release)
- download OpenAPI schema of latest OSIDB version
- create a new branch
- regenerate bindings
- replace the version in all places with the new OSIDB bindings version based on the latest OSIDB version
- commit and push the changes
- open pull request creation in browser
- Confirm the PR creation opened by the release script
- Confirm the checks pass
- Merge the PR
- Create a new release and tag via the GitHub WebUI - this will also trigger the build and upload to PyPI
- The tag and release need to be in the format x.x.x to comply with semantic versioning
- The tag needs to point to the latest commit
- The release description should include the newest section of the CHANGELOG.md
When a new major/minor OSIDB version is being released, we can do a pre-release of the osidb-bindings so that they are ready once OSIDB itself is released.
- Clone/update the master branch
$ git checkout master
$ git pull
- Start the release script and follow the instructions
$ make pre-release
This will:
- compare the latest OSIDB release branch version (release-x.x.x) and the bindings version (a safety check before release)
- download OpenAPI schema of latest OSIDB release branch
- create a new branch
- regenerate bindings
- replace the version in all places with the new OSIDB bindings version based on the latest OSIDB version
- commit and push the changes
- open pull request creation in browser
- Confirm the PR creation opened by the release script
- Confirm the checks pass
- Merge the PR
- Create a new release and tag via the GitHub WebUI - this will also trigger the build and upload to PyPI
- The tag and release need to be in the format x.x.x to comply with semantic versioning
- The tag needs to point to the latest commit
- The release description should include the newest section of the CHANGELOG.md
There is an RPM repository set up in Fedora COPR. RPMs are built automatically based on git tag events: when a git tag is added to the repository, Fedora COPR runs the .copr/Makefile, which builds an RPM for the target distributions (currently Fedora 39 and 40).