dynamic crossenv + python3*-wheels + python310-313 updates #6282
Conversation
@hgy59 I'll let this run, but this patch in particular, 9c764a7, needs to be in its own PR... interesting that I had not noticed it earlier. I'll also mark this as [WIP] as it is not even close to being merged; this was early work being pushed to initiate an exchange with the Python GitHub project relative to the build issue I had.
@hgy59
@th0ma7 I will remove this in #6269. It has been obsolete since the spksrc image moved to Debian 12.
@hgy59, @mreid-tt and other @SynoCommunity/developers, I'd appreciate having some thoughts on this... I always kept within Doing so, I thought, why not build them all and include other remnants dispersed into other sub
Then I recalled someone mentioning a while back: why not have our own wheel repository?
Thoughts on this would be much appreciated. Also on my TODO (for which baby steps may be best):
@th0ma7, I’m not very experienced with Python builds, so I had to familiarize myself with the basics (e.g., using Real Python's guide on wheels). Here are some initial thoughts:
I might be missing some key points due to my limited background, so I'd appreciate more details. For instance, you mentioned integrating other "remnants" scattered across sub SPKs. What exactly are these remnants, and what benefits would centralizing them bring? Are they also related to test code? Regarding the internal wheel repository, is it a matter of PyPI's offerings being insufficient? Are there commonly missing platform-specific wheels for Synology hardware? Would it be feasible to advocate for the Python Package Index to include the wheels we need, perhaps by reaching out to the relevant projects and requesting support? I might not fully grasp the situation, but I'm keen to understand the challenges better and contribute more meaningful suggestions.
Indeed. The package could even be renamed similarly.
Not really. That would mostly be just a web page listing our wheels, that you can query through
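A minimal sketch of what such a page could look like, following PyPI's PEP 503 "simple repository" layout (the project and wheel filenames here are illustrative assumptions, not actual SynoCommunity artifacts):

```shell
# Build a tiny PEP 503 "simple" index on disk; pip can consume it via
# --index-url file://$PWD/simple/ or over HTTPS once published.
mkdir -p simple/numpy
cat > simple/index.html <<'EOF'
<a href="numpy/">numpy</a>
EOF
cat > simple/numpy/index.html <<'EOF'
<a href="numpy-1.26.4-cp311-cp311-linux_armv7l.whl">numpy-1.26.4-cp311-cp311-linux_armv7l.whl</a>
EOF
# A client would then resolve our wheels with, for example:
#   pip install --extra-index-url https://wheels.example.org/simple/ numpy
```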
The issue is that PyPI provides tons of pre-compiled wheels, but ppc, armv5 (and often armv7 and even aarch64) are always missing. Therefore, on our NAS at installation time When we build our spk package using our spksrc framework, we pull source packages from PyPI using
The challenge is that the Python wheel build system is currently undergoing a lot of changes, which impacts the overall wheel-building approaches. As PyPI modules' documentation and maintenance vary from one to the next, this ends up breaking things all over the place. So the question is: would it be easier to have a single location to manage all wheel cross-compiling, providing numerous versions online to ease package management? I've been thinking about this a little more and am still unsure, further as this would probably mean statically linking wheels so they are compatible with any installation OR using rpath so dependencies are available from that
It makes sense to have our own repository, although it will be quite a lot of work to implement and will significantly add to the maintenance burden.
Force-pushed from bb787f5 to a7a0997 (compare)
@hgy59 I believe I may now have feature-wise functional code ready for testing... I'll be away next week (SC24), so cycles until my return may be limited. Although I would much like your opinion on this invasive but, in theory, fully backward-compatible code change. TL;DR:
Nice addition (from my perspective): the ability to re-generate crossenv on demand (see description above for the howto). Besides the non-blocker TODO items, in theory it should be ready for testing, shaking out bugs, and/or adjusting the proposed strategy. I'm mostly thinking of weird corner cases, such as the ones found in homeassistant, that I hope can be addressed with this (to be tested + wheel-specific crossenv configurations to be created). Let me know if you have a moment to test-bed this; your pair of 👀 would be much appreciated. Lastly: I did not forget your requirement to join ffmpeg + wheel cross-compiling. With the previous addition of EDIT: Looking at the github-action output it seems there are still a few rough edges to look for... on my todo list.
Initial analysis: when building python311 for the first time, the
when you call make again, the but after calling the the variables are set (and the build finally succeeds):
Some more details on the analysis above: all lines from the build log starting with
This shows that
the first has correct variables, but the second is missing the Python version. This might trigger you...
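The symptom described above (variables empty on the first make invocation, populated on the next) is typical of an immediate `:=` assignment that reads a file which only gets created later in the same build. A minimal sketch, with hypothetical variable and file names, not the actual spksrc ones:

```shell
# PY_VERSION is resolved at makefile parse time; version.txt only exists
# after the first run, so the first invocation prints an empty value.
cat > Makefile.demo <<'EOF'
.RECIPEPREFIX := >
PY_VERSION := $(shell cat version.txt 2>/dev/null)
all:
>@echo "PY_VERSION=[$(PY_VERSION)]"
>@echo 3.11 > version.txt
EOF
make -f Makefile.demo   # first run:  PY_VERSION=[]
make -f Makefile.demo   # second run: PY_VERSION=[3.11]
```

Guarding such reads with `$(wildcard ...)` (as done later in this PR) or deferring the expansion avoids acting on the empty first-pass value.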
@th0ma7 another idea I want to share: when wheels couldn't be built without additional wheels in crossenv (like expandvars to build frozenlist), I was looking in the
That would be elegant indeed, and this PR could set the stage as a start to move towards gaining more flexibility with wheel building.
@hgy59 now fixed and ready for testing. I also migrated the code to make use of status-cookie handling like other pieces of the framework, which simplifies things a lot. If by any chance you have a moment to look at py313 cross-compiling... they moved away from
Not yet testing... it fails to install For DSM 6.2.4 it fails to install We must either
List of additional modules installed into crossenv (those are not listed in
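For reference, inspecting what actually landed inside such an environment is plain pip; a sketch against a throwaway venv (a crossenv exposes the same `pip list` on its build side):

```shell
# Create a scratch venv and dump its installed distributions in
# requirements-style format, the way one would audit a crossenv.
python3 -m venv demo-env
demo-env/bin/pip list --format=freeze
```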
@hgy59, a question of managing expectations... This PR really still is a I was hoping that at this stage it could be tested to confirm whether this suffices to resolve the immediate build failures we have and allow getting reproducibility back for some of our packages (
Interestingly, it's only Python 3.13 that fails, and I'm glad it only fails on this, as I had other issues previously which now look solved, at least through github-action.
The
Currently testing that to see if this helps... Also, on my local branch, cross/python313 currently fails with the following, but builds fine using
I'll have another look at it upon my return (feel free to push fixes if you happen to have cycles).
I tried this... and it's really tricky, as the ordering of installs is important. On the other hand, that may ease passing But you're right, while downgrading
Yup, due to the dependency chain, as we need to provide all dependencies for cross-compiling to actually work. I find it handy to print the list of crossenv-installed wheels to track exactly what build environment is in use when cross-compiling. Lastly, my next step is to review the wheel-building code to use status cookies and be closer to the rest of the framework code. I want to divide it as follows for a start:
So then it will become much easier to maintain and add extra functionality as needed, such as meson and cmake toolchain file support, and potentially automating
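For the meson side, the toolchain file in question would be a standard meson cross file; a minimal sketch for an armv7 target (the toolchain triplet and values are illustrative assumptions, not the actual spksrc ones):

```ini
# Hypothetical meson cross file for an armv7 Synology target.
[binaries]
c = 'arm-unknown-linux-gnueabi-gcc'
cpp = 'arm-unknown-linux-gnueabi-g++'
strip = 'arm-unknown-linux-gnueabi-strip'
pkg-config = 'pkg-config'

[host_machine]
system = 'linux'
cpu_family = 'arm'
cpu = 'armv7'
endian = 'little'
```

Passing such a file to a wheel build would typically go through the backend's `--cross-file` setting (e.g. meson-python's config settings), which is what "toolchain file parameter passing" amounts to.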
Qoriq failure may be related to python/cpython#125269
- Simplified shell calls to avoid defining SHELL = /bin/bash
- Added an ifneq ($(wildcard file-wheel file-default)) guard, as at early makefile variable assignment the path will not have been determined just yet, thus blocking when trying to determine default version values extracted from variables within requirement files
- Fixed CROSSENV_CONFIG_PATH so it always refers back to $(PYTHON_WORK_DIR), to ensure it works both from spk/python3* and from python-related spk projects called using spksrc.python.mk

Co-Author: hgy59 <[email protected]>
Numpy fails to build starting with version >= 1.26. It may need to be migrated to use cross/numpy with proper wheel building, including meson toolchain file parameter passing.
Fails to build numpy 1.25.2 but is ok with 1.25.1
Finally! All flags are green! I'll do one last round of validation before merging. As this PR is quite invasive there will probably be some oversights to be managed after the merge. I'll also update our Python wiki page.
@th0ma7, amazing work! I guess it's time to remove the WIP from the title now ;-) EDIT: I downloaded one of the created archives and noted that for python we have both
thnx, but this is only part one... but still a significant one. EDIT: And I should mention, thnx to @hgy59 who helped out as well 👍
thnx for trying it out! To your question: no. python-wheels really are just wheel-testing packages to confirm that most of the wheels get properly built when updating the base Python packages. That content used to be within the Python makefile, which I would enable/disable for full testing. Having a standalone test package makes things much easier and cleans up the base Python makefile by a lot, even further by dissociating the cross-environment portion as done in this PR. My hope is that Python packages become much easier to maintain in the longer run.
@th0ma7 thanks for your great work! What is the status of building wheels for python313? I ask because Homeassistant now supports Python 3.13, and if wheel building works, I will use 3.13 for the next update.
Great question! And the short answer is: drum-roll... I don't know just yet. Let's get this PR merged, and then I can certainly assist in enhancing the python-wheel front so we have one targeting python313 and can confirm whether changes are needed.
@hgy59 and @mreid-tt I'm surprised by how much Python-related documentation was added over time as updates to the framework were done. I've added a few new sections that I believe cover the key highlights, as well as gaps I found in other functionality already in place from earlier updates. I'd appreciate it if you had a moment to review that wiki page: https://github.com/SynoCommunity/spksrc/wiki/Using-wheels-to-distribute-Python-packages
Lastly, now that this is merged, I will be publishing python310-311. The key objective here was to finally return to a state where Python packages are once again reproducible using the already-in-place Python versions. Before moving ahead with 312 or 313 I'd rather wait for a first package to be migrated and related
@th0ma7 thanks for all of this; it is a huge step forward in supporting Python and dependent packages on Synology devices. Two suggestions:
Thnx, and again thnx for your assistance. I'm still expecting there may be some quirks to be found as we use it.
TL;DR: I agree with both suggestions.
I did play once more with Pillow, and this was partially why I was thinking of moving all cross-related wheels to a python directory structure and using it for more complex wheels. To note, the I also noticed one particular behaviour: I'm simply unable to build Although one really important note is that we used to have a cross-compiling patch for python
Agreed. Moreover, the
See comment above, why not making this part of wheel specific
Indeed, but that is sufficiently easy to test out, and I believe it simply is a minor detail or an incentive to further enhance the
No need to convince me on this, I'm on-board!
If I summarize, the things we may have in mind, including previous exchanges, are:
That's a lot of work... work that can be split up, I guess? Throwing out ideas:
And I wonder where we could track this list besides here (it may evolve over time)... Have I missed anything? And @SynoCommunity/developers, there is room for others if interested in participating in this effort...
Description
Intent is to:
- `python310`/`python311` base `Makefiles` to ease updates of `python312` and `python313` packages
- `crossenv` creation from `python3*/Makefile` -> now uses `spksrc.crossenv.mk`
- `crossenv` enablement using a `mk/crossenv/` directory containing wheel crossenv definitions

Fixes #6284
Checklist
- Build rule `all-supported` completed successfully

Type of change
- small framework changes

TODO
- `$HOME/.cache/pip` to use `$(WORK_DIR)/pip`
- `requirements.txt` entry with trailing comment `test==1234 # This is a test wheel`
- `$(HOME)/.cache/pip`
- `OPENSSL_*_DIR` variables and logic usage throughout python related mk files
- `crossenv`: add status info to `status-build.log` file
- `spksrc.python-wheel.mk` to use status cookie to avoid always rebuilding -->> LEFT FOR SUBSEQUENT PR
- `zlib` creation + removal of symlinks from `spksrc.python.mk` to eliminate rebuilding
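The pip-cache TODO items above map onto pip's standard `PIP_CACHE_DIR` environment variable; a sketch, where `WORK_DIR` stands in for the framework's `$(WORK_DIR)`:

```shell
# Point pip's cache at the package work directory instead of $HOME/.cache/pip.
WORK_DIR="$PWD/work"
mkdir -p "$WORK_DIR/pip"
export PIP_CACHE_DIR="$WORK_DIR/pip"
pip cache dir   # prints the redirected cache location
```

Exporting the variable from the makefile keeps per-package downloads isolated and cleanable with the work directory.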