Server build with python BE failing due to missing Boost lib #7925
Actually seeing this error as well.
It is coming from this `ExternalProject_Add(` call in the CMake file.
I manually downloaded the archive with wget and verified it: the hash matches the HASH in the CMake file.
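A minimal sketch of the manual verification step described above, as a small POSIX-shell helper (the function name and the example URL/hash are placeholders, not the actual values from Triton's CMake file):

```shell
# verify_archive_hash: compare a downloaded file's SHA-256 against an
# expected value, e.g. the hash recorded in an ExternalProject_Add() call.
verify_archive_hash() {
    file="$1"
    expected="$2"
    # sha256sum prints "<hash>  <filename>"; keep only the hash field.
    actual=$(sha256sum "$file" | awk '{print $1}')
    if [ "$actual" = "$expected" ]; then
        echo "hash OK"
    else
        echo "hash MISMATCH: expected $expected got $actual" >&2
        return 1
    fi
}

# Example usage (placeholder URL and hash, substitute the real ones):
# wget https://example.com/boost_1_80_0.tar.gz
# verify_archive_hash boost_1_80_0.tar.gz <hash-from-CMake-file>
```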
Hi @buddhapuneeth, thanks for reaching out. We'll take a look at it.
This may be relevant: https://www.boost.org/users/news/boost_has_moved_downloads_to_jfr.html. When I click on https://boostorg.jfrog.io/artifactory/main/release/ I reach the following webpage:
Okay, it looks like this was fixed in a newer version: #6775

I think your error occurs because you are pulling the python backend at version 23.12, which does not have this fix from early 2024: https://github.com/triton-inference-server/python_backend/pull/334/files

If you specifically need 23.12, the problem is that you need to apply this change while still being able to build via the build.py script. If I am reading it correctly, you would basically have to clone all of the relevant repos in https://github.com/triton-inference-server to your own organization (which can be private), make the change on branch r23.12 of the python backend within that org, and set the --github-organization flag in build.py to point to your new organization. That sounds incredibly tedious, though. Perhaps a better way is simply to fork just the python backend in your own GitHub organization, apply that patch to branch r23.12 on that fork, and then add an override there (see Lines 2993 to 2994 in b84cc26).

Do the main Triton Inference Server developers have any better suggestions?
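A rough sketch of that fork-based workaround as shell commands. The organization name `my-org` is a placeholder, and the exact value format expected by `--github-organization` should be checked against build.py itself; only the flag name is confirmed by the discussion above:

```shell
# 1. Fork triton-inference-server/python_backend into your own org ("my-org"
#    here is hypothetical), then apply the boost-URL fix to branch r23.12:
git clone -b r23.12 https://github.com/my-org/python_backend.git
cd python_backend
# Apply the change from python_backend PR #334 (cherry-pick or edit by hand),
# commit it, and push it back to the fork's r23.12 branch:
git push origin r23.12
cd ..

# 2. Point build.py at the organization that now contains the patched fork,
#    keeping the rest of the flags from the original reproduction command:
./build.py -v --backend python:r23.12 \
    --github-organization=https://github.com/my-org \
    ...  # remaining flags unchanged
```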
Description
The server build is failing with the following error:
Triton Information
What version of Triton are you using?
```
TRITON_VERSION=v2.43.0
TRITON_SERVER_REPO=https://github.com/triton-inference-server/server.git
```
To Reproduce
```
git clone --depth 1 --branch ${TRITON_VERSION} ${TRITON_SERVER_REPO}
cd server && ./build.py -v --backend python:r23.12 --backend tensorrt:r23.12 \
    --backend ensemble --no-container-build --no-container-interactive \
    --build-dir=`pwd`/build --install-dir /opt/tritonserver \
    --enable-logging --enable-stats --enable-gpu \
    --enable-metrics --enable-cpu-metrics \
    --endpoint=grpc --endpoint=http && \
    rm -rf build && cd ..
```
Expected behavior
Build should run successfully.