BUG: llama-cpp-python 0.3.2 build issue in cpu Docker (#2613)
ChengjieLi28 authored Nov 29, 2024
1 parent eb8ddd4 · commit 8dd5715
Showing 1 changed file with 1 addition and 1 deletion.
xinference/deploy/docker/cpu.Dockerfile (2 changes: 1 addition & 1 deletion)
@@ -21,7 +21,7 @@ ARG PIP_INDEX=https://pypi.org/simple
 RUN python -m pip install --upgrade -i "$PIP_INDEX" pip && \
     pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu && \
     pip install -i "$PIP_INDEX" --upgrade-strategy only-if-needed -r /opt/inference/xinference/deploy/docker/requirements_cpu.txt && \
-    CMAKE_ARGS="-DGGML_BLAS=ON -DGGML_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python && \
+    pip install llama-cpp-python && \
     cd /opt/inference && \
     python setup.py build_web && \
     git restore . && \
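To sanity-check the change locally, one could rebuild the CPU image with the patched Dockerfile. A minimal sketch, assuming the command is run from the repository root and that the image tag is purely illustrative:

# Rebuild the CPU image from the repository root (tag is illustrative)
docker build -f xinference/deploy/docker/cpu.Dockerfile -t xinference-cpu:local .

With the OpenBLAS CMAKE_ARGS removed, pip installs llama-cpp-python with its default build configuration instead of the OpenBLAS one that broke the CPU image build with 0.3.2.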
