
I cannot install this package with cuda12.4 #1933

Open
BoogonClothman opened this issue Feb 13, 2025 · 3 comments

Comments

@BoogonClothman

Details

  • Windows 11
  • NVIDIA CUDA 12.4
  • NVIDIA GeForce RTX 4060
  • Anaconda: Python 3.12.8

Terminal Output

PS E:\OwnProject\aivtuber> $env:CMAKE_ARGS="-DGGML_CUDA=on"
PS E:\OwnProject\aivtuber> pip install llama-cpp-python
Collecting llama-cpp-python
  Downloading llama_cpp_python-0.3.7.tar.gz (66.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 66.7/66.7 MB 11.1 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: typing-extensions>=4.5.0 in e:\anaconda3\envs\brollo-py3128\lib\site-packages (from llama-cpp-python) (4.12.2)
Requirement already satisfied: numpy>=1.20.0 in e:\anaconda3\envs\brollo-py3128\lib\site-packages (from llama-cpp-python) (2.2.2)
Requirement already satisfied: diskcache>=5.6.1 in e:\anaconda3\envs\brollo-py3128\lib\site-packages (from llama-cpp-python) (5.6.3)
Requirement already satisfied: jinja2>=2.11.3 in e:\anaconda3\envs\brollo-py3128\lib\site-packages (from llama-cpp-python) (3.1.3)
Requirement already satisfied: MarkupSafe>=2.0 in e:\anaconda3\envs\brollo-py3128\lib\site-packages (from jinja2>=2.11.3->llama-cpp-python) (2.1.5)   
Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [60 lines of output]
      *** scikit-build-core 0.10.7 using CMake 3.31.5 (wheel)
      *** Configuring CMake...
      2025-02-13 11:47:12,298 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None 
      loading initial cache file C:\Users\Pose2\AppData\Local\Temp\tmpnq4wylul\build\CMakeInit.txt
      -- Building for: Visual Studio 17 2022
      -- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.26100.
      -- The C compiler identification is MSVC 19.43.34808.0
      -- The CXX compiler identification is MSVC 19.43.34808.0
      -- Detecting C compiler ABI info
      -- Detecting C compiler ABI info - done
      -- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.43.34808/bin/Hostx64/x64/cl.exe - skipped
      -- Detecting C compile features
      -- Detecting C compile features - done
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.43.34808/bin/Hostx64/x64/cl.exe - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.45.1.windows.1")
      -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
      -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
      -- Looking for pthread_create in pthreads
      -- Looking for pthread_create in pthreads - not found
      -- Looking for pthread_create in pthread
      -- Looking for pthread_create in pthread - not found
      -- Found Threads: TRUE
      -- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
      -- CMAKE_SYSTEM_PROCESSOR: AMD64
      -- CMAKE_GENERATOR_PLATFORM: x64
      -- Including CPU backend
      -- Found OpenMP_C: -openmp (found version "2.0")
      -- Found OpenMP_CXX: -openmp (found version "2.0")
      -- Found OpenMP: TRUE (found version "2.0")
      -- x86 detected
      -- Performing Test HAS_AVX_1
      -- Performing Test HAS_AVX_1 - Success
      -- Performing Test HAS_AVX2_1
      -- Performing Test HAS_AVX2_1 - Success
      -- Performing Test HAS_FMA_1
      -- Performing Test HAS_FMA_1 - Success
      -- Performing Test HAS_AVX512_1
      -- Performing Test HAS_AVX512_1 - Failed
      -- Performing Test HAS_AVX512_2
      -- Performing Test HAS_AVX512_2 - Failed
      -- Adding CPU backend variant ggml-cpu: /arch:AVX2 GGML_AVX2;GGML_FMA;GGML_F16C
      -- Found CUDAToolkit: C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v12.4/include (found version "12.4.131")
      -- CUDA Toolkit found
      -- Using CUDA architectures: native
      CMake Error at C:/Program Files/CMake/share/cmake-3.31/Modules/CMakeDetermineCompilerId.cmake:614 (message):
        No CUDA toolset found.
      Call Stack (most recent call first):
        C:/Program Files/CMake/share/cmake-3.31/Modules/CMakeDetermineCompilerId.cmake:8 (CMAKE_DETERMINE_COMPILER_ID_BUILD)
        C:/Program Files/CMake/share/cmake-3.31/Modules/CMakeDetermineCompilerId.cmake:53 (__determine_compiler_id_test)
        C:/Program Files/CMake/share/cmake-3.31/Modules/CMakeDetermineCUDACompiler.cmake:131 (CMAKE_DETERMINE_COMPILER_ID)
        vendor/llama.cpp/ggml/src/ggml-cuda/CMakeLists.txt:25 (enable_language)


      -- Configuring incomplete, errors occurred!

      *** CMake configuration failed
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python)

More

When I use the command pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124, I get an enormous error with 583 lines of CMake output that I cannot read or paste here (it exceeds the length limit). I guess there might be some Unicode error?
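One way to share an error log that is too long to paste inline is to capture the full pip output to a file and attach it to the issue. A minimal sketch, assuming PowerShell (as in the session above) and an arbitrary log file name of my choosing:

```shell
# Capture the complete (verbose) build output to pip-build.log,
# while still showing it in the terminal; the file can then be
# attached to the issue instead of pasted inline.
pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124 -v 2>&1 | Tee-Object -FilePath pip-build.log
```

On POSIX shells the equivalent is `... 2>&1 | tee pip-build.log`.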

@BoogonClothman
Author

BoogonClothman commented Feb 13, 2025

So, could there be something wrong with my CMake setup?

I successfully installed the package by using a wheel file downloaded from the release page (llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl): pip install path/to/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl

Still, I do not want to close this issue, because I don't know why the other approaches failed.

@la1ty

la1ty commented Feb 15, 2025

Probably a duplicate of #1917.

You can also try my advice on building this wheel.

@BoogonClothman
Author

Probably a duplicate of #1917.

You can also try my advice on building this wheel.

Thanks a lot. I will try your advice later.
