sci-misc/llama-cpp: add dependencies on curl and numpy
Signed-off-by: Alexey Korepanov <[email protected]>
Showing 2 changed files with 18 additions and 0 deletions.
@@ -21,13 +21,22 @@ HOMEPAGE="https://github.com/ggerganov/llama.cpp"
 LICENSE="MIT"
 SLOT="0"
 CPU_FLAGS_X86=( avx avx2 f16c )
+IUSE="curl"
+
+# curl is needed for pulling models from huggingface
+# numpy is used by convert_hf_to_gguf.py
+DEPEND="curl? ( net-misc/curl:= )"
+RDEPEND="${DEPEND}
+	dev-python/numpy
+"
 
 src_configure() {
 	local mycmakeargs=(
 		-DLLAMA_BUILD_TESTS=OFF
 		-DLLAMA_BUILD_SERVER=ON
 		-DCMAKE_SKIP_BUILD_RPATH=ON
 		-DGGML_NATIVE=0 # don't set march
+		-DLLAMA_CURL=$(usex curl ON OFF)
 		-DBUILD_NUMBER="1"
 	)
 	cmake_src_configure
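For context: usex curl ON OFF expands to ON when the curl USE flag is set and to OFF otherwise, so the CMake option simply follows the flag. A minimal sketch of how a user would enable the optional curl support, assuming the package keeps the sci-misc/llama-cpp name from the commit title (the file name under package.use is illustrative):

	# /etc/portage/package.use/llama-cpp  (illustrative file name)
	# enable libcurl support so the llama.cpp tools can fetch models themselves
	sci-misc/llama-cpp curl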
Does it call curl at the compile phase to download something, or does it call curl as part of running the program? In the first case it would break the sandbox, and it would be better to find an alternative approach (adding these models to SRC_URI, for example).
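For illustration only, a rough sketch of the SRC_URI alternative mentioned above; the model file name and URL are placeholders, not part of this commit, and the real artifacts and hosting would have to be chosen:

	# hypothetical sketch: let Portage fetch a default model as a distfile
	# instead of the program downloading it with curl at runtime
	SRC_URI="https://example.org/models/ggml-model.gguf"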