1:6121-alt1

- Update to b6121 (2025-08-08).
Vitaly Chikunov 2025-08-09 02:46:37 +03:00
parent e6fba35f45
commit 50d28023b7
2 changed files with 6 additions and 2 deletions


@@ -11,7 +11,7 @@
 %def_with vulkan
 Name: llama.cpp
-Version: 5753
+Version: 6121
 Release: alt1
 Epoch: 1
 Summary: LLM inference in C/C++
@@ -160,6 +160,7 @@ export NVCC_PREPEND_FLAGS=-ccbin=g++-12
 -DLLAMA_BUILD_TESTS=ON \
 -DLLAMA_CURL=ON \
 -DGGML_BACKEND_DL=ON \
+-DGGML_BACKEND_DIR=%_libexecdir/llama \
 -DGGML_CPU=ON \
 -DGGML_RPC=ON \
 %ifarch x86_64
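The hunk above adds `-DGGML_BACKEND_DIR` next to the existing `-DGGML_BACKEND_DL=ON`, so the dynamically loaded ggml backends are installed into and looked up from the distribution's libexec directory. A minimal standalone sketch of the equivalent configure step, with the RPM macro `%_libexecdir` expanded to an illustrative `/usr/libexec` path (the real spec passes more flags and per-arch options):

```shell
# Sketch only: configure llama.cpp with dynamically loadable backends.
# /usr/libexec/llama stands in for %_libexecdir/llama from the spec.
cmake -B build \
    -DLLAMA_BUILD_TESTS=ON \
    -DLLAMA_CURL=ON \
    -DGGML_BACKEND_DL=ON \
    -DGGML_BACKEND_DIR=/usr/libexec/llama \
    -DGGML_CPU=ON \
    -DGGML_RPC=ON
cmake --build build
```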
@@ -256,6 +257,9 @@ llama-cli -m %_datadir/tinyllamas/stories260K.gguf -p "Once upon a time" -s 55 -
 %endif
 %changelog
+* Sat Aug 09 2025 Vitaly Chikunov <vt@altlinux.org> 1:6121-alt1
+- Update to b6121 (2025-08-08).
+
 * Wed Jun 25 2025 Vitaly Chikunov <vt@altlinux.org> 1:5753-alt1
 - Update to b5753 (2025-06-24).
 - Install an experimental rpc backend and server. The rpc code is a


@@ -1 +1 @@
-73e53dc834c0a2336cd104473af6897197b96277 b5753
+e54d41befcc1575f4c898c5ff4ef43970cead75f b6121