llama.cpp-cuda (1-8659-alt1)

Published 2026-04-07 07:05:51 +03:00 by thek0tyara

Installation

# Add the repository to the list of connected repositories (substitute the required architecture for "_arch_"):
apt-repo add rpm _arch_ classic
# To install the package, run the following commands:
apt-get update
apt-get install llama.cpp-cuda

Repository info

Architectures
x86_64

About this package

llama.cpp backend for NVIDIA GPU
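Once installed, the CUDA backend lets llama.cpp offload model layers to an NVIDIA GPU. A minimal sketch of checking the package and running inference with GPU offload follows; the `llama-cli` binary name and the model path are assumptions, so adjust them to the binaries and models actually present on your system.

```shell
# Confirm the package metadata is visible to apt (works with ALT's apt-rpm):
apt-cache show llama.cpp-cuda

# Run inference with up to 99 layers offloaded to the GPU (-ngl).
# "./model.gguf" is a placeholder path, not shipped by this package.
llama-cli -m ./model.gguf -ngl 99 -p "Hello"
```

If the CUDA backend is active, llama.cpp reports the detected GPU and the number of offloaded layers in its startup log.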
Details
Repository: ALT
Published: 2026-04-07 07:05:51 +03:00
Downloads: 0
License: MIT
Size: 31 MiB
Assets (1)
Versions (1)
1-8659-alt1 (2026-04-07)