08/14: gnu: llama-cpp: Produce a portable binary unless tuned.
From: guix-commits
Subject: 08/14: gnu: llama-cpp: Produce a portable binary unless tuned.
Date: Fri, 5 Apr 2024 12:27:22 -0400 (EDT)
civodul pushed a commit to branch master
in repository guix.
commit 560d5c66925970a5104a46051ae74e2dbd16adba
Author: Ludovic Courtès <ludo@gnu.org>
AuthorDate: Fri Apr 5 17:48:57 2024 +0200
gnu: llama-cpp: Produce a portable binary unless tuned.
* gnu/packages/machine-learning.scm (llama-cpp)[arguments]:
Augment #:configure-flags.
[properties]: New field.
Co-authored-by: John Fremlin <john@fremlin.org>
Change-Id: I9b3d72849107a6988fec94dc4a22614443338cb2
---
gnu/packages/machine-learning.scm | 13 +++++++++++--
1 file changed, 11 insertions(+), 2 deletions(-)
diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index e61299a5db..47989b129f 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -541,8 +541,16 @@ Performance is achieved by using the LLVM JIT compiler.")
(build-system cmake-build-system)
(arguments
(list
- #:configure-flags
- '(list "-DLLAMA_BLAS=ON" "-DLLAMA_BLAS_VENDOR=OpenBLAS")
+ #:configure-flags #~'("-DLLAMA_BLAS=ON"
+ "-DLLAMA_BLAS_VENDOR=OpenBLAS"
+
+ "-DLLAMA_NATIVE=OFF" ;no '-march=native'
+ "-DLLAMA_FMA=OFF" ;and no '-mfma', etc.
+ "-DLLAMA_AVX2=OFF"
+ "-DLLAMA_AVX512=OFF"
+ "-DLLAMA_AVX512_VBMI=OFF"
+ "-DLLAMA_AVX512_VNNI=OFF")
+
#:modules '((ice-9 textual-ports)
(guix build utils)
((guix build python-build-system) #:prefix python:)
@@ -580,6 +588,7 @@ Performance is achieved by using the LLVM JIT compiler.")
(native-inputs (list pkg-config))
(propagated-inputs
(list python-numpy python-pytorch python-sentencepiece openblas))
+ (properties '((tunable? . #true))) ;use AVX512, FMA, etc. when available
(home-page "https://github.com/ggerganov/llama.cpp")
(synopsis "Port of Facebook's LLaMA model in C/C++")
(description "This package provides a port to Facebook's LLaMA collection
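The `(tunable? . #true)` property added in this patch hooks the package into Guix's package tuning mechanism: the default build is now portable (no `-march=native`, FMA, AVX2, or AVX-512 code), while users who want micro-architecture-specific optimizations can opt back in at build time. A sketch of the resulting usage, assuming a Guix recent enough to support the `--tune` package transformation option:

```shell
# Build the default, portable llama-cpp binary
# (runs on any x86_64 CPU, without AVX2/AVX-512/FMA code paths):
guix build llama-cpp

# Rebuild with optimizations for the CPU of the build machine;
# this transformation only applies to packages carrying the
# 'tunable?' property, such as llama-cpp after this commit:
guix build llama-cpp --tune

# Or tune for a named micro-architecture, e.g. when the binary
# will run on a different machine:
guix build llama-cpp --tune=skylake-avx512
```

Tuned outputs are built with compiler flags specific to the requested CPU, so they are faster on that hardware but no longer substitutable by the portable binaries from the build farms.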