From: ELPA Syncer
Subject: [elpa] externals/ellama ede32c5fb7 3/7: Add comment about user configuration
Date: Mon, 22 Jan 2024 15:57:50 -0500 (EST)

branch: externals/ellama
commit ede32c5fb78f58a6bc7a225a4f016235aa5363b7
Author: Sergey Kostyaev <sskostyaev@gmail.com>
Commit: Sergey Kostyaev <sskostyaev@gmail.com>

    Add comment about user configuration
---
 README.org | 26 ++++++++++++++------------
 1 file changed, 14 insertions(+), 12 deletions(-)

diff --git a/README.org b/README.org
index ab2df16cfc..55fecdced2 100644
--- a/README.org
+++ b/README.org
@@ -26,22 +26,24 @@ In that case you should customize ellama configuration like this:
     (setopt ellama-language "German")
     (require 'llm-ollama)
     (setopt ellama-provider
-                   (make-llm-ollama
-                    :chat-model "mistral:7b-instruct-v0.2-q6_K"
-                    :embedding-model "mistral:7b-instruct-v0.2-q6_K"))
+                    (make-llm-ollama
+                     ;; this model should be pulled to use it
+                     ;; the value should be the same as the name printed in the terminal during the pull
+                     :chat-model "mistral:7b-instruct-v0.2-q6_K"
+                     :embedding-model "mistral:7b-instruct-v0.2-q6_K"))
     ;; Predefined llm providers for interactive switching.
     ;; You shouldn't add ollama providers here - they can be selected
     ;; interactively without it. This is just an example.
     (setopt ellama-providers
-                   '(("zephyr" . (make-llm-ollama
-                                  :chat-model "zephyr:7b-beta-q6_K"
-                                  :embedding-model "zephyr:7b-beta-q6_K"))
-                     ("mistral" . (make-llm-ollama
-                                   :chat-model "mistral:7b-instruct-v0.2-q6_K"
-                                   :embedding-model "mistral:7b-instruct-v0.2-q6_K"))
-                     ("mixtral" . (make-llm-ollama
-                                   :chat-model "mixtral:8x7b-instruct-v0.1-q3_K_M-4k"
-                                   :embedding-model "mixtral:8x7b-instruct-v0.1-q3_K_M-4k")))))
+                    '(("zephyr" . (make-llm-ollama
+                                   :chat-model "zephyr:7b-beta-q6_K"
+                                   :embedding-model "zephyr:7b-beta-q6_K"))
+                      ("mistral" . (make-llm-ollama
+                                    :chat-model "mistral:7b-instruct-v0.2-q6_K"
+                                    :embedding-model "mistral:7b-instruct-v0.2-q6_K"))
+                      ("mixtral" . (make-llm-ollama
+                                    :chat-model "mixtral:8x7b-instruct-v0.1-q3_K_M-4k"
+                                    :embedding-model "mixtral:8x7b-instruct-v0.1-q3_K_M-4k")))))
 #+END_SRC
 
 ** Commands
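
The comment added by this commit stresses that the model tag in the elisp config must match the name printed by Ollama when the model was pulled. As a minimal sketch of that pull step (assuming the Ollama CLI and daemon are installed; the exact tags below are the ones from the config above):

```shell
# Assumption: the Ollama CLI is installed and the daemon is running.
# The tag passed to `ollama pull` must match the :chat-model and
# :embedding-model strings in the elisp config character for character.
ollama pull mistral:7b-instruct-v0.2-q6_K
ollama pull zephyr:7b-beta-q6_K
```

If the pulled tag and the configured `:chat-model` string differ (even in the quantization suffix, e.g. `q6_K` vs `q6_k`), Ollama will report the model as missing at request time.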
