emacs-elpa-diffs

[elpa] externals/ellama 367e86c456 2/5: Support interactive switch models on remote host


From: ELPA Syncer
Subject: [elpa] externals/ellama 367e86c456 2/5: Support interactive switch models on remote host
Date: Sun, 7 Jan 2024 18:57:58 -0500 (EST)

branch: externals/ellama
commit 367e86c4562a3dcb8db6fc4ac5214d7bca5738c3
Author: Sergey Kostyaev <sskostyaev@gmail.com>
Commit: Sergey Kostyaev <sskostyaev@gmail.com>

    Support interactive switch models on remote host
    
    Fixes #45
---
 NEWS.org  |  2 ++
 ellama.el | 13 ++++++++++---
 2 files changed, 12 insertions(+), 3 deletions(-)

diff --git a/NEWS.org b/NEWS.org
index 048c0f30d6..686b07c326 100644
--- a/NEWS.org
+++ b/NEWS.org
@@ -1,3 +1,5 @@
+* Version 0.5.4
+- Support interactive switch models on remote host.
 * Version 0.5.3
 - Support cancellation.
 * Version 0.5.2
diff --git a/ellama.el b/ellama.el
index 7576eb1ebe..34edbe9fdd 100644
--- a/ellama.el
+++ b/ellama.el
@@ -6,7 +6,7 @@
 ;; URL: http://github.com/s-kostyaev/ellama
 ;; Keywords: help local tools
 ;; Package-Requires: ((emacs "28.1") (llm "0.6.0") (spinner "1.7.4"))
-;; Version: 0.5.3
+;; Version: 0.5.4
 ;; SPDX-License-Identifier: GPL-3.0-or-later
 ;; Created: 8th Oct 2023
 
@@ -656,14 +656,21 @@ buffer."
 (defun ellama-get-ollama-local-model ()
   "Return llm provider for interactively selected ollama model."
   (interactive)
+  (declare-function llm-ollama-p "ext:llm-ollama")
+  (declare-function llm-ollama-host "ext:llm-ollama")
+  (declare-function llm-ollama-port "ext:llm-ollama")
   (let ((model-name
         (completing-read "Select ollama model: "
                          (mapcar (lambda (s)
                                    (car (split-string s)))
                                  (seq-drop
-                                  (process-lines ellama-ollama-binary "ls") 1)))))
+                                  (process-lines ellama-ollama-binary "ls") 1))))
+       (host (when (llm-ollama-p ellama-provider)
+               (llm-ollama-host ellama-provider)))
+       (port (when (llm-ollama-p ellama-provider)
+               (llm-ollama-port ellama-provider))))
     (make-llm-ollama
-     :chat-model model-name :embedding-model model-name)))
+     :chat-model model-name :embedding-model model-name :host host :port port)))
 
 ;;;###autoload
 (defun ellama-provider-select ()

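The patch makes `ellama-get-ollama-local-model` copy `:host` and `:port` from the current `ellama-provider` (when it is an `llm-ollama` provider), so an interactively selected model keeps talking to the same remote Ollama instance instead of falling back to localhost. A minimal sketch of the setup this enables; the host name, port, and model names below are illustrative, not from the commit:

```elisp
;; Point ellama at an Ollama server on a remote host.
(require 'llm-ollama)
(setq ellama-provider
      (make-llm-ollama
       :host "ollama.example.org"   ; illustrative remote host
       :port 11434                  ; Ollama's default port
       :chat-model "mistral"
       :embedding-model "mistral"))

;; With this patch, `M-x ellama-get-ollama-local-model' now builds its
;; new provider with the same :host and :port as `ellama-provider',
;; rather than the previous behavior of omitting them.
```

Note that the model list itself still comes from running the local `ellama-ollama-binary` via `process-lines`; only the provider that is constructed afterwards inherits the remote host and port.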

