[nongnu] elpa/gptel 1434bbac7b 122/273: gptel-ollama, gptel-openai: Add example of backend creation


From: ELPA Syncer
Subject: [nongnu] elpa/gptel 1434bbac7b 122/273: gptel-ollama, gptel-openai: Add example of backend creation
Date: Wed, 1 May 2024 10:02:11 -0400 (EDT)

branch: elpa/gptel
commit 1434bbac7b00e9ece0dd0190b8ca9ad21b34dba5
Author: Karthik Chikmagalur <karthikchikmagalur@gmail.com>
Commit: Karthik Chikmagalur <karthikchikmagalur@gmail.com>

    gptel-ollama, gptel-openai: Add example of backend creation
    
    README: Fix error with Ollama backend instructions
---
 README.org      | 11 +++++------
 gptel-ollama.el | 11 ++++++++++-
 gptel-openai.el | 24 ++++++++++++++++++++++--
 3 files changed, 37 insertions(+), 9 deletions(-)

diff --git a/README.org b/README.org
index 77259ef34f..7940af9b45 100644
--- a/README.org
+++ b/README.org
@@ -169,12 +169,11 @@ You can pick this backend from the transient menu when using gptel (see usage),
 
 Register a backend with
 #+begin_src emacs-lisp
-(defvar gptel--ollama
-  (gptel-make-ollama
-   "Ollama"                             ;Any name of your choosing
-   :host "localhost:11434"              ;Where it's running
-   :models '("mistral:latest")          ;Installed models
-   :stream t))                          ;Stream responses
+(gptel-make-ollama
+ "Ollama"                               ;Any name of your choosing
+ :host "localhost:11434"                ;Where it's running
+ :models '("mistral:latest")            ;Installed models
+ :stream t)                             ;Stream responses
 #+end_src
 These are the required parameters, refer to the documentation of =gptel-make-gpt4all= for more.
 
diff --git a/gptel-ollama.el b/gptel-ollama.el
index 2d81347412..f5e5fd1d51 100644
--- a/gptel-ollama.el
+++ b/gptel-ollama.el
@@ -113,7 +113,16 @@ alist, like:
 
 KEY (optional) is a variable whose value is the API key, or
 function that returns the key. This is typically not required for
-local models like Ollama."
+local models like Ollama.
+
+Example:
+-------
+
+(gptel-make-ollama
+  \"Ollama\"
+  :host \"localhost:11434\"
+  :models '(\"mistral:latest\")
+  :stream t)"
   (let ((backend (gptel--make-ollama
                   :name name
                   :host host
diff --git a/gptel-openai.el b/gptel-openai.el
index 316efc470e..b8bcd42ed9 100644
--- a/gptel-openai.el
+++ b/gptel-openai.el
@@ -166,7 +166,18 @@ alist, like:
 ((\"Content-Type\" . \"application/json\"))
 
 KEY (optional) is a variable whose value is the API key, or
-function that returns the key."
+function that returns the key.
+
+Example:
+-------
+
+(gptel-make-azure
+ \"Azure-1\"
+ :protocol \"https\"
+ :host \"YOUR_RESOURCE_NAME.openai.azure.com\"
+ :endpoint \"/openai/deployments/YOUR_DEPLOYMENT_NAME/completions?api-version=2023-05-15\"
+ :stream t
+ :models '(\"gpt-3.5-turbo\" \"gpt-4\"))"
   (let ((backend (gptel--make-openai
                   :name name
                   :host host
@@ -210,7 +221,16 @@ alist, like:
 
 KEY (optional) is a variable whose value is the API key, or
 function that returns the key. This is typically not required for
-local models like GPT4All.")
+local models like GPT4All.
+
+Example:
+-------
+
+(gptel-make-gpt4all
+ \"GPT4All\"
+ :protocol \"http\"
+ :host \"localhost:4891\"
+ :models '(\"mistral-7b-openorca.Q4_0.gguf\"))")
 
 (provide 'gptel-openai)
 ;;; gptel-backends.el ends here


