[elpa] externals/llm efe1f89ae0: Extend key function to other models, add to README (#114)
From: ELPA Syncer
Subject: [elpa] externals/llm efe1f89ae0: Extend key function to other models, add to README (#114)
Date: Fri, 29 Nov 2024 00:58:18 -0500 (EST)
branch: externals/llm
commit efe1f89ae0328dabec1384c1213d1cb3b444b234
Author: Andrew Hyatt <ahyatt@gmail.com>
Commit: GitHub <noreply@github.com>
Extend key function to other models, add to README (#114)
Co-authored-by: Daniel Mendler <mail@daniel-mendler.de>
---
NEWS.org | 1 +
README.org | 11 +++++++++++
llm-azure.el | 4 +++-
llm-claude.el | 4 +++-
llm-gemini.el | 8 ++++++--
5 files changed, 24 insertions(+), 4 deletions(-)
diff --git a/NEWS.org b/NEWS.org
index c427b4f548..824f38a7c9 100644
--- a/NEWS.org
+++ b/NEWS.org
@@ -1,5 +1,6 @@
* Version 0.19.0
- Add JSON mode, for most providers with the exception of Claude.
+- Add ability for keys to be functions, thanks to Daniel Mendler.
* Version 0.18.1
- Fix extra argument in ~llm-batch-embeddings-async~.
* Version 0.18.0
diff --git a/README.org b/README.org
index c0a18b15c8..3efbbcf5c8 100644
--- a/README.org
+++ b/README.org
@@ -23,6 +23,17 @@ Users of an application that uses this package should not need to install it the
Here ~my-openai-key~ would be a variable you set up before with your OpenAI
key. Or, just substitute the key itself as a string. It's important to remember
never to check your key into a public repository such as GitHub, because your
key must be kept private. Anyone with your key can use the API, and you will be
charged.
+You can also use a function as a key, so you can store your key in a secure
+place and retrieve it via a function. For example, you could add a line to
+=~/.authinfo.gpg=:
+
+#+begin_example
+machine llm.openai password <key>
+#+end_example
+
+And then set up your provider like:
+#+begin_src emacs-lisp
+(setq llm-refactoring-provider (make-llm-openai :key (plist-get (car (auth-source-search :host "llm.openai")) :secret)))
+#+end_src
+
All of the providers (except for =llm-fake=) can also take default parameters
that will be used if they are not specified in the prompt. These are the same
parameters as appear in the prompt, but prefixed with =default-chat-=. So, for
example, if you find that you like Ollama to be less creative than the default,
you can create your provider like:
#+begin_src emacs-lisp
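[Editorial note on the README example above: the key can also be any zero-argument function, not just the closure returned by ~auth-source-search~. As a sketch, a lambda can defer the lookup until the key is actually needed; the variable name =llm-refactoring-provider= is taken from the example above, and ~auth-source-pick-first-password~ is a standard auth-source helper:]

#+begin_src emacs-lisp
;; Sketch: a function key that looks up the password lazily, each
;; time the provider needs it, instead of at setup time.
(require 'auth-source)
(setq llm-refactoring-provider
      (make-llm-openai
       :key (lambda ()
              (auth-source-pick-first-password :host "llm.openai"))))
#+end_src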
diff --git a/llm-azure.el b/llm-azure.el
index 5fa0440fd5..fa69c68d37 100644
--- a/llm-azure.el
+++ b/llm-azure.el
@@ -46,7 +46,9 @@
(llm-azure-embedding-model provider)))
(cl-defmethod llm-provider-headers ((provider llm-azure))
- `(("api-key" . ,(llm-azure-key provider))))
+ `(("api-key" . ,(if (functionp (llm-azure-key provider))
+ (funcall (llm-azure-key provider))
+ (llm-azure-key provider)))))
(cl-defmethod llm-capabilities ((_ llm-azure))
(list 'streaming 'embedding))
diff --git a/llm-claude.el b/llm-claude.el
index 9522f8b8c2..05a7bdf943 100644
--- a/llm-claude.el
+++ b/llm-claude.el
@@ -130,7 +130,9 @@
(funcall msg-receiver (assoc-default 'text delta))))))))))
(cl-defmethod llm-provider-headers ((provider llm-claude))
- `(("x-api-key" . ,(llm-claude-key provider))
+ `(("x-api-key" . ,(if (functionp (llm-claude-key provider))
+ (funcall (llm-claude-key provider))
+ (llm-claude-key provider)))
("anthropic-version" . "2023-06-01")
("anthropic-beta" . "tools-2024-04-04")))
diff --git a/llm-gemini.el b/llm-gemini.el
index fd166bd190..44a126d6d6 100644
--- a/llm-gemini.el
+++ b/llm-gemini.el
@@ -47,7 +47,9 @@ You can get this at https://makersuite.google.com/app/apikey."
"Return the URL for the EMBEDDING request for STRING from PROVIDER."
(format
"https://generativelanguage.googleapis.com/v1beta/models/%s:embedContent?key=%s"
(llm-gemini-embedding-model provider)
- (llm-gemini-key provider)))
+ (if (functionp (llm-gemini-key provider))
+ (funcall (llm-gemini-key provider))
+ (llm-gemini-key provider))))
(cl-defmethod llm-provider-embedding-request ((provider llm-gemini) string)
`((model . ,(llm-gemini-embedding-model provider))
@@ -63,7 +65,9 @@ If STREAMING-P is non-nil, use the streaming endpoint."
(format
"https://generativelanguage.googleapis.com/v1beta/models/%s:%s?key=%s"
(llm-gemini-chat-model provider)
(if streaming-p "streamGenerateContent" "generateContent")
- (llm-gemini-key provider)))
+ (if (functionp (llm-gemini-key provider))
+ (funcall (llm-gemini-key provider))
+ (llm-gemini-key provider))))
(cl-defmethod llm-provider-chat-url ((provider llm-gemini))
(llm-gemini--chat-url provider nil))
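[Editorial note: the per-provider changes above all apply the same resolution step. As a minimal standalone sketch of that pattern (the helper name ~my/llm-resolve-key~ is illustrative, not part of the library):]

#+begin_src emacs-lisp
(defun my/llm-resolve-key (key)
  "Return KEY directly, or call it first if it is a function."
  (if (functionp key)
      (funcall key)
    key))

;; Both forms resolve to the same header value:
;; (my/llm-resolve-key "sk-example")              ; plain string key
;; (my/llm-resolve-key (lambda () "sk-example"))  ; function key
#+end_src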