emacs-elpa-diffs

From: ELPA Syncer
Subject: [elpa] externals/llm 806a703b91 2/2: Change version to 0.17.1 (#60)
Date: Sat, 3 Aug 2024 21:58:16 -0400 (EDT)

branch: externals/llm
commit 806a703b91580c1c34d3bb5fe1d556b659f89bcf
Author: Andrew Hyatt <ahyatt@gmail.com>
Commit: GitHub <noreply@github.com>

    Change version to 0.17.1 (#60)
    
    Remove the mistaken 0.17.2 section from NEWS; since 0.17.1 was never
    released, fold everything into that release.
---
 NEWS.org | 5 ++---
 llm.el   | 2 +-
 2 files changed, 3 insertions(+), 4 deletions(-)

diff --git a/NEWS.org b/NEWS.org
index 32595214f9..0024ad9359 100644
--- a/NEWS.org
+++ b/NEWS.org
@@ -1,11 +1,10 @@
-* Version 0.17.2
-- Fix compiled functions not being evaluated in =llm-prompt=.
-- Use Ollama's new =embed= API instead of the obsolete one.
 * Version 0.17.1
 - Support Ollama function calling, for models which support it.
 - Make sure every model, even unknown models, return some value for ~llm-chat-token-limit~.
 - Add token count for llama3.1 model.
 - Make =llm-capabilities= work model-by-model for embeddings and functions
+- Fix compiled functions not being evaluated in =llm-prompt=.
+- Use Ollama's new =embed= API instead of the obsolete one.
 * Version 0.17.0
 - Introduced =llm-prompt= for prompt management and creation from generators.
 - Removed Gemini and Vertex token counting, because =llm-prompt= uses token
diff --git a/llm.el b/llm.el
index 8d66fb23e0..2a378d4bc1 100644
--- a/llm.el
+++ b/llm.el
@@ -5,7 +5,7 @@
 ;; Author: Andrew Hyatt <ahyatt@gmail.com>
 ;; Homepage: https://github.com/ahyatt/llm
 ;; Package-Requires: ((emacs "28.1") (plz "0.8"))
-;; Package-Version: 0.17.0
+;; Package-Version: 0.17.1
 ;; SPDX-License-Identifier: GPL-3.0-or-later
 ;;
 ;; This program is free software; you can redistribute it and/or


