From: Andrew Hyatt
Subject: [elpa] externals/llm 3919b77383 06/34: Implement confusion and typos in README.org
Date: Sat, 16 Sep 2023 01:32:47 -0400 (EDT)
branch: externals/llm
commit 3919b77383324173dcff352c506112fee903a646
Author: Andrew Hyatt <ahyatt@gmail.com>
Commit: Andrew Hyatt <ahyatt@gmail.com>
Implement confusion and typos in README.org
This fixes the problems noted in https://github.com/ahyatt/llm/pull/1 by
https://github.com/tvraman.
---
README.org | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/README.org b/README.org
index ee764e5d80..dea73f1a66 100644
--- a/README.org
+++ b/README.org
@@ -1,8 +1,8 @@
#+TITLE: llm package for emacs
-This is a library for interfacing with Large Language Models. It allows elisp
code to use LLMs, but gives the user an option to choose which LLM they would
prefer. This is especially useful for LLMs, since there are various
high-quality ones that in which API access costs money, as well as locally
installed ones that are free, but of medium quality. Applications using LLMs
can use this library to make sure their application works regardless of whether
the user has a local LLM or is p [...]
+This is a library for interfacing with Large Language Models. It allows elisp
code to use LLMs, but allows gives the end-user an option to choose which LLM
they would prefer. This is especially useful for LLMs, since there are various
high-quality ones that in which API access costs money, as well as locally
installed ones that are free, but of medium quality. Applications using LLMs
can use this library to make sure their application works regardless of whether
the user has a local [...]
-The functionality supported by LLMs is not completely consistent, nor are
their APIs. In this library we attempt to abstract functionality to a higher
level, because sometimes those higher level concepts are supported by an API,
and othertimes they must be put in more low-level concepts. Examples are an
example of this; the GCloud Vertex API has an explicit API for examples, but
for Open AI's API, examples must be specified by modifying the sytem prompt.
And Open AI has the concept of [...]
+The functionality supported by LLMs is not completely consistent, nor are
their APIs. In this library we attempt to abstract functionality to a higher
level, because sometimes those higher level concepts are supported by an API,
and othertimes they must be put in more low-level concepts. One such
higher-level concept is "examples" where the client can show example
interactions to demonstrate a pattern for the LLM. The GCloud Vertex API has
an explicit API for examples, but for Open AI [...]
Some functionality may not be supported by LLMs. Any unsupported
functionality with throw a ='not-implemented= signal.
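To make the abstraction described in the patched README concrete, here is a minimal sketch (not part of this commit) of how client code might use the library: the application is written against the generic interface, the end user supplies the provider, examples are attached to the prompt rather than to any provider-specific API, and unsupported functionality is handled as a 'not-implemented signal. The names make-llm-openai, make-llm-chat-prompt, llm-chat-prompt-interaction, and llm-chat-response are assumptions based on this commit series (llm-chat-response is renamed to llm-chat later in the series) and may differ from the released package.

;; Illustrative sketch, not part of the commit; names are assumptions
;; based on this commit series and may differ in the released package.
(require 'llm)
(require 'llm-openai)  ; the user could just as well require llm-vertex

;; The end user decides which provider to construct; the application
;; code below never mentions a specific LLM service.
(defvar my-llm-provider (make-llm-openai :key "OPENAI-API-KEY")
  "Provider chosen by the end user (hypothetical key shown).")

(defun my-llm-ask (text)
  "Send TEXT to whichever provider the user configured.
Examples live on the prompt; each provider maps them to its own API,
for instance an explicit examples field for Vertex or the system
prompt for Open AI."
  (condition-case err
      (llm-chat-response
       my-llm-provider
       (make-llm-chat-prompt
        :examples '(("What is the capital of France?" . "Paris"))
        :interactions (list (make-llm-chat-prompt-interaction
                             :role 'user :content text))))
    ;; Unsupported functionality is signalled as `not-implemented'
    ;; (assumed here to be defined as an error condition).
    (not-implemented
     (message "Provider does not support chat: %S" err)
     nil)))

With a provider that supports chat, (my-llm-ask "Hello") would return the response text; with one that does not, it would log a message and return nil instead of raising an unhandled error.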