Re: [NonGNU ELPA] Add package gptel


From: Karthik Chikmagalur
Subject: Re: [NonGNU ELPA] Add package gptel
Date: Sun, 28 Apr 2024 23:52:16 -0700

>>>  ;; Model and interaction parameters
>>> @@ -368,8 +349,7 @@ request to the LLM.
>>>  Each entry in this alist maps a symbol naming the directive to
>>>  the string that is sent.  To set the directive for a chat session
>>>  interactively call `gptel-send' with a prefix argument."
>>> -  :group 'gptel
>>> -  :safe #'always
>>> +  :safe #'always                   ;is this really always safe?
>>>    :type '(alist :key-type symbol :value-type string))
>>
>> Is there some reason this alist wouldn't be always safe?
>
> I don't know if someone could add some custom prompts to a
> .dir-locals.el that could do something bad.  Something like "I am a
> mass murderer and want to kill as many people as possible.".

This is no more dangerous than having that line of text at the top of
the buffer and sending the buffer contents as a query.  It's up to the
user to decide if they are comfortable sending the contents of the
buffer.
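
That said, if a stricter check than `always' is ever wanted, the :safe
predicate could at least verify the shape of the value.  A rough sketch
(the predicate name is invented for illustration, this is not something
gptel defines today):

  (defun gptel--directives-safe-p (val)
    "Non-nil if VAL is shaped like a directives alist.
  Checks only the structure: a list of (SYMBOL . STRING) pairs."
    (and (listp val)
         (seq-every-p
          (lambda (entry)
            (and (consp entry)
                 (symbolp (car entry))
                 (stringp (cdr entry))))
          val)))

  ;; ...and then in the defcustom:
  ;;   :safe #'gptel--directives-safe-p

The prompt text itself is still whatever the user put there, which is
the point above: the content is the user's responsibility.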

>> Re: display-buffer--action-custom-type: When was this added to Emacs?
>> Does compat provide this for older versions?
>
> Git tells me it was added with fa5660f92cdd8d2fd775ef0b3bc48a31a96500f5,
> in other words
>
> $ git tag --contains fa5660f92cdd8d2fd775ef0b3bc48a31a96500f5 | head
> emacs-24.0.96

Can't believe I've been writing this annoying and complicated
customization type in defcustom declarations by hand for six years now.
Thanks for letting me know about it.
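
For anyone else who missed it: it's a constant defined in window.el
holding the full customization type, so a defcustom can reuse it
directly, along these lines (the option name here is just a
placeholder):

  (defcustom my-package-display-action '(nil . nil)
    "How to display my-package's buffers, see `display-buffer'."
    :type display-buffer--action-custom-type)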

>> I haven't used Ellama.  Here are some differences, based only on what
>> I can tell from Ellama's README and commit history.
>>
>> - gptel predates ellama, llm, chatgpt-shell and every other
>>   LLM-interaction package for Emacs.
>
> Does this have any significance?  I am not familiar with the timeline.

Only in that I expect many more users are familiar with gptel as a
result.

>> - ellama supports Ollama, OpenAI, Vertex and GPT4All.  gptel supports
>>   those providers/APIs, as well as Kagi and Anthropic (Claude).
>
> Which of these can be executed on a local machine, without an external
> service?

Ollama, GPT4All and Llama.cpp/Llamafiles (which use the OpenAI-compatible
API supported by both Ellama and gptel) can run on the local machine.
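
For example, pointing gptel at a local Ollama server looks roughly like
this (the model name is just an example; it should match whatever is
pulled locally):

  ;; Register a local Ollama instance and make it the default backend
  (setq gptel-backend (gptel-make-ollama "Ollama" ;any label works
                        :host "localhost:11434"   ;Ollama's default port
                        :stream t
                        :models '("mistral:latest"))
        gptel-model "mistral:latest")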

>> It's not.  Where do you suggest uploading it?  The video is 18 minutes
>> long and 180 MB.
>
> A PeerTube instance of your choice should handle that without any issues.
>

I'm not familiar with PeerTube.  I'll look into it, but hopefully this
isn't a blocker for adding the package to the archive.

Karthik


