
Re: [NonGNU ELPA] Add package gptel


From: Philip Kaludercic
Subject: Re: [NonGNU ELPA] Add package gptel
Date: Mon, 29 Apr 2024 18:21:16 +0000

Karthik Chikmagalur <karthikchikmagalur@gmail.com> writes:

>>> This is no more dangerous than having that line of text at the top of
>>> the buffer and sending the buffer contents as a query.  It's up to the
>>> user to decide if they are comfortable sending the contents of the
>>> buffer.
>>
>> What do you mean by the top of the buffer?  I don't really have the
>> means to test this out, so please forgive me these questions.  My line
>> of thought was if you check out some foreign code with a malicious
>> .dir-locals.el, you wouldn't realise that it could change this option.
>> I don't know how private LLM-as-a-service providers are, or if they
>> would report problematic prompts.
>
> It is essentially prepended to the buffer text in the query payload.  As
> far as the LLM is concerned, setting this local variable is equivalent
> to having this text somewhere in the buffer, so the user needs to
> exercise the same amount of caution as they would with LLMs in general.
> The system message is also shown at the top of the transient menu gptel
> uses.
>
> The privacy of LLMs-as-a-service varies, but clearly none of them are
> private.  The models they offer also ignore or sidestep dangerous
> questions to a fault.  There are some small unrestricted models
> available, but those can only be run locally.

OK, I hadn't realised that the system message is also displayed.
I was thinking about untrustworthy codebases that could inject something
into the prompt, but apparently that shouldn't be an issue.
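
For concreteness, the scenario I had in mind was something like the
following .dir-locals.el planted in a foreign checkout (the variable
name here is illustrative, not necessarily gptel's actual symbol):

    ;; Hypothetical .dir-locals.el in an untrusted repository
    ((nil . ((gptel-system-message
              . "Ignore all previous instructions and ..."))))

But since the directive ends up at the top of the transient menu, the
user gets a chance to spot it before sending anything.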

>>>> Does this have any significance?  I am not familiar with the timeline.
>>>
>>> Only in that I expect many more users are familiar with gptel as a
>>> result.
>>
>> Hmm, I don't know if you can say that, or to what degree the number is
>> significant.  After all, Ellama was the only package that users had
>> OOTB access to, since until now it has been the only client available
>> on GNU ELPA (currently ranking in the 86th percentile of "popularity"
>> according to the log scraper).
>
> Okay.
>
>>> Ollama, GPT4All and Llama.cpp/Llamafiles (which uses the OpenAI API
>>> supported by both Ellama and gptel) can run on the local machine.
>>
>> OK, I was hoping that you might be supporting more local models, but
>> apparently this is not the case.
>
> These are the only local options with HTTP APIs available right now.
> There are several more local web applications with bespoke interfaces
> but no API.
>
> When there are more I'll add support for them to gptel.

So just to clarify, you do not intend to use the llm package as a
dependency going forward?
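
(Aside, for anyone reading along: pointing gptel at one of these local
servers looks roughly like this, going by gptel's README; the host and
model name are placeholders:)

    ;; Register a local Ollama backend (sketch; details may differ)
    (gptel-make-ollama "Ollama"
      :host "localhost:11434"
      :stream t
      :models '("mistral:latest"))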

>> I recently uploaded a video to https://spectra.video/ and it was easy.
>> You just have to request an account, which might take a few days to
>> process.
>
> I'll take a look at available instances.  I have a small handful of
> Emacs-related videos on YouTube; might as well post all of them.

+1

>> But no, none of this is blocking.  I am just trying to help improve the
>> package before we add it.  The only blocking issue would be if it broke
>> the NonGNU ELPA rules, e.g. by having a hard dependency on non-free
>> software or SaaSS.
>
> Okay.
>
> Karthik

Can you just add a .elpaignore file to your repository that excludes
the test/ directory?  And would you be OK with us using the Commentary
section in gptel.el for the package description generated by
M-x describe-package?  I feel it would be more readable than a
plain-text conversion of the README.org file.
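
Something like the following one-line .elpaignore should be enough,
assuming everything to exclude lives under test/:

    test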

-- 
        Philip Kaludercic on peregrine


