On 2024-03-06 16:11:55, Antoine Beaupré wrote:
> On 2024-03-06 22:06:18, Petter Reinholdtsen wrote:
>> [Antoine Beaupre]
>>> A CLI utility and Python library for interacting with Large Language
>>> Models, both via remote APIs and models that can be installed and run
>>> on your own machine.
>>
>> Is the option to run models locally related to llama.cpp, ref
>> <URL: https://bugs.debian.org/1063673 >?
>
> I am not absolutely sure, but yes, I would assume that's one of the
> backends llm can use.

For what it's worth, I have just checked again, and it's actually the
first plugin in the "plugin directory" here:

https://llm.datasette.io/en/stable/plugins/directory.html#plugin-directory

To get *that* plugin working, however, we'd also need to package a shim
library:

https://github.com/abetlen/llama-cpp-python
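
For reference, here's a rough sketch of what the upstream install flow
looks like (commands as documented by the llm-llama-cpp plugin; the
model URL and alias are placeholders, and in Debian the pip-style
install would of course be replaced by the packaged dependencies):

    # sketch of the upstream flow, not the Debian packaging plan
    llm install llm-llama-cpp          # pulls in llama-cpp-python as a dependency
    llm llama-cpp download-model \
      https://example.org/path/to/model.gguf \
      --alias mymodel                  # placeholder URL and alias
    llm -m mymodel "Say hello"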

There's a whole forest of plugins with their own dependency chains in
there; I'd love to get some help packaging those.
-- 
We all pay for life with death, so everything in between should be
free.
                         - Bill Hicks
