Re: NEW emulators/llama.cpp b4589

2025-01-30 Thread Stuart Henderson
On 2025/01/30 10:03, Chris Cappuccio wrote: > Stuart Henderson [s...@spacehopper.org] wrote: > > > > I'd be happy with misc. If we end up with dozens of related ports then > > maybe a new category makes sense but misc seems to fit and is not over-full. > > Ok, here's a new spin for misc/llama.cpp …

Re: NEW emulators/llama.cpp b4589

2025-01-30 Thread Stuart Henderson
On 2025/01/30 13:27, Dave Voutila wrote: > Stuart Henderson writes: > > > On 2025/01/30 08:15, Dave Voutila wrote: > >> > >> FWIW we should be able to include Vulkan support as it's in ports. I've > >> played with llama.cpp locally with it, but I don't have a GPU that's > >> worth a damn to see if it's an improvement over pure CPU-based inferencing …

Re: NEW emulators/llama.cpp b4589

2025-01-30 Thread Dave Voutila
Stuart Henderson writes: > On 2025/01/30 08:15, Dave Voutila wrote: >> >> FWIW we should be able to include Vulkan support as it's in ports. I've >> played with llama.cpp locally with it, but I don't have a GPU that's >> worth a damn to see if it's an improvement over pure CPU-based >> inferencing …

Re: NEW emulators/llama.cpp b4589

2025-01-30 Thread Chris Cappuccio
Stuart Henderson [s...@spacehopper.org] wrote: > > I'd be happy with misc. If we end up with dozens of related ports then > maybe a new category makes sense but misc seems to fit and is not over-full. Ok, here's a new spin for misc/llama.cpp with your patch applied. Using this model an AMD EPYC …

Re: NEW emulators/llama.cpp b4589

2025-01-30 Thread Stuart Henderson
On 2025/01/30 08:15, Dave Voutila wrote: > > FWIW we should be able to include Vulkan support as it's in ports. I've > played with llama.cpp locally with it, but I don't have a GPU that's > worth a damn to see if it's an improvement over pure CPU-based > inferencing. Makes sense, though I think i…
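For context on the Vulkan discussion above: upstream llama.cpp exposes its Vulkan backend as a CMake option, so a port could flip it on at configure time. A minimal sketch of a manual build (not the actual port's build recipe; the exact flags the OpenBSD port passes are not shown in this thread):

```shell
# Build llama.cpp with the Vulkan backend enabled.
# GGML_VULKAN is upstream's CMake switch for the Vulkan compute backend;
# it requires Vulkan headers/loader and a SPIR-V-capable toolchain from ports.
cmake -B build -DGGML_VULKAN=ON
cmake --build build --parallel

# Without a capable GPU (as Dave notes), the CPU backend is the fallback:
cmake -B build-cpu -DGGML_VULKAN=OFF
cmake --build build-cpu --parallel
```

Whether Vulkan beats pure CPU inference depends entirely on the GPU, which is exactly the open question in the thread.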

Re: NEW emulators/llama.cpp b4589

2025-01-30 Thread Chris Cappuccio
Stuart Henderson [s...@spacehopper.org] wrote: > > I don't understand why it's in emulators. Perhaps misc would make sense? > I guess either misc or even a new category, like ml. Torch would come next, and there are plenty of other pieces that really don't fit in any other category except misc.

Re: NEW emulators/llama.cpp b4589

2025-01-30 Thread Dave Voutila
Stuart Henderson writes: > On 2025/01/29 21:12, Chris Cappuccio wrote: >> This is a simple port that works well out-of-the-box. It's kind of an >> emulator, I think. >> >> It has some optional python scripts that will need numpy and torch. >> >> I'm not sure the right way to handle the version …

Re: NEW emulators/llama.cpp b4589

2025-01-30 Thread Stuart Henderson
On 2025/01/29 21:12, Chris Cappuccio wrote: > This is a simple port that works well out-of-the-box. It's kind of an > emulator, I think. > > It has some optional python scripts that will need numpy and torch. > > I'm not sure the right way to handle the version since it's "b4589" and not > 1…
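On the version question raised above: upstream llama.cpp tags releases as build numbers ("b4589") rather than semantic versions. One common OpenBSD-ports approach to such schemes is to fetch by GitHub tag and derive a monotonically increasing package version from it. A hypothetical Makefile fragment (the variable values and the `:S` transform are illustrative, not the actual port's choices, which the thread does not show):

```makefile
# Hypothetical sketch: fetch the GitHub tag directly and strip the
# leading "b" so pkg tools see a plain, increasing numeric version.
GH_ACCOUNT =	ggerganov
GH_PROJECT =	llama.cpp
GH_TAGNAME =	b4589

# "llama.cpp-4589"; later builds (b4590, ...) sort correctly.
PKGNAME =	llama.cpp-${GH_TAGNAME:S/b//}
```

Because the build number only ever increases, this avoids the epoch/ordering problems that arise when a non-semantic upstream tag is shoehorned into an x.y.z version.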