[Cloud] Re: LLM services

2025-01-12 Thread Mounir Afifi
Hello, > it would be useful to know particular use cases and needs before we can work on any implementation. I have an example in mind which I have been thinking about for a while now: I have a bot task that creates category trees and adds the leaf categories to pages on arywiki. Currently, I have […]
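
A minimal sketch of what such a category-tree bot task could look like with Pywikibot. The wiki code (arywiki) comes from the message above, but the category and page names are made-up placeholders, not the actual bot's configuration:

    import pywikibot

    # Illustrative sketch only: tag a page with the leaf categories of a
    # category tree on arywiki. Category/page names are placeholders.
    site = pywikibot.Site('ary', 'wikipedia')
    root = pywikibot.Category(site, 'Category:ExampleRoot')

    # A "leaf" category is one that has no subcategories of its own.
    leaves = [cat for cat in root.subcategories(recurse=True)
              if not list(cat.subcategories())]

    page = pywikibot.Page(site, 'ExamplePage')
    for cat in leaves:
        if cat not in page.categories():
            page.text += '\n[[%s]]' % cat.title()
    page.save(summary='Bot: add leaf categories (example)')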

[Cloud] Re: LLM services

2025-01-10 Thread Platonides
Huji wrote: > Even free, lightweight LLMs (like LLaMa) could be helpful. LLaMa itself is not under a free license; let's call it an "almost-free license". So, I'm not sure if it would be acceptable to run it, given the requirement that > All code in the Tools project must be published under an OSI approved […]

[Cloud] Re: LLM services

2025-01-10 Thread Siddharth VP
Since it's now possible in Toolforge to expose services at custom ports[0], I think it should already be possible for someone to host an LLM service for other tools to use. I could be wrong though, as LLMs might also require significant memory/CPU resources and/or system software not available to […]
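
If such a service existed, other tools could call it over plain HTTP. A minimal sketch, assuming a made-up hostname, path, and JSON shape (nothing in this thread describes a real endpoint):

    import requests

    # Hypothetical endpoint: the hostname, path, and request/response
    # shape are assumptions for illustration, not an existing service.
    LLM_URL = 'https://llm-demo.toolforge.org/v1/generate'

    resp = requests.post(
        LLM_URL,
        json={'prompt': 'Suggest categories for this article: ...',
              'max_tokens': 64},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())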

[Cloud] Re: LLM services

2025-01-10 Thread Francesco Negri
> A phabricator ticket would also be useful, to help inform future actions. > Would you mind creating one? We have https://phabricator.wikimedia.org/T336905 that seems to cover the same topic. -- Francesco Negri (he/him) -- IRC: dhinus Site Reliability Engineer, Cloud Services team, Wikimedia […]

[Cloud] Re: LLM services

2025-01-10 Thread Arturo Borrero Gonzalez
On 1/10/25 05:21, Huji Lee wrote: Hi all, Are there any LLMs available on Cloud services, or are there any plans for them? I think there are many possible use cases. Even free, lightweight LLMs (like LLaMa) could be helpful, e.g. in bots that review edits, categorize pages, etc. Hi Huji, […]