Huji wrote:
> Even free, lightweight LLMs (like LLaMa) could be helpful
LLaMa itself is not under a free license. Let's call it an "almost-free
license".
So, I'm not sure it would be acceptable to run it, given the requirement
that
> All code in the Tools project must be published under an OSI approved license.
On 1/10/25 05:21, Huji Lee wrote:
Hi all,
Are there any LLMs available on Cloud services, or are there any plans for them?
I think there are many possible use cases. Even free, lightweight LLMs (like
LLaMa) could be helpful, e.g. in bots that review edits, categorize pages, etc.
Hi Huji,
Since it's now possible in Toolforge to expose services at custom ports[0],
I think it should already be possible for someone to host an LLM service
for other tools to use.
I could be wrong though, as LLMs might also require significant memory/CPU
resources and/or system software not available in Toolforge.
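If such a service existed, another tool could call it over plain HTTP. The sketch below is purely illustrative: the tool hostname, port, and JSON request shape are assumptions, not a real Toolforge API.

```python
import json
import urllib.request

# Hypothetical: a tool exposing an LLM completion endpoint on a custom
# Toolforge port. Hostname, port, and payload shape are made up.
def build_request(prompt: str,
                  url: str = "http://llm.toolforge.org:8000/v1/complete"):
    body = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})

req = build_request("Categorize this page")
# urllib.request.urlopen(req) would perform the actual HTTP POST;
# omitted here since the service is hypothetical.
```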
The OAuth identity endpoint (the Special:OAuth/identify special page for
OAuth 1, the oauth2/resource/profile REST API endpoint for OAuth 2) used to
return an incorrectly formatted JSON web token, where the value of the 'sub'
field (the user's CentralAuth central user ID) was an integer, rather than a
string.
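For clients affected by this, a minimal sketch of the mismatch and a defensive workaround (the claim values below are made up for illustration; the token is unsigned to keep the example self-contained):

```python
import base64
import json

def b64url_encode(obj) -> str:
    """base64url-encode a JSON object, stripping padding as JWTs do."""
    return base64.urlsafe_b64encode(json.dumps(obj).encode()).decode().rstrip("=")

def jwt_payload(token: str) -> dict:
    """Decode a JWT's payload segment without verifying the signature."""
    segment = token.split(".")[1]
    segment += "=" * (-len(segment) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(segment))

# Reproduce the bug: 'sub' arrives as an integer, although RFC 7519
# requires the 'sub' claim to be a string.
token = ".".join([
    b64url_encode({"alg": "none", "typ": "JWT"}),
    b64url_encode({"sub": 12345, "username": "Example"}),
    "",  # empty signature segment for this unsigned example
])

claims = jwt_payload(token)
# Defensive client-side workaround: coerce 'sub' to the string the spec mandates
sub = str(claims["sub"])
```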
This didn't happen! Further testing suggests that my original assertion
("now reaching public IPs works as expected") is not as true as I
thought it was.
On 1/6/25 9:26 AM, Andrew Bogott wrote:
Tl;dr:
Please let me know if you encounter changes in network connectivity
between cloud-vps hosts and public IPs.
> A phabricator ticket would also be useful, to help inform future actions.
> Would
> you mind creating one?
We have https://phabricator.wikimedia.org/T336905 that seems to cover
the same topic.
--
Francesco Negri (he/him) -- IRC: dhinus
Site Reliability Engineer, Cloud Services team
Wikimedia