Hello everyone,

I want to try out the LLM-based "Text-to-Vector" feature that was introduced
in Solr 9.8, but I've discovered that there is no support yet for self-hosted
LLMs served via Ollama.
As I am fairly new to contributing to Solr, I want to check whether this has
been discussed before, and whether I am missing anything regarding support for
Ollama as an LLM provider (which, AFAIK, is already supported by langchain4j
itself), before I open a Jira ticket.
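For reference, this is roughly what I mean on the langchain4j side; it is just
a sketch against langchain4j's Ollama embedding module, not anything
Solr-specific, and the model name and local URL below are only example
assumptions on my part:

    import dev.langchain4j.data.embedding.Embedding;
    import dev.langchain4j.model.embedding.EmbeddingModel;
    import dev.langchain4j.model.ollama.OllamaEmbeddingModel;

    public class OllamaEmbeddingSketch {
        public static void main(String[] args) {
            // Ollama's default local endpoint; "nomic-embed-text" is just an
            // example embedding model pulled into Ollama beforehand
            EmbeddingModel model = OllamaEmbeddingModel.builder()
                    .baseUrl("http://localhost:11434")
                    .modelName("nomic-embed-text")
                    .build();

            // Embed a piece of text and inspect the resulting vector size
            Embedding embedding = model.embed("hello from solr").content();
            System.out.println("dimension = " + embedding.dimension());
        }
    }

So my question is essentially whether wiring this kind of Ollama-backed
embedding model into the Text-to-Vector feature is already possible or planned.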

Best regards
- Max
