Our system resources are:
The OS (running as a Docker container) has 4 CPUs and 32 GB RAM, and we gave Solr a 12 GB Java heap.
If I understand you correctly, this situation is not like what you had, @Gaikwad,
correct? (We should also have enough physical memory for all of our containers
without running into problems.)
Hello,
I need to configure 5 analyzers for 5 different languages. For each one,
a Synonym Graph Filter will be configured with a large synonyms file
(about 6 MB). Since it is not recommended to store large amounts of data
in ZooKeeper, is there a way to store the synonyms outside ZooKeeper? Maybe
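For reference, a Synonym Graph Filter of this kind is typically wired into the analysis chain as sketched below. The field type name, tokenizer choice, and the file name synonyms_de.txt are placeholder assumptions, not taken from the original message:

```xml
<!-- sketch: one per-language field type; SynonymGraphFilterFactory is
     usually applied at query time only, so the index stays smaller -->
<fieldType name="text_de_syn" class="solr.TextField" positionIncrementGap="100">
  <analyzer type="index">
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <!-- "synonyms_de.txt" is a placeholder for one of the five files -->
    <filter class="solr.SynonymGraphFilterFactory"
            synonyms="synonyms_de.txt" ignoreCase="true" expand="true"/>
  </analyzer>
</fieldType>
```

By default the synonyms file is resolved from the config set, which in SolrCloud lives in ZooKeeper, hence the question about keeping a 6 MB file elsewhere.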
Hi,
I'm not familiar with the German analysis chain options, but perhaps you
can use a copyField at index time to create a new normalised text field
with a different analysis chain that strips umlauts and other modifiers,
so that all variants of 'a' become just plain 'a'. You could then use this field
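A minimal sketch of that copyField approach, assuming a hypothetical source field title_de (all field and type names here are placeholders); solr.ASCIIFoldingFilterFactory does the umlaut/diacritic stripping:

```xml
<!-- a folded field type: ä, á, à, etc. all index as plain "a" -->
<fieldType name="text_folded" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <filter class="solr.ASCIIFoldingFilterFactory"/>
  </analyzer>
</fieldType>

<!-- copy the original field into the normalised one at index time -->
<field name="title_de_folded" type="text_folded" indexed="true" stored="false"/>
<copyField source="title_de" dest="title_de_folded"/>
```

Queries that should ignore modifiers can then target title_de_folded while the original field keeps its language-specific analysis.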
A trick is to put that file in
/var/solr/data/my-Collection/conf/myBigDictionary.txt and remove it from
ZooKeeper.
The ResourceLoader will/should then fall back to the local file system.
Would be a nice feature to be able to upload a config set with "bin/solr
upconfig" and have large files automatically
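A CLI sketch of that workaround, assuming a config set named myConfig and the default embedded ZooKeeper address; both names, the host, and the reload step are assumptions, not from the original message:

```
# remove the large file from the config set stored in ZooKeeper
bin/solr zk rm /configs/myConfig/myBigDictionary.txt -z localhost:9983

# place it on the local file system of each Solr node instead
cp myBigDictionary.txt /var/solr/data/my-Collection/conf/

# reload the collection so the ResourceLoader re-resolves the file
curl "http://localhost:8983/solr/admin/collections?action=RELOAD&name=my-Collection"
```

The file then has to be kept in sync on every node by hand, which is the trade-off for keeping it out of ZooKeeper.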
Solr version is 8.4
I'm trying to use the export handler through SolrJ:
CloudSolrClient cloudSolrClient = ...
SolrQuery q = new SolrQuery();
q.setParam("q", "ts:[1612368422911 TO 1612370422911]");
q.setParam("sort", "ts asc");
q.setParam("fl", "ts");
q.setRequestHandler("/export");
QueryResponse rsp = cloudSolrClient.query(q);
I need your help ...
In Solr 7.5 I was able to get hold of Java objects by using "Class.forName"/
"getClass.forName":
#set($sysEnv=$engine.getClass.forName('java.lang.System'))
#set($sysEnv=$engine.class.forName('java.lang.System'))
In Solr 8.11.1 this no longer works, and I don't know why.