Hello everyone,

We are currently using Solr 8.7 heatmap faceting to render a server-generated
heatmap.
Our issue is the following:

Solr often returns this kind of exception:

"java.lang.IllegalArgumentException: Too many cells (743 x 261) for level 5
shape Rect(minX=-18.610839843750004, maxX=13.996582031250002,
minY=40.53050177574321, maxY=51.97134580885172)"

Here is the JSON facet query:

{
  "coord": {
    "type": "heatmap",
    "field": "coord_fieldname",
    "geom": "[\"-18.610839843750004 40.53050177574321\" TO \"13.996582031250002 51.97134580885172\"]",
    "distErrPct": 0.025
  }
}

The same query works if we set distErrPct to 0.03.
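Our rough understanding of why such a small change in distErrPct matters, sketched in Java (this assumes the field uses the default geohash-style prefix tree; the cell sizes and the "half the bbox diagonal" heuristic below are our approximation of what Lucene does, not its exact code): distErrPct is converted to a target cell size relative to the diagonal of the "geom" rectangle, and that target selects a grid level. Crossing a level boundary multiplies the cell count by roughly 32, which is why 0.025 blows up while 0.03 does not:

```java
// Back-of-the-envelope reproduction of why distErrPct=0.025 selects grid
// level 5 while 0.03 selects level 4 for our bounding box. This mirrors
// (but is NOT) Lucene's SpatialArgs.calcDistanceFromErrPct and
// GeohashPrefixTree logic; geohash-style cell sizes are an assumption.
public class HeatmapLevelSketch {

  // Geohash cell dimensions in degrees per level (level 1 = 45 deg, then
  // alternately divided by 4 and 8 in each dimension).
  static final double[] LON_WIDTH  = {45.0, 11.25, 1.40625, 0.3515625, 0.0439453125};
  static final double[] LAT_HEIGHT = {45.0, 5.625, 1.40625, 0.17578125, 0.0439453125};

  // Approximate geodesic diagonal of the bbox, expressed in degrees: the
  // longitude span is shrunk by cos(mean latitude).
  static double diagonalDegrees(double minX, double minY, double maxX, double maxY) {
    double meanLatRad = Math.toRadians((minY + maxY) / 2);
    double dLon = (maxX - minX) * Math.cos(meanLatRad);
    double dLat = maxY - minY;
    return Math.sqrt(dLon * dLon + dLat * dLat);
  }

  // Heuristic: allowed error = distErrPct * (diagonal / 2).
  static double distErr(double diagonalDeg, double distErrPct) {
    return 0.5 * diagonalDeg * distErrPct;
  }

  // Smallest level whose cells are at least as fine as the allowed error.
  static int levelFor(double err) {
    for (int i = 0; i < LON_WIDTH.length; i++) {
      if (LON_WIDTH[i] <= err && LAT_HEIGHT[i] <= err) return i + 1;
    }
    return LON_WIDTH.length;
  }

  // Grid cells needed to cover the bbox at a given level.
  static long cellsAtLevel(double minX, double minY, double maxX, double maxY, int level) {
    long cols = (long) Math.ceil((maxX - minX) / LON_WIDTH[level - 1]);
    long rows = (long) Math.ceil((maxY - minY) / LAT_HEIGHT[level - 1]);
    return cols * rows;
  }

  public static void main(String[] args) {
    double minX = -18.610839843750004, maxX = 13.996582031250002;
    double minY = 40.53050177574321,   maxY = 51.97134580885172;
    double diag = diagonalDegrees(minX, minY, maxX, maxY); // roughly 25 degrees
    System.out.println("distErrPct=0.025 -> level " + levelFor(distErr(diag, 0.025)));
    System.out.println("distErrPct=0.030 -> level " + levelFor(distErr(diag, 0.030)));
    System.out.println("cells at level 5: " + cellsAtLevel(minX, minY, maxX, maxY, 5));
  }
}
```

With these numbers the allowed error for 0.025 lands just below the level-4 cell width (~0.35 deg), forcing level 5 and a grid close to the 743 x 261 reported in the exception, while 0.03 lands just above it.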

We found that these lines in the
org.apache.lucene.spatial.prefix.HeatmapFacetCounter class generate this
exception:

public static final int MAX_ROWS_OR_COLUMNS = (int) Math.sqrt(ArrayUtil.MAX_ARRAY_LENGTH);
...
if (columns > MAX_ROWS_OR_COLUMNS || rows > MAX_ROWS_OR_COLUMNS || columns * rows > maxCells) {
  throw new IllegalArgumentException(
      "Too many cells (" + columns + " x " + rows + ") for level " + facetLevel + " shape " + inputRect);
}

However, we don't really understand how these maximum row and column counts
are calculated.
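As far as we can tell (this is our reading of the code, so please correct us), the per-dimension cap exists so that a single int[] of rows * columns counters can be allocated at all, and it is not what fired here; 743 and 261 are both far below it. The branch that actually triggered must be "columns * rows > maxCells", where maxCells is passed in by the caller (Solr appears to use a limit of about 100,000 cells, but that figure is our assumption):

```java
// Sketch of the two limits in HeatmapFacetCounter, to see which one we hit.
public class CellLimitSketch {

  // Per-dimension cap: sqrt of the maximum Java array length, so that a
  // rows*columns int[] of counters is allocatable. Integer.MAX_VALUE stands
  // in for ArrayUtil.MAX_ARRAY_LENGTH here; the sqrt is what matters.
  static int maxRowsOrColumns() {
    return (int) Math.sqrt(Integer.MAX_VALUE); // 46340
  }

  // Total-cell count that is compared against the caller-supplied maxCells.
  static long cells(int columns, int rows) {
    return (long) columns * rows;
  }

  public static void main(String[] args) {
    System.out.println(maxRowsOrColumns()); // 46340: 743 and 261 are well under
    System.out.println(cells(743, 261));    // 193923: over a ~100k maxCells
  }
}
```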
We would like to obtain maximum precision when calculating our heatmap.
Should we adapt distErrPct before each query in order to avoid this kind of
response from Solr? (If so, how?)
Or is there something else we are missing?
Solr succeeds in processing this query for most of the "geom" rectangles we
pass, but fails for some of them. The rectangles we send always have the same
proportions; only the "zoom" level differs.
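One client-side guard we have considered (a sketch only, not a Solr API: it assumes a geohash-style prefix tree and a ~100,000-cell limit, both of which are assumptions on our part): before sending the query, estimate the cell count that a candidate distErrPct would produce for the current "geom" rectangle, and fall back to the next coarser candidate when it would exceed the limit:

```java
// Hypothetical pre-check: pick the finest distErrPct from a candidate list
// whose estimated grid stays under maxCells. Cell sizes assume a
// geohash-style prefix tree; the distErr heuristic approximates Lucene's.
public class DistErrPctChooser {

  static final double[] LON_W = {45.0, 11.25, 1.40625, 0.3515625, 0.0439453125, 0.010986328125};
  static final double[] LAT_H = {45.0, 5.625, 1.40625, 0.17578125, 0.0439453125, 0.0054931640625};

  // Estimated cell count if Solr picks the grid level implied by distErrPct.
  static long estimateCells(double minX, double minY, double maxX, double maxY,
                            double distErrPct) {
    double meanLatRad = Math.toRadians((minY + maxY) / 2);
    double dLon = (maxX - minX) * Math.cos(meanLatRad);
    double dLat = maxY - minY;
    double distErr = 0.5 * Math.sqrt(dLon * dLon + dLat * dLat) * distErrPct;
    int level = LON_W.length;
    for (int i = 0; i < LON_W.length; i++) {
      if (LON_W[i] <= distErr && LAT_H[i] <= distErr) { level = i + 1; break; }
    }
    long cols = (long) Math.ceil((maxX - minX) / LON_W[level - 1]);
    long rows = (long) Math.ceil((maxY - minY) / LAT_H[level - 1]);
    return cols * rows;
  }

  // Candidates sorted finest-first; return the first one that fits, or the
  // coarsest as a last resort.
  static double choose(double minX, double minY, double maxX, double maxY,
                       double[] candidates, long maxCells) {
    for (double pct : candidates) {
      if (estimateCells(minX, minY, maxX, maxY, pct) <= maxCells) return pct;
    }
    return candidates[candidates.length - 1];
  }

  public static void main(String[] args) {
    // For the failing rectangle from the exception, 0.025 is rejected and
    // 0.03 is the finest candidate that fits under 100,000 cells.
    double pct = choose(-18.610839843750004, 40.53050177574321,
                        13.996582031250002, 51.97134580885172,
                        new double[] {0.01, 0.025, 0.03, 0.05}, 100_000L);
    System.out.println("chosen distErrPct = " + pct);
  }
}
```

Since our rectangles keep the same proportions and only the zoom varies, this would give the finest distErrPct each zoom level can afford instead of one fixed value. Does that sound like the intended way to use distErrPct, or is there a server-side knob we should use instead?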

Thanks for your answers!
