Hi,
Here's the answer in the log: the clean=true parameter caused a
deleteByQuery=*:* (delete everything) before the import ran.

...&clean=true&...
...deleteByQuery=*:* (-17845900...
On Thu, Dec 7, 2023 at 6:42 AM Vince McMahon
wrote:
> Hi,
>
> I have indexed *915000* documents into solr. I double-checked with my
> database, and it's the same number of lines/documents. The solr query co
Thanks for showing me where to look in the log. It is super useful.
Thank you so much. Love it.
I indexed again with a full-import using
&clean=false
&commit=true
The log reports
(264013 adds)
but again the count is 5000 documents:
/select params={q=*:*&indent=true&q.op=OR&rows=0&_=1701960293895} hits=5000
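A full-import with clean=false appends to the existing index, whereas clean=true first issues the deleteByQuery=*:* seen earlier in the log. As a sketch, the request URL can be built like this (the host and core name "mycore" are placeholders, not from the thread):

```python
from urllib.parse import urlencode

# Build a DIH full-import request URL. Host and core name are placeholders;
# the parameters are the ones discussed above. clean=false keeps existing
# documents; clean=true wipes the index (deleteByQuery=*:*) before importing.
def full_import_url(base="http://localhost:8983/solr/mycore",
                    clean=False, commit=True):
    params = {
        "command": "full-import",
        "clean": str(clean).lower(),
        "commit": str(commit).lower(),
    }
    return f"{base}/dataimport?{urlencode(params)}"

print(full_import_url(clean=False))
# → http://localhost:8983/solr/mycore/dataimport?command=full-import&clean=false&commit=true
```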
On 12/7/23 07:56, Vince McMahon wrote:
{
  "responseHeader": {
    "status": 0,
    "QTime": 0
  },
  "initArgs": [
    "defaults",
    [
      "config",
      "db-data-config.xml"
    ]
  ],
  "command": "status",
  "status": "idle",
  "importResponse": "",
  "statusMessages": {
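The status payload above can also be checked programmatically; a minimal sketch, parsing a trimmed copy of that response (in practice the string would come from an HTTP GET against the /dataimport?command=status endpoint):

```python
import json

# Trimmed copy of the DIH status response shown above; in practice this
# would be fetched from /dataimport?command=status over HTTP.
raw = '''
{
  "responseHeader": {"status": 0, "QTime": 0},
  "command": "status",
  "status": "idle",
  "importResponse": ""
}
'''

resp = json.loads(raw)
# "idle" means no import is currently running; "busy" means one is in progress.
print(resp["status"])  # → idle
```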
Thanks, Shawn.
DIH full-import by itself works very well. It's a bummer that my
incremental load runs into the millions. When specifying batchSize on the
data source, delta-import honors that batch size only once, for the
first fetch, then loops through the rest by hundreds per second. That doesn't
help.
Hi Vince,
It shouldn't take too much time to write a simple loop in your favorite
language which fetches rows from the db and sends them to Solr over HTTP to
the /update handler. IMO it's easier than trying to figure out DIH's
particularities. Especially in the future, if you need to modify the doc
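A minimal sketch of such a loop, assuming Python: sqlite3 stands in for the real database, and the table, column names, and Solr URL are placeholders. Unlike delta-import, the client controls the batch size on every fetch, and Solr's /update handler accepts a JSON array of documents.

```python
import json
import sqlite3
import urllib.request

# Placeholders: real code would point at the actual database and Solr core.
SOLR_UPDATE = "http://localhost:8983/solr/mycore/update?commit=true"
BATCH = 1000  # batch size is honored on every fetch, not just the first

def index_all(conn, post=urllib.request.urlopen):
    """Fetch rows in batches and POST each batch to Solr's /update handler.

    `post` is injectable so the loop can be exercised without a live Solr.
    Returns the total number of documents sent.
    """
    cur = conn.execute("SELECT id, title FROM docs ORDER BY id")
    total = 0
    while True:
        rows = cur.fetchmany(BATCH)
        if not rows:
            break
        docs = [{"id": r[0], "title": r[1]} for r in rows]
        req = urllib.request.Request(
            SOLR_UPDATE,
            data=json.dumps(docs).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        post(req)  # /update accepts a JSON array of documents
        total += len(docs)
    return total
```

Because `post` is injectable, the batching logic can be tested against an in-memory database before pointing it at a live Solr instance.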