> Whoa ... that is a whole lot more data than I'm used to seeing in
TopMemoryContext. How many stats dump lines are there exactly (from
here to the crash report)? 

OK, I didn't realize that would be surprising. There are about 600 stats dump 
lines in between.


>> The spatial database that the script is using is quite large (about 4
>> GB). So I think making a self-contained test case would be the last
>> resort.

> FWIW, I doubt that the content of the database is the key point here;
> you could probably generate a test case with relatively little data,
> or maybe a lot of easily-created dummy data. However stripping it down
> might require more insight into the nature of the bug than we have at
> this point.

I did a test in a small area (the street network in a single county) before, 
and it worked without crashing the server. In that test there were about 600 
records (addresses) to be processed, while in the current case there are 
about 12K records. Another difference is that the current case uses a larger 
base relation (the one I mentioned in my previous email) that covers a whole 
state. I'm not sure whether it is the number of records to be processed or the 
size of the base relation that causes the crash.
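If it would help narrow that down, I could try generating scaled-up dummy data 
rather than shipping the real database. Something along these lines (assuming 
PostGIS-style geometry functions; the table and column names are just 
placeholders, not my real schema):

  -- Hypothetical dummy table; the SRID and coordinate range would need to
  -- be adjusted to match the real data's extent.
  CREATE TABLE dummy_addresses AS
  SELECT i AS id,
         ST_SetSRID(ST_MakePoint(-80 + random() * 10,
                                  35 + random() * 10), 4326) AS geom
  FROM generate_series(1, 12000) AS i;

  -- An index comparable to the one on the real base relation.
  CREATE INDEX dummy_addresses_geom_idx ON dummy_addresses USING GIST (geom);

  ANALYZE dummy_addresses;

Varying the generate_series() upper bound would at least tell us whether the 
record count alone is enough to trigger the crash.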



