The point is that, processing sequentially, the GC gets to remove stale entries, so only about 3000 records are in memory at any one time; processing in parallel, all 90,000 can be in memory at the same time.
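The contrast above can be sketched in a few lines of Clojure. This is a minimal illustration, not the poster's actual code: `load-day` and `process-day` are hypothetical stand-ins for loading one day's ~3000 records and reducing them to a small summary.

```clojure
;; Hypothetical helpers: load-day yields one day's ~3000 records,
;; process-day collapses them into a small summary value.
(defn load-day [d] (range 3000))
(defn process-day [d] (reduce + (load-day d)))

;; Sequential: map realizes one day at a time. Once a day's summary is
;; produced, that day's 3000 records are unreachable and the GC can
;; reclaim them, so roughly one day's records are live at any moment.
(def monthly-summaries (doall (map process-day (range 30))))

;; Parallel: pmap keeps (+ 2 available-processors) days in flight at
;; once, so several days' records are live simultaneously. If each
;; day's data is large, this alone can exhaust the heap.
(def monthly-summaries-par (doall (pmap process-day (range 30))))
```

Note that `pmap` is only semi-lazy: it runs ahead of consumption by a fixed window tied to the processor count, which is exactly why switching `map` to `pmap` multiplies the peak working set.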
On 6 Aug 2011, at 21:34, Shoeb Bhinderwala wrote:
> You didn't understand my problem. The exact same code throws an
> out-of-memory error when I change map to pmap.
> My monthly data is evenly divided into 30 sets. For example: total
> monthly data = 90,000 records, daily data size for each day = 3000
> records. I am trying to achieve a performance gain by processing the
> days in parallel.
Just a guess: if your daily data is huge, you will be loading the data for
only one day at a time when using map, but you will be loading the data for
multiple days at once (equal to the number of parallel threads) when using
pmap, and maybe this is the cause of the problem.
Sunil.
On Sat, Aug 6, 2011 at 11:40 PM, Shoeb Bhinderwala wrote: