On Sun, Jan 16, 2011 at 12:32 PM, kunal ghosh wrote:
> Hi all,
> I found numpy.memmap
> to be very suitable when matrices larger than the physical memory are
> required.
>
> 1. included in standard numpy installation
> 2. very low learning curve.
>
>
Interesting package. Just went through the d
Hi all,
I found numpy.memmap
to be very suitable when matrices larger than physical memory are
required.
1. included in the standard numpy installation
2. very low learning curve
PyTables seems to be more suitable, but I somehow found the learning curve
too steep. Also PyTables needs a lot of in
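For anyone following along, a minimal sketch of the memmap workflow being described; the filename, dtype, and shape here are arbitrary choices for illustration, not from the original post:

```python
import numpy as np

# Create a disk-backed array; only the slices you touch live in RAM.
# Filename, dtype and shape are hypothetical, for illustration only.
shape = (1000, 1000)
mm = np.memmap("big_matrix.dat", dtype=np.float64, mode="w+", shape=shape)

# Fill the array in row blocks so only a small slab is resident at once.
for i in range(0, shape[0], 100):
    mm[i:i + 100, :] = np.random.rand(100, shape[1])
mm.flush()  # make sure pending writes reach disk

# Reopen read-only; the OS pages data in lazily on access.
ro = np.memmap("big_matrix.dat", dtype=np.float64, mode="r", shape=shape)
```

The same slicing and arithmetic that work on an ndarray work on the memmap, which is what makes the learning curve so shallow.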
Thanks Santosh,
This Stack Overflow thread indeed discusses the exact same problem I have.
Wonder how I missed it :) in my preliminary searches.
Thanks again!
On Sat, Jan 15, 2011 at 11:12 PM, Santosh Rajan wrote:
> Hope this helps
> http://stackoverflow.com/questions/1053928/python-numpy-very-large-matrices
Hope this helps
http://stackoverflow.com/questions/1053928/python-numpy-very-large-matrices
On Sat, Jan 15, 2011 at 10:11 PM, kunal ghosh wrote:
> Hi all,
> while implementing Locality Preserving Projections,
> at one point I have to perform X L X.transpose()
> these matrices are large (32256 x
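For context, one way to handle a product like X L X.transpose() when the result is too big for RAM is to stream it into a memmap block by block. The sketch below is illustrative only: it assumes X is features-by-samples as in LPP (where L is a graph Laplacian over samples), and the shapes and filename are made up; the real feature dimension of 32256 would make the result matrix far too large to hold in memory.

```python
import numpy as np

# Hypothetical small shapes for illustration; in the original problem
# the feature dimension is 32256, so M = X @ L @ X.T would be
# 32256 x 32256 and is best accumulated in a disk-backed memmap.
d, n = 300, 40                      # d = feature dim, n = sample count
X = np.random.rand(d, n)            # data matrix, features x samples
L = np.random.rand(n, n)            # e.g. a graph Laplacian in LPP

Y = X @ L                           # d x n intermediate fits in RAM

M = np.memmap("XLXt.dat", dtype=np.float64, mode="w+", shape=(d, d))
block = 100
for i in range(0, d, block):
    # materialize only one (block x d) slab of the result at a time
    M[i:i + block, :] = Y[i:i + block, :] @ X.T
M.flush()
```

With the real sizes, Y would also be computed in column blocks, but the row-blocked second product is the part that keeps the huge output off the heap.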