On Sun, Jan 16, 2011 at 12:32 PM, kunal ghosh <kunal...@gmail.com> wrote:
> Hi all,
> I found numpy.memmap to be very suitable when matrices larger than the
> physical memory are required.
>
> 1. included in the standard numpy installation
> 2. very low learning curve

Interesting package. I just went through the documentation. It uses the
"mmap" system call to map the file into the process's virtual address
space. This approach can also get clumsy if your matrices are too huge,
and you may run into all kinds of paging problems if your system doesn't
handle virtual memory well. It should be fine on Linux, I think.

> pyTables seems to be more suitable, but I somehow found the learning
> curve too steep. Also, pyTables needs a lot of initialization before
> anything can be done with it, as compared to memmap.

At first look, PyTables might be better than memmap, since it stores the
data compressed in disk files, so you won't hit the memory limit.

> The above reason made me use memmap over pyTables.

--
--Anand
_______________________________________________
BangPypers mailing list
BangPypers@python.org
http://mail.python.org/mailman/listinfo/bangpypers
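
For reference, a minimal sketch of the numpy.memmap approach discussed
above. The file name, shape, and chunk size here are invented for
illustration; in real use the matrix would be larger than physical RAM.

import numpy as np

# Hypothetical backing file and shape, just for illustration.
fname = "big_matrix.dat"
rows, cols = 10000, 1000

# mode="w+" creates (or overwrites) the file on disk; the OS pages
# pieces of it in and out as they are touched.
m = np.memmap(fname, dtype="float64", mode="w+", shape=(rows, cols))

# Fill it in chunks so only a slice needs to be resident at a time.
step = 1000
for start in range(0, rows, step):
    m[start:start + step, :] = np.random.rand(step, cols)

m.flush()   # push dirty pages back to disk
del m       # drop the mapping

# Reopen read-only later; dtype and shape must be supplied again.
m = np.memmap(fname, dtype="float64", mode="r", shape=(rows, cols))
print(m[0, :5])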
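
And a rough PyTables equivalent, showing both the extra setup kunal
mentions and the on-disk compression Anand points to. The file name,
shape, and compression settings are again only illustrative; newer
PyTables releases spell the calls open_file/create_carray, while the
2.x series used openFile/createCArray.

import numpy as np
import tables   # PyTables

# Hypothetical file and shape; pretend it is bigger than RAM.
rows, cols = 10000, 1000

h5 = tables.open_file("big_matrix.h5", mode="w")
filters = tables.Filters(complevel=5, complib="zlib")   # on-disk compression
carray = h5.create_carray(h5.root, "matrix",
                          atom=tables.Float64Atom(),
                          shape=(rows, cols),
                          filters=filters)

# Fill it chunk by chunk; compressed chunks go straight to disk.
step = 1000
for start in range(0, rows, step):
    carray[start:start + step, :] = np.random.rand(step, cols)

print(carray[0, :5])    # slicing reads back only the chunks it needs
h5.close()

Reading a CArray back is plain slicing, much like the memmap, but the
data on disk stays compressed the whole time.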