Only a couple of weeks ago I had to deal with this.

Adjust the memory limit as follows, although you might not want 4000 MB; that is quite high:

memory.limit(size = 4000)
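
(One caveat: memory.limit() only has an effect on Windows builds of R; on 64-bit Ubuntu it will not help. A cheaper route is sketched below. The file name "data.txt", the tab separator, and the 8-integer/3-string column split are all guesses at the file; the point is that telling read.table the column types up front lets it allocate once instead of growing buffers and re-guessing types.)

## Sketch only: colClasses skips per-column type guessing, nrows lets
## read.table allocate in one pass (an over-estimate is fine), and
## comment.char = "" disables comment scanning, which also saves time.
x <- read.table("data.txt", header = TRUE, sep = "\t",
                colClasses = c(rep("integer", 8), rep("character", 3)),
                nrows = 5e6, comment.char = "")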

Simon.

----- Original Message ----- From: "Edwin Sendjaja" <edw...@web.de>
To: "Simon Pickett" <simon.pick...@bto.org>
Cc: <r-help@r-project.org>
Sent: Tuesday, January 06, 2009 12:24 PM
Subject: Re: [R] Large Dataset


Hi Simon,

Thank for your reply.
I have read ?Memory but I dont understand how to use. I am not sure if that
can solve my problem. Can you tell me more detail?

Thanks,

Edwin

type

?Memory

into R and that will explain what to do...
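
For example, these are the relevant help pages (pointers only; ?Memory and ?object.size ship with base R/utils):

?Memory          # overview of R's memory management
?memory.limit    # Windows-only cap on R's address space
?object.size     # how much memory a single object takes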

S
----- Original Message -----
From: "Edwin Sendjaja" <edw...@web.de>
To: <r-help@r-project.org>
Sent: Tuesday, January 06, 2009 11:41 AM
Subject: [R] Large Dataset

> Hi all,
>
> I have a 3.1 GB dataset (with 11 columns and lots of data in int and
> string).
> If I use read.table, it takes very long. It seems that my RAM is not big
> enough (it gets overloaded). I have 3.2 GB RAM and 7 GB swap, 64-bit Ubuntu.
>
> Is there a better solution for reading large data into R? I have seen that
> people suggest using the bigmemory or ff packages, but they seem very
> complicated, and I don't know how to start with them.
>
> I have tried to use bigmemory, but I got some kind of errors, so I
> gave up.
>
>
> Can someone give me a simple example of how to use ff or bigmemory? Or
> maybe a better solution?
>
>
>
> Thank you in advance,
>
>
> Edwin
>
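
For what it's worth, a minimal sketch of both packages. Every file name, separator, and column type below is an assumption about the data; note also that a bigmemory big.matrix stores a single atomic type, so a file mixing int and string columns is a more natural fit for ff's ffdf:

## ff: read a delimited file chunk-wise into an on-disk ffdf data frame.
library(ff)
d <- read.table.ffdf(file = "data.txt", header = TRUE, sep = "\t")
dim(d)  # rows/columns, held on disk rather than in RAM

## bigmemory: file-backed matrix, but one type for ALL columns, so the
## string columns would have to be recoded to integers beforehand.
library(bigmemory)
m <- read.big.matrix("ints_only.txt", header = TRUE, sep = "\t",
                     type = "integer",
                     backingfile = "data.bin",
                     descriptorfile = "data.desc")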




______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
