On Fri, Feb 29, 2008 at 8:27 AM, Liviu Andronic <[EMAIL PROTECTED]> wrote:
> On 2/28/08, Gabor Grothendieck <[EMAIL PROTECTED]> wrote:
> >  The sqldf package can read a subset of rows and columns (actually the
> >  result of any sql operation) from a file larger than R can otherwise
> >  handle.  It will automatically set up a temporary SQLite database for
> >  you, load the file into the database without going through R, extract
> >  just the data you want into R, and then automatically delete the
> >  database.  All this can be done in 2 lines of code.
>
> Is it realistic to use this approach for datasets as big as 30-40 GB?

The SQLite site says SQLite is appropriate up to a few dozen gigabytes.
http://www.sqlite.org/whentouse.html

The only way to really know is to try it with your data.  Since it does not
involve much code, it shouldn't take long to prepare a test.
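
For concreteness, here is a minimal sketch of such a test using sqldf's
read.csv.sql (the file name bigdata.csv and the columns x and y are
placeholders; adjust the sql to your own data):

  library(sqldf)

  # Load the csv file into a temporary SQLite database without going
  # through R, run the query (the file is referred to as 'file' in the
  # sql), return the result as a data frame, then delete the database.
  DF <- read.csv.sql("bigdata.csv",
                     sql = "select x, y from file where x > 100")

Only the rows and columns named in the select are ever brought into R,
so the R-side memory use is governed by the size of the result, not the
size of the file.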
