We recently benchmarked our R server (Intel Xeon 2.2 GHz, 128 GB RAM, CentOS 6.2,
running R 2.15.2 64-bit), testing various read / write / data
manipulation times. A 6 GB dataset with around 10 million rows and 14
columns took around 15 minutes to read without colClasses.
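For what it's worth, here is a minimal sketch of the kind of call we timed. The file and column types below are made-up placeholders, not our actual benchmark data; the point is that supplying colClasses (and nrows, when the row count is known) lets read.table() skip type inference, which is where most of the time goes on large files:

```r
## Hypothetical example: write a tiny CSV to a temp file, then read it
## back with explicit colClasses so read.table() does no type guessing.
tf <- tempfile(fileext = ".csv")
write.csv(data.frame(id = 1:5, val = runif(5)), tf, row.names = FALSE)

dat <- read.table(tf, header = TRUE, sep = ",",
                  colClasses = c("integer", "numeric"),  # one entry per column
                  comment.char = "",   # disables comment scanning, another speed-up
                  nrows = 5,           # pre-allocates instead of growing the result
                  stringsAsFactors = FALSE)
```

On a real multi-gigabyte file you would of course point this at your own path and spell out all 14 (or however many) column classes.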



Were your times comparable to this?

Regards,
Indrajit


On Fri, 26 Apr 2013 23:19:12 +0530  wrote:

>Hi all scientists,
>
>Recently I have been dealing with big data (>3 GB txt or csv files) on my
>desktop (Windows 7, 64-bit), but I cannot read them in any faster, though
>I have searched the internet. [I have defined colClasses for read.table
>and used the colbycol and limma packages, but it is still not very fast.]
>
>Could you share your methods for reading big data into R faster?
>
>Though this may be an odd question, we really do need a solution.
>
>Any suggestions are appreciated.
>
>Thank you very much.
>
>kevin
>
>______________________________________________
>R-help@r-project.org mailing list
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>and provide commented, minimal, self-contained, reproducible code.



