I cannot think of anything better. Maybe try reading only the part of the data
that you actually want to analyze; basically, break the large data set into
pieces and work on them one at a time.
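
For example, a minimal sketch of chunked reading with base R could look like
this (the file name "big_data.csv" and the chunk size are only placeholders,
and the header is assumed to be a simple comma-separated line):

con <- file("big_data.csv", open = "r")
# read the header line once so every chunk gets the same column names
header <- strsplit(readLines(con, n = 1), ",")[[1]]

chunk_size <- 100000          # rows per chunk; tune this to your memory
repeat {
  chunk <- tryCatch(
    read.csv(con, header = FALSE, nrows = chunk_size, col.names = header),
    error = function(e) NULL)  # read.csv errors once the file is exhausted
  if (is.null(chunk) || nrow(chunk) == 0) break
  # ... summarize or filter this chunk here and keep only what you need ...
}
close(con)

Each pass only holds chunk_size rows in memory, so the full 3 GB never has to
fit in RAM at once.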


On Fri, Apr 26, 2013 at 10:58 AM, Ye Lin <ye...@lbl.gov> wrote:

> Have you thought of building a database and then letting R read the data
> through that db instead of loading it all on your desktop?
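
A minimal sketch of that route with DBI and RSQLite (the file, table and
column names are placeholders, and the csv-path form of dbWriteTable() is an
assumption about your RSQLite version; importing the data in chunks with
dbWriteTable(..., append = TRUE) is a fallback):

# install.packages(c("DBI", "RSQLite"))   # if not installed yet
library(DBI)

db <- dbConnect(RSQLite::SQLite(), "big_data.sqlite")

# one-time, slow import of the csv into an on-disk table
dbWriteTable(db, "big_table", "big_data.csv", header = TRUE, sep = ",")

# afterwards each analysis pulls only the rows/columns it needs
x <- dbGetQuery(db, "SELECT col1, col2 FROM big_table WHERE col3 > 100")

dbDisconnect(db)

The payoff is that the slow import happens once, and every later query only
moves the subset of data you ask for into R.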
>
>
> On Fri, Apr 26, 2013 at 8:09 AM, Kevin Hao <rfans4ch...@gmail.com> wrote:
>
>> Hi all scientists,
>>
>> Recently I have been dealing with big data (>3 GB, txt or csv format) on my
>> desktop (Windows 7, 64-bit), but I cannot read it in quickly, even though I
>> have searched the internet. [I have tried defining colClasses for
>> read.table and using the colbycol and limma packages, but it is still not
>> very fast.]
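
(As an aside, "defining colClasses" usually means giving read.table the type
of every column up front so it does not have to guess; the file name,
separator, and column types below are only an illustration:

dat <- read.table("big_data.txt", header = TRUE, sep = "\t",
                  colClasses = c("integer", "numeric", "character", "factor"),
                  nrows = 3e6,        # a slight over-estimate of the row count
                  comment.char = "")

Knowing the types and an approximate row count lets read.table skip a lot of
guessing and re-allocation, which is where much of the time normally goes.)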
>>
>> Could you share your methods for reading big data into R faster?
>>
>> Though this may be an odd question, we really need a solution.
>>
>> Any suggestions are appreciated.
>>
>> Thank you very much.
>>
>>
>> kevin
>>
>
>

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
