To whom it may concern,  
I am a student at Peking University, China, currently doing microarray data 
analysis with the Bioconductor packages for R. 
 
A problem arises when I try to import my dataset, which contains 109 samples 
(more than 1.4 GB in total), into R. R's memory limit makes importing all the 
samples into a single AffyBatch object a "mission impossible" for me. 
 
It would be possible to import the data into several AffyBatch objects and 
preprocess each batch separately. In that case, however, the results of 
background correction and normalization are undesirable, because the baseline 
is not estimated from all the available information (namely, all 109 samples). 
 
An alternative approach would be to preprocess the data in dChip and then 
export the results to R. However, I am looking for an approach that relies 
solely on R. 
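One possibility I have been considering (a sketch only, assuming the CEL files sit in a directory such as "celfiles/", which is a hypothetical path) is justRMA() from the affy package, which computes RMA expression values while reading the CEL files in a memory-frugal way, so the full probe-level AffyBatch for all 109 samples never has to be held in memory at once:

```r
library(affy)

## justRMA() performs background correction, quantile normalization,
## and summarization (RMA) directly from the CEL files, returning an
## ExpressionSet rather than a full AffyBatch.
## "celfiles" below is a placeholder for the actual data directory.
eset <- justRMA(celfile.path = "celfiles")

## Normalized log2 expression matrix: one row per probe set,
## one column per sample (109 columns in my case).
x <- exprs(eset)
dim(x)
```

Since quantile normalization here still uses all samples jointly, this would seem to avoid the baseline problem of preprocessing the batches separately, but I am not sure whether it is the recommended solution.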
 
Could you please give me some suggestions on this issue, even though it may be 
more a technical problem than a scientific (statistical) one? Many thanks for 
your help, and I look forward to your reply! 
 
Best regards,
Anqi

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
