On Mon, Oct 18, 2010 at 9:55 AM, Bond, Stephen <stephen.b...@cibc.com> wrote:
> Gabor,
>
> You are suggesting some very advanced usage that I do not understand, but
> that is not what I meant when I said 'loop'.
> I have a df with 47k rows, and each of these is fed to 'predict', which
> outputs about 62 rows, so the number of groups is very large. I meant that
> I would go through the 47k x 62 rows with
>
> for (jj in (set of 47k values)) {
>   tmp.df <- big.df[big.df$group == jj, ]  # subset, then sum
> }
>
> which is very slow. I discovered that even creating the dataset is super
> slow, because I use write.table.
>
> The clogging comes from
>
> write.table(tmp,"predcom.csv",row.names=FALSE,col.names=FALSE,append=TRUE,sep=',')
>
> Can anybody suggest a faster way of appending to a text file?
>
> All comments are appreciated.
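[On the appending question quoted above: one common fix, a sketch not taken from this thread, is to open the file connection once in append mode and reuse it across iterations, instead of letting write.table reopen "predcom.csv" on every call. The chunks here are stand-in data, not the poster's predictions:]

```r
unlink("predcom.csv")                          # start fresh for the sketch
con <- file("predcom.csv", open = "a")         # "a" = append mode, opened once
chunks <- list(head(iris, 3), tail(iris, 3))   # stand-in for the real pieces
for (tmp in chunks) {
  write.table(tmp, con, row.names = FALSE, col.names = FALSE, sep = ",")
}
close(con)                                     # close once at the end
```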

If the problem is to sum each row of a matrix then rowSums can do that
without a loop.
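A minimal sketch with toy data (not the poster's df):

```r
## rowSums sums every row of a matrix in one vectorised call
m <- matrix(1:6, nrow = 2)    # 2 rows, 3 columns
rowSums(m)                    # one total per row, no loop

## If the totals are per *group* rather than per row (as with the 47k
## group values described above), rowsum() does the grouped sum directly:
df <- data.frame(group = c("a", "a", "b"), x = c(1, 2, 3))
rowsum(df$x, df$group)        # sums of x within each group
```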

-- 
Statistics & Software Consulting
GKX Group, GKX Associates Inc.
tel: 1-877-GKX-GROUP
email: ggrothendieck at gmail.com

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
