On Sat, Jan 17, 2009 at 7:59 AM, gundalav <gunda...@gmail.com> wrote:
> Dear Jim and all,
>
> Allow me to ask your expert opinion.
>
> Using the data (16 Mb) downloadable from here:
>
> http://drop.io/gundalav/asset/test-data-zip
>
> it took this long under a 1994.070 MHz CPU running Linux, using
> write.table:
>
>> proc.time() - ptm1
>      user    system   elapsed
> 16581.833  5787.228 21386.064
>
> __MYCODE__
>
> args  <- commandArgs(trailingOnly = FALSE)
> fname <- args[3]
> dat   <- read.delim(fname, header = FALSE)
>
> output <- file('output_writetable.txt', 'w')
>
> ptm1 <- proc.time()
> for (i in 1:nrow(dat)) {
>   # cat(dat$V1[i], " ", as.character(dat$V2[i]), "\n", sep = "")
>   write.table(cbind(dat$V1[i], as.character(dat$V2[i])),
>               file = output, sep = "\t", quote = FALSE,
>               col.names = FALSE, row.names = FALSE)
> }
>
> close(output)
> proc.time() - ptm1
> __END__
>
> Perhaps I misunderstood you, but this seems truly slow. Is there a way
> I can speed it up?
Don't do it line by line!

write.table(dat[, c("V1", "V2")], file = 'output_writetable.txt',
            sep = "\t", quote = FALSE, col.names = FALSE,
            row.names = FALSE)

Hadley

--
http://had.co.nz/
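As an illustration (not from the thread itself), here is a minimal,
self-contained R sketch contrasting the two approaches on synthetic data
shaped like the poster's file: an integer column V1 and a character
column V2. The row count and the file names slow.txt / fast.txt are made
up for the demo, and absolute timings will vary by machine:

## Build synthetic data: n rows of an integer and a 10-letter string.
n   <- 1e4   # arbitrary; the poster's 16 Mb file has far more rows
dat <- data.frame(V1 = seq_len(n),
                  V2 = replicate(n, paste(sample(letters, 10, replace = TRUE),
                                          collapse = "")),
                  stringsAsFactors = FALSE)

## Row-by-row: one write.table() call per row, as in the original loop.
ptm <- proc.time()
con <- file("slow.txt", "w")
for (i in seq_len(n)) {
  write.table(dat[i, c("V1", "V2")], file = con, sep = "\t",
              quote = FALSE, col.names = FALSE, row.names = FALSE)
}
close(con)
proc.time() - ptm

## Whole data frame in one call, as Hadley suggests.
ptm <- proc.time()
write.table(dat[, c("V1", "V2")], file = "fast.txt", sep = "\t",
            quote = FALSE, col.names = FALSE, row.names = FALSE)
proc.time() - ptm

The loop pays write.table's formatting and dispatch overhead once per
row, so its cost grows with the number of rows times that fixed
overhead; the single call pays it once, which is where the hours in the
original timing went.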