Hi,
I've written the following line of code to make a summary of some data:
Final.Data.Short <- as.data.frame(aggregate(Merge.FinalSubset[, 8:167],
    list(Location = Merge.FinalSubset$Location,
         Measure  = Merge.FinalSubset$Measure,
         Site     = Merge.FinalSubset$Site,
         Label    = Merge.FinalSubset$Label),
    FUN = sum))
where "Merge.FinalSubset" is a data frame with 2640 rows and 167 columns.
The result, "Final.Data.Short", is a data frame with 890 rows and 164 columns.
This operation currently takes more than a minute. I was wondering whether
there are ways to reduce the run time, either by using different code or by
splitting the original data frame into smaller pieces, running several
separate aggregations, and recombining the results afterwards?
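(For reference, one alternative I have seen suggested for this kind of
grouped sum is the data.table package -- the sketch below is untested
against my data and assumes the grouping columns are named exactly as in
the aggregate() call above:)

    library(data.table)
    # convert once; data.table's grouped operations avoid aggregate()'s
    # per-group overhead on wide data frames
    DT <- as.data.table(Merge.FinalSubset)
    Final.Data.Short <- as.data.frame(
        DT[, lapply(.SD, sum),              # sum each column in .SD
           by = .(Location, Measure, Site, Label),
           .SDcols = 8:167]                 # restrict .SD to columns 8:167
    )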
Thanks for helping me out.
Bert
______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.