You can get an even bigger improvement using the 'data.table' package:

> require(data.table)
> system.time({
+     dt <- data.table(value = x, z = z)
+     r3 <- dt[
+             , list(sum = sum(value))
+             , keyby = z
+             ]
+ })
   user  system elapsed
   0.14    0.00    0.14
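
For readers without the earlier messages in this thread: 'x' and 'z' are not defined above. A reasonable assumption is that 'x' is a numeric vector of values and 'z' is a grouping vector of the same length. Under that assumption, a minimal self-contained sketch of the same grouped sum would be:

```r
## Minimal reproducible sketch of the grouped sum shown above.
## Assumption: 'x' (numeric values) and 'z' (group labels) come from
## earlier in the thread and are the same length; made-up data here.
library(data.table)

set.seed(1)
n <- 1e6
x <- rnorm(n)                           # simulated values
z <- sample(letters[1:5], n, TRUE)      # simulated group labels

dt <- data.table(value = x, z = z)
r3 <- dt[, list(sum = sum(value)), keyby = z]  # one row per group, sorted by z

## Same result via base R, for comparison:
r_base <- tapply(x, z, sum)
```

The speedup comes from data.table doing the grouping in C on a keyed table, rather than splitting the data in R-level code.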


On Thu, Oct 25, 2012 at 11:23 PM, stats12 <ska...@gmail.com> wrote:
> Dear R users,
>
> I need to run 1000 simulations to find maximum likelihood estimates.  I
> print my output as a vector. However, it is taking too long. I am running 50
> simulations at a time and it is taking me 30 minutes. Once I tried to run
> 200 simulations at once, after 2 hours I stopped it and saw that only about
> 40 of them are simulated in those 2 hours. Is there any way to make my
> simulations faster? (I can post my code if needed, I'm just looking for
> general ideas here). Thank you in advance.
>
>
>
> --
> View this message in context: 
> http://r.789695.n4.nabble.com/how-to-make-simulation-faster-tp4647492.html
> Sent from the R help mailing list archive at Nabble.com.
>
> ______________________________________________
> R-help@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



-- 
Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.
