On Wed, May 9, 2012 at 2:22 PM, John Laing <john.la...@gmail.com> wrote:
> For 200,000 analyses at 1.5 seconds each, you're looking at ~83 hours
> of computing time. You can buy time from Amazon at roughly $0.08 /
> core / hour, so it would cost about $7 to run your analyses in the
> cloud. Assuming complete parallelization you could fire up as many
> machines as you need to get the work done in as little time as you
> want, with the same fixed cost. I think that's a pretty compelling
> argument, compared to the hassles of buying and maintaining hardware,
> power supply, air conditioning, etc.
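For what it's worth, the back-of-envelope sums check out. In R (the
$0.08 per core-hour is John's assumed rate, not a current quote from
Amazon's price list):

  n_analyses        <- 200000
  secs_each         <- 1.5
  usd_per_core_hour <- 0.08   # assumed EC2 rate, per core per hour

  total_hours <- n_analyses * secs_each / 3600
  total_cost  <- total_hours * usd_per_core_hour

  total_hours   # ~83.3 core-hours of computing
  total_cost    # ~$6.67, i.e. about $7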
Noticing Hugh's .ac.uk email address, though, you do have to factor in the hassle of getting something as nebulous as cloud computing past the red tape.

"How much will it cost?" says the bureaucrat. "Depends how much CPU time I need," says the academic. "So potentially, what's the most?" says the bureaucrat. "Millions," says the academic, honestly, adding, "but that would only be if my job scheduling went a bit mad, grabbed a few thousand Amazon cores and thrashed them for weeks without me noticing." "Okay," says the bureaucrat, "now, can we send Amazon a purchase order first, so that Amazon sends us an invoice for this unknown and potentially unpredictable cost?" "Oh no," says the academic, "we need a credit card...".

Maybe there are other ways of paying for Amazon cloud CPUs; I've not investigated. Anyone in academia happily crunching on EC2?

Barry