Thanks for letting me know. I am leaning towards using Whirr to set up a YARN cluster with Hive, Pig, HBase, etc., and then adding Spark on YARN. Is it pretty straightforward to install Spark on a YARN cluster?
On Fri, May 30, 2014 at 5:51 PM, Matei Zaharia <matei.zaha...@gmail.com> wrote:

> I don’t think Whirr provides support for this, but Spark’s own EC2 scripts
> also launch a Hadoop cluster:
> http://spark.apache.org/docs/latest/ec2-scripts.html.
>
> Matei
>
> On May 30, 2014, at 12:59 PM, chirag lakhani <chirag.lakh...@gmail.com> wrote:
>
> > Does anyone know if it is possible to use Whirr to set up a Spark cluster
> > on AWS. I would like to be able to use Whirr to set up a cluster that has
> > all of the standard Hadoop and Spark tools. I want to automate this
> > process because I anticipate I will have to create and destroy clusters
> > often enough that I would like to have it all automated. Could anyone
> > provide any pointers on how this could be done or whether it is
> > documented somewhere?
> >
> > Chirag Lakhani
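For what it's worth, the spark-ec2 script Matei pointed to is typically invoked along these lines. The key-pair name, identity file, region, and cluster name below are placeholder values, not anything from this thread; check the linked ec2-scripts page for the full option list:

```
# Launch a cluster with 2 slave nodes (key pair, identity file, region,
# and cluster name are all placeholders -- substitute your own)
./spark-ec2 --key-pair=my-keypair \
            --identity-file=~/.ssh/my-keypair.pem \
            --slaves=2 \
            --region=us-east-1 \
            launch my-spark-cluster

# Tear the cluster down when you are done, so creating and destroying
# clusters repeatedly stays a one-command operation
./spark-ec2 --region=us-east-1 destroy my-spark-cluster
```

Since the launch/destroy cycle is a single command each way, this covers the automation use case described below without Whirr.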