RE: Is spark-ec2 for production use?

2015-04-22 Thread nate
definitely handy for some smoke tests

-----Original Message-----
From: Nicholas Chammas [mailto:nicholas.cham...@gmail.com]
Sent: Tuesday, April 21, 2015 2:33 PM
To: n...@reactor8.com; Spark dev list
Subject: Re: Is spark-ec2 for production use?

Nate, could you point us to an example of how one
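The thread above mentions spark-ec2 being handy for smoke tests. As a minimal sketch of such a launch-and-tear-down cycle (the key pair, identity file, and cluster name below are placeholders, not values from the thread), one might run the `spark-ec2` script from Spark's `ec2/` directory:

```shell
#!/bin/sh
# Placeholder credentials and cluster name -- substitute your own.
KEY_PAIR=my-keypair
KEY_FILE=~/.ssh/my-keypair.pem
CLUSTER=spark-smoke-test

# Launch a small two-slave cluster for a quick smoke test.
./spark-ec2 -k "$KEY_PAIR" -i "$KEY_FILE" -s 2 launch "$CLUSTER"

# Print the master's hostname, e.g. to point a test job at it.
./spark-ec2 -k "$KEY_PAIR" -i "$KEY_FILE" get-master "$CLUSTER"

# Tear the cluster down once the smoke test is done (terminates instances).
./spark-ec2 -k "$KEY_PAIR" -i "$KEY_FILE" destroy "$CLUSTER"
```

These commands create and terminate real EC2 instances and incur AWS charges, which is one reason the thread frames spark-ec2 as a testing convenience rather than a production deployment tool.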

RE: Is spark-ec2 for production use?

2015-04-21 Thread nate
/deploy and enhance CI process so if you choose to deploy via bigtop in test/prod/etc you know things have gone through a certain amount of rigor beforehand

Nate

-----Original Message-----
From: Patrick Wendell [mailto:pwend...@gmail.com]
Sent: Tuesday, April 21, 2015 12:46 PM
To: Nicholas Chammas

RE: Keep or remove Debian packaging in Spark?

2015-02-09 Thread nate
1.4? And add a pointer to any third-party packaging that might provide similar functionality?

On Mon, Feb 9, 2015 at 6:47 PM, Nicholas Chammas wrote:
> +1 to an "official" deprecation + redirecting users to some other project
> that will or already is taking this on.

RE: Standardized Spark dev environment

2015-01-20 Thread nate
Alto office on Jan 27th if any folks are interested.

Nate

-----Original Message-----
From: Sean Owen [mailto:so...@cloudera.com]
Sent: Tuesday, January 20, 2015 5:09 PM
To: Nicholas Chammas
Cc: dev
Subject: Re: Standardized Spark dev environment

My concern would mostly be maintenance. It adds t

RE: EC2 clusters ready in launch time + 30 seconds

2014-10-02 Thread Nate D'Amico
our own product release and the bigtop work

Nate

-----Original Message-----
From: David Rowe [mailto:davidr...@gmail.com]
Sent: Thursday, October 02, 2014 4:44 PM
To: Nicholas Chammas
Cc: dev; Shivaram Venkataraman
Subject: Re: EC2 clusters ready in launch time + 30 seconds

I think this is

RE: EC2 clusters ready in launch time + 30 seconds

2014-07-10 Thread Nate D'Amico
y with: 3) instance type (need both standard and hvm)

Starting to work through some automation/config stuff for spark stack on EC2 with a project, will be focusing the work through the apache bigtop effort to start, can then share with spark community directly as things progress if people
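The snippet above notes that automation needs to handle both standard (paravirtual) and HVM instance types. With spark-ec2 this distinction surfaces through the instance type you request, since the script selects a matching AMI for the type; a hedged sketch (instance type, key pair, and cluster name are illustrative assumptions, not values from the thread) might look like:

```shell
#!/bin/sh
# Pin the EC2 instance type at launch; spark-ec2 picks an AMI compatible
# with it. m3.xlarge is an example -- choose a type your account supports.
./spark-ec2 -k my-keypair -i ~/.ssh/my-keypair.pem \
  --instance-type=m3.xlarge -s 4 launch bigtop-test
```

Automation built on top of this (e.g. through the Apache Bigtop effort the snippet mentions) would typically parameterize the instance type rather than hard-code it, so the same scripts can exercise both virtualization families.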