If there is interest in more standardization of dev/test environment setup, 
the Spark community might be interested in starting to participate in the 
Apache Bigtop effort:

http://bigtop.apache.org/

While the project started with an initial focus on packaging, testing, and 
deploying the Hadoop/HDFS-related stack, it looks like we will be targeting 
"data engineers" going forward, so Spark is set to become a bigger, more 
central piece of the Bigtop effort as the project moves toward a "v1" release.

We will be doing a Bigtop/big data workshop in late February at the Southern 
California Linux Expo (SCALE 13x):

http://www.socallinuxexpo.org/scale/13x

Right now we are scoping some getting-started Spark content for the event, 
as well as a targeted intro of Bigtop's Puppet-powered Spark deployment 
components.

Also, the group will be holding a meetup at Amazon's Palo Alto office on 
Jan 27th, if any folks are interested.

Nate

-----Original Message-----
From: Sean Owen [mailto:so...@cloudera.com] 
Sent: Tuesday, January 20, 2015 5:09 PM
To: Nicholas Chammas
Cc: dev
Subject: Re: Standardized Spark dev environment

My concern would mostly be maintenance. It adds to an already very complex 
build. It only assists developers who are a small audience. What does this 
provide, concretely?
On Jan 21, 2015 12:14 AM, "Nicholas Chammas" <nicholas.cham...@gmail.com>
wrote:

> What do y'all think of creating a standardized Spark development 
> environment, perhaps encoded as a Vagrantfile, and publishing it under 
> `dev/`?
>
> The goal would be to make it easier for new developers to get started 
> with all the right configs and tools pre-installed.
>
> If we use something like Vagrant, we may even be able to make it so 
> that a single Vagrantfile creates equivalent development environments 
> across OS X, Linux, and Windows, without having to do much (or any) 
> OS-specific work.
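>
> A minimal sketch of what such a `dev/Vagrantfile` might look like (the
> box name, memory size, and provisioned packages below are illustrative
> assumptions, not actual project decisions):

```ruby
# Hypothetical dev/Vagrantfile sketch -- box, memory, and package names
# are illustrative assumptions, not an agreed-upon Spark setup.
Vagrant.configure("2") do |config|
  # Same base box regardless of OS X, Linux, or Windows host.
  config.vm.box = "ubuntu/trusty64"

  config.vm.provider "virtualbox" do |vb|
    vb.memory = 4096   # Spark builds are memory-hungry
  end

  # Pre-install the JDK and build tooling a new contributor needs.
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    apt-get install -y openjdk-7-jdk git maven
  SHELL
end
```

> A contributor would then just run `vagrant up` and `vagrant ssh` to get
> a working build environment.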
>
> I imagine for committers and regular contributors, this exercise may 
> seem pointless, since y'all are probably already very comfortable with 
> your workflow.
>
> I wonder, though, if any of you think this would be worthwhile as an 
> improvement to the "new Spark developer" experience.
>
> Nick
>


---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
