o: "Sean Owen"
>> Cc: "dev" , "jay vyas" ,
>> "Paolo Platter"
>> , "Nicholas Chammas"
>> , "Will Benton"
>> Sent: Wednesday, January 21, 2015 2:09:35 AM
>> Subject: Re: Standardized Spark dev envir
- Original Message -
> From: "Patrick Wendell"
> To: "Sean Owen"
> Cc: "dev" , "jay vyas" ,
> "Paolo Platter"
> , "Nicholas Chammas" ,
> "Will Benton"
> Sent: Wednesday, January 21, 2015 2:09:3
Sure, can Jenkins use this new image too? If not, then it doesn't help with
reproducing a Jenkins failure, most of which even Jenkins can't reproduce.
But if it does, and it can be used for builds, then that does seem like it is
reducing rather than increasing environment configurations, which is good.
> If the goal is a reproducible test environment then I think that is what
> Jenkins is. Granted you can only ask it for a test. But presumably you get
> the same result if you start from the same VM image as Jenkins and run the
> same steps.
But the issue is when users can't reproduce Jenkins failures.

Sent: 21/01/2015 04:45
To: Nicholas Chammas<mailto:nicholas.cham...@gmail.com>
Cc: Will Benton<mailto:wi...@redhat.com>; Spark dev list<mailto:dev@spark.apache.org>
Subject: Re: Standardized Spark dev environment

I can comment on both... hi will and nate :)

1) Will's Dockerfile solution is the [...]

> [...] dependencies for the current Spark master, but it
> would be trivial to do so:
>
> http://chapeau.freevariable.com/2014/08/jvm-test-docker.html
>
> best,
> wb
>
> - Original Message -
> > From: "Nicholas Chammas"
> > To: "Spark dev list"
> > Sent: Tuesday, January 20, 2015 6:13:31 PM
> > Subject: Standardized Spark dev environment
> >
> > What do y'all think of creating a standardized Spark development
> > environment, perhaps encoded as a Vagrantfile, and publishing it under
> > `dev/`?
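
For illustration, a minimal Dockerfile in the spirit of that post might look
roughly like the sketch below; the base image, package names, and build
command are assumptions for the sake of example, not the contents of Will's
actual image:

    # Hypothetical Spark build/test image (illustrative, not Will's Dockerfile)
    FROM centos:centos6
    # JDK and basic build tooling; package names are assumed
    RUN yum install -y java-1.7.0-openjdk-devel git tar which
    # Check out Spark and pre-build it so the Maven dependency cache is warm
    RUN git clone https://github.com/apache/spark.git /opt/spark
    WORKDIR /opt/spark
    RUN build/mvn -DskipTests package
    # Drop into a shell ready to run tests
    CMD ["/bin/bash"]

The point of baking the dependency download into the image is that a plain
`docker run` then gives you a throwaway environment where tests can start
immediately instead of waiting on Maven.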

[...] Palo Alto office on Jan 27th if any folks are interested.

Nate

-Original Message-
From: Sean Owen [mailto:so...@cloudera.com]
Sent: Tuesday, January 20, 2015 5:09 PM
To: Nicholas Chammas
Cc: dev
Subject: Re: Standardized Spark dev environment

My concern would mostly be maintenance. It adds to an already very complex
build. It only assists developers, who are a small audience. What does this
provide, concretely?

On Jan 21, 2015 12:14 AM, "Nicholas Chammas" wrote:
> What do y'all think of creating a standardized Spark development
> environment, perhaps encoded as a Vagrantfile, and publishing it under
> `dev/`?

How many profiles (hadoop / hive / scala) would this development environment
support?

Cheers

On Tue, Jan 20, 2015 at 4:13 PM, Nicholas Chammas
<nicholas.cham...@gmail.com> wrote:
> What do y'all think of creating a standardized Spark development
> environment, perhaps encoded as a Vagrantfile, and publishing it under
> `dev/`?
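
For context, Spark's existing Maven build already exposes those choices as
profiles and properties, so a standardized environment would largely be about
picking and documenting a default combination. A hypothetical example (the
exact profile names vary by Spark version, so treat these flags as
illustrative):

    # Illustrative default build such an environment might run
    build/mvn -Phadoop-2.4 -Pyarn -Phive -Phive-thriftserver -DskipTests clean package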

Great suggestion.

On Jan 20, 2015 7:14 PM, "Nicholas Chammas" wrote:
> What do y'all think of creating a standardized Spark development
> environment, perhaps encoded as a Vagrantfile, and publishing it under
> `dev/`?
>
> The goal would be to make it easier for new developers to get started with
> all the right configs and tools pre-installed.

What do y'all think of creating a standardized Spark development
environment, perhaps encoded as a Vagrantfile, and publishing it under
`dev/`?

The goal would be to make it easier for new developers to get started with
all the right configs and tools pre-installed.

If we use something like Vagrant [...]
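
To make the idea concrete, a first cut at a `dev/Vagrantfile` might look
something like the sketch below; the base box, memory setting, and package
list are placeholder assumptions rather than a worked-out proposal:

    # Hypothetical dev/Vagrantfile sketch; box, resources, and packages are illustrative
    Vagrant.configure("2") do |config|
      config.vm.box = "ubuntu/trusty64"          # assumed base box
      config.vm.provider "virtualbox" do |vb|
        vb.memory = 4096                         # Spark builds need plenty of memory
      end
      # Assuming the Vagrantfile lives in dev/, share the repo root into the guest
      config.vm.synced_folder "..", "/home/vagrant/spark"
      # Install a basic toolchain; package names are assumed
      config.vm.provision "shell", inline: <<-SHELL
        apt-get update
        apt-get install -y openjdk-7-jdk git maven
      SHELL
    end

With something like that checked in, `cd dev && vagrant up` would be the only
setup step a new contributor needs before running the usual build commands
inside the VM.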