Source code is the primary release; compiled binary releases are
conveniences released alongside it. A docker image sounds fairly different,
though. To the extent it's the standard delivery mechanism for some
artifact (think: pyspark on PyPI as well), that makes sense, but is that the
situation? If it's more of an extension or alternate presentation of Spark
components, that typically wouldn't be part of a Spark release. The
artifacts the PMC takes responsibility for maintaining ought to be only the
core, critical means of distribution.

On Wed, Nov 29, 2017 at 2:52 AM Anirudh Ramanathan
<ramanath...@google.com.invalid> wrote:

> Hi all,
>
> We're all working towards the Kubernetes scheduler backend (full steam
> ahead!) that's targeted at Spark 2.3. One of the questions that comes
> up often is that of docker images.
>
> While we're making Dockerfiles available so that people can create their
> own docker images from source, ideally we'd want to publish official
> docker images as part of the release process.
>
> I understand that the ASF has procedures around this, and we would want
> to get that process started to help us get these artifacts published by
> 2.3. I'd love to start a discussion and hear the community's thoughts on
> this.
>
> --
> Thanks,
> Anirudh Ramanathan
>
