No, CDH 5.2 actually includes Spark 1.1, which is the latest released
minor version; 5.3 will include 1.2, which is not yet released.

You can build just about any version of Spark for CDH 5 and install it
manually yourself, sure, but it would be easier to just update CDH.
The instructions are the ones you've found; all you need to know is to
use the hadoop-2.4 profile (which also covers 2.5 and 2.6) and set
hadoop.version appropriately.
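For reference, a build along those lines (run from the Spark source root) might look like the following sketch; the exact hadoop.version string is an assumption here and should be replaced with the one matching your cluster's CDH release:

```shell
# Build Spark with YARN support against a CDH Hadoop version.
# The hadoop.version value (2.5.0-cdh5.3.0) is only an example --
# substitute the version string that matches your cluster.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.0-cdh5.3.0 -DskipTests clean package
```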

On Fri, Dec 12, 2014 at 12:26 PM, Jing Dong <j...@qubitdigital.com> wrote:
> Hi,
>
>     I'm new to this list, so please excuse if I'm asking simple questions.
>
>    We are experimenting with Spark deployment on existing CDH clusters.
> However, the Spark package that comes with CDH is very out of date (v1.0.0).
>
>     Has anyone had experience with a custom Spark upgrade for CDH5? Any
> installation or packaging recommendations would be appreciated.
>
>     The download page and documentation site only mention a CDH4
> pre-built package.
>
>
> Thanks
> Jing
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
