The “best” solution to spark-shell’s problem is creating a file
$SPARK_HOME/conf/java-opts
with “-Dhdp.version=2.2.0.0-2041”
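In other words, something like the following (a minimal sketch; /tmp/spark-demo is a placeholder for your actual Spark installation directory, and the hdp.version value must match your installed HDP build):

```shell
# Placeholder path for illustration; point SPARK_HOME at your real Spark install.
SPARK_HOME=/tmp/spark-demo
mkdir -p "$SPARK_HOME/conf"

# spark-shell picks up extra JVM flags from conf/java-opts at launch,
# so putting the hdp.version property here applies it to the driver JVM.
echo "-Dhdp.version=2.2.0.0-2041" > "$SPARK_HOME/conf/java-opts"

# Verify the contents
cat "$SPARK_HOME/conf/java-opts"
```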

Cheers,

Doug

> On Mar 28, 2015, at 1:25 PM, Michael Stone <[email protected]> wrote:
> 
> I've also been having trouble running 1.3.0 on HDP. The 
> spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
> configuration directive seems to work with pyspark, but does not propagate when 
> using spark-shell. (That is, everything works fine with pyspark, and 
> spark-shell fails with the "bad substitution" message.)
> 
> Mike Stone
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [email protected]
> For additional commands, e-mail: [email protected]
> 

