Using separate SPARK_HOME in Zeppelin

2020-10-07 Thread Patrik Iselind
Hi, I'm trying to build a Docker image for Zeppelin in which I'll be able to use a Spark standalone cluster. For this I understand that I need to include a Spark installation and point to it with the environment variable SPARK_HOME. I think I've done this correctly, but it doesn't …
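One minimal way to do this, sketched under assumptions the thread doesn't state (the stock apache/zeppelin image, a Spark 3.0.1 distribution unpacked on the host at /opt/spark-3.0.1-bin-hadoop2.7, and the default web port), is to mount the Spark distribution into the container and point SPARK_HOME at the mount:

  # Run Zeppelin with a host-side Spark install mounted into the container;
  # the image tag, paths, and version below are assumptions, adjust to your setup.
  docker run -d -p 8080:8080 \
    -v /opt/spark-3.0.1-bin-hadoop2.7:/opt/spark \
    -e SPARK_HOME=/opt/spark \
    --name zeppelin apache/zeppelin:0.9.0

With that in place, the Spark interpreter's master property would point at the standalone cluster, e.g. spark://<master-host>:7077.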

Re: SPARK_HOME

2020-06-22 Thread Jeff Zhang
wrote on Mon, Jun 22, 2020 at 6:21 AM: >> >>> >>> The only change I am making is to set SPARK_HOME. I have made the setting >>> in the config files (bash file) and in the Zeppelin interpreter settings. I am >>> trying to …

Re: SPARK_HOME

2020-06-22 Thread Anwar AliKhan
at org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(SystemUtils.java:1654) > at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207) > at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala) > ... 28 more > > Anwar AliKhan wrote on Mon, Jun 22, 2020 at 6:21 AM: > >> >> The only change I am making is to set SPARK_HOME. I have made the s…
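A frame in SystemUtils.isJavaVersionAtLeast followed by StorageUtils$ failing to initialize is the usual signature of running Spark under a JDK newer than it supports; Spark 3.0 supports only Java 8 and 11. A sketch of the check and a workaround, assuming a Debian-style OpenJDK path:

  # Confirm which JDK Zeppelin is launching Spark with.
  java -version
  # If it is newer than 11, pin JAVA_HOME to a supported JDK,
  # e.g. in conf/zeppelin-env.sh (the path below is an assumption):
  export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64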

Re: SPARK_HOME

2020-06-22 Thread Anwar AliKhan
at org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(SystemUtils.java:1654) > at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207) > at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala) > ... 28 more > > Anwar AliKhan wrote on Mon, Jun 22, 2020 at 6:21 AM: > >> >> The only …

Re: SPARK_HOME

2020-06-21 Thread Jeff Zhang
Anwar AliKhan wrote on Mon, Jun 22, 2020 at 6:21 AM: > > The only change I am making is to set SPARK_HOME. I have made the setting > in the config files (bash file) and in the Zeppelin interpreter settings. I am > trying to run the Scala files that come with Zeppelin so I can develop a Spark Scala > app. I keep getting the sa…

SPARK_HOME

2020-06-21 Thread Anwar AliKhan
The only change I am making is to set SPARK_HOME. I have made the setting in the config files (bash file) and in the Zeppelin interpreter settings. I am trying to run the Scala files that come with Zeppelin so I can develop a Spark Scala app. I keep getting the same message. Any ideas?
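The usual place to make this setting stick is conf/zeppelin-env.sh, restarting the Spark interpreter afterwards. A minimal sketch, with the path and master URL as assumptions:

  # conf/zeppelin-env.sh
  export SPARK_HOME=/opt/spark              # root of the Spark installation
  export MASTER=spark://master-host:7077    # standalone master, if one is used

The same master URL can instead be set as the master property in the Spark interpreter settings.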