Hi,
I'm trying to build a Docker image for Zeppelin in which I'll be able to use a
Spark standalone cluster. For this I understand that I need to include a Spark
installation and point to it with the environment variable SPARK_HOME. I think
I've done this correctly, but it doesn't work.
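
A minimal sketch of the kind of Dockerfile I mean, assuming the official
apache/zeppelin base image and a Spark tarball downloaded next to the
Dockerfile (the base image tag, Spark version, paths, and master URL below are
placeholders, not necessarily what I'm actually using):

# Sketch only -- base image tag, Spark version, and master URL are placeholders.
FROM apache/zeppelin:0.9.0

# Unpack a Spark binary distribution into the image (tarball downloaded
# beforehand next to the Dockerfile; ADD extracts local archives).
ADD spark-2.4.5-bin-hadoop2.7.tgz /opt/

# Point the Zeppelin Spark interpreter at that installation and at the
# standalone master (both can also be set in conf/zeppelin-env.sh).
ENV SPARK_HOME=/opt/spark-2.4.5-bin-hadoop2.7
ENV MASTER=spark://spark-master:7077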

Anwar AliKhan wrote on Mon, 22 Jun 2020 at 6:21 AM:

> at org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(SystemUtils.java:1654)
> at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
> at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
> ... 28 more
>
> Anwar AliKhan wrote on Mon, 22 Jun 2020 at 6:21 AM:
>
>> The only change I am making is to set SPARK_HOME. I have made the setting in
>> the config files (the bashrc file) and in the Zeppelin interpreter settings.
>> I am trying to run the Scala files which come with Zeppelin so I can develop
>> a Spark Scala app. I keep getting the same message. Any ideas?
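
For comparison with the bashrc / interpreter-settings approach above, the
conf/zeppelin-env.sh route would look something like this (the path and master
URL are placeholders for whatever your installation uses):

# conf/zeppelin-env.sh -- sketch; adjust the path and master URL to your setup
export SPARK_HOME=/opt/spark-2.4.5-bin-hadoop2.7
export MASTER=spark://spark-master:7077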