Hi Jeff
Ok, I'll update the PR this evening to remove the environment variable... Although,
in some cases (for example, with Docker), an environment variable could be
handier - I need to look into it; maybe I'll rework that part so an environment
variable could also be used.
SparkLauncher is getting Spark master as m
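A rough sketch of the Docker case, assuming an environment variable stays available; the variable name SPARK_MASTER and the master URL are placeholders, not anything settled in this thread:
```Dockerfile
# Hypothetical image: default the master URL through an environment variable so
# a single image can be reconfigured per container (e.g. `docker run -e ...`).
# Both the variable name and the URL below are placeholders.
FROM apache/zeppelin:0.8.2
ENV SPARK_MASTER=spark://master_addr:7077
```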
The env name in interpreter.json and interpreter-setting.json is not used.
We should remove them.
I still don't understand how master & spark.master would affect the
behavior. `master` is legacy stuff that we introduced a very long time ago;
we definitely should use spark.master instead. But actua
I've seen somewhere in the CDH documentation that they use MASTER; that's why
I'm asking...
On Sun, May 17, 2020 at 3:13 PM Patrik Iselind wrote:
> Thanks a lot for creating the issue. It seems I am not allowed to.
>
> As I understand it, the environment variable is supposed to be
> SPARK_MASTER.
>
Thanks a lot for creating the issue. It seems I am not allowed to.
As I understand it, the environment variable is supposed to be SPARK_MASTER.
On Sun, May 17, 2020 at 11:56 AM Alex Ott wrote:
> Ok, I've created a JIRA for it:
> https://issues.apache.org/jira/browse/ZEPPELIN-4821 and working on
Ok, I've created a JIRA for it:
https://issues.apache.org/jira/browse/ZEPPELIN-4821 and am working on a patch.
I'm not sure about the environment variable name - it's simply MASTER; should
it be `SPARK_MASTER`, or is it a requirement of CDH and other Hadoop
distributions to have it as MASTER?
On Sat, May 16
Thank you for the clarification, Patrik.
Can you create a JIRA for tracking & fixing this?
thanks
Patrik Iselind at "Sat, 16 May 2020 15:45:07 +0200" wrote:
PI> Hi Alex,
PI> Thanks a lot for helping out with this.
PI> You're correct, but it doesn't seem that it's the interpreter-settings.json
Hi Alex,
Thanks a lot for helping out with this.
You're correct, but it doesn't seem that it's the interpreter-settings.json
for the Spark interpreter that is being used. It's conf/interpreter.json. In
this file, both 0.8.2 and 0.9.0 have
```partial-json
"spark": {
  "id": "spark",
  "name": "spark",
  ...
}
```
Spark master is set to `local[*]` by default. Here is the corresponding piece
from interpreter-settings.json for the Spark interpreter:
"master": {
"envName": "MASTER",
"propertyName": "spark.master",
"defaultValue": "local[*]",
"description": "Spark master uri. local
Hi Jeff,
I've tried the release from http://zeppelin.apache.org/download.html, both
in Docker and without Docker. They both have the same issue as previously
described.
Can I somehow set spark.master to "local[*]" in Zeppelin, perhaps using
some environment variable?
When is the next Zeppeli
Hi Patrik,
Do you mind trying the 0.9.0-preview? It might be an issue with the docker
container.
http://zeppelin.apache.org/download.html
On Sun, May 10, 2020 at 2:30 AM Patrik Iselind wrote:
> Hello Jeff,
>
> Thank you for looking into this for me.
>
> Using the latest pushed docker image for 0.9.0 (image ID 92
Hello Jeff,
Thank you for looking into this for me.
Using the latest pushed docker image for 0.9.0 (image ID 92890adfadfb,
built 6 weeks ago), I still see the same issue. My image has the digest
"apache/zeppelin@sha256:0691909f6884319d366f5d3a5add8802738d6240a83b2e53e980caeb6c658092".
If it's n
This might be a bug in 0.8; I tried it in 0.9 (master branch), and it works
for me.
print(sc.master)
print(sc.defaultParallelism)
---
local[*]
8
On Sat, May 9, 2020 at 8:34 PM Patrik Iselind wrote:
> Hi,
>
> First comes some background, then I have some questions.
>
> *Background*
> I'm trying out Zeppelin 0.
Hi,
First comes some background, then I have some questions.
*Background*
I'm trying out Zeppelin 0.8.2 based on the Docker image. My Dockerfile
looks like this:
```Dockerfile
FROM apache/zeppelin:0.8.2

# Install Java and some tools
# (openjdk-8-jdk below stands in for the original, truncated package list)
RUN apt-get -y update && \
    DEBIAN_FRONTEND=noninteractive apt-get -y install openjdk-8-jdk
```