Hi zeppelin pilots,

I am trying to run multiple Spark interpreters in the same Zeppelin
instance. This is very helpful when the data comes from multiple Spark
clusters.

Another useful case is running one interpreter in cluster mode and
another in local mode, which can significantly speed up analysis of
small datasets.

Is there any way to run multiple Spark interpreters? I tried creating
another Spark interpreter with a different identifier, which the UI
allows, but it doesn't work. (Shall I file a ticket?)

I am now trying to run multiple SparkContexts in the same Spark interpreter.
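
For reference, here is roughly what I am attempting inside the interpreter. This is only a sketch under the assumption of Spark 1.x, where a second context in one JVM requires setting spark.driver.allowMultipleContexts=true (and even then is not officially supported); the master URL is hypothetical:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// First context: points at a remote cluster (hypothetical master URL).
val clusterConf = new SparkConf()
  .setAppName("cluster-context")
  .setMaster("spark://master:7077")
val clusterSc = new SparkContext(clusterConf)

// Second context: local mode, for small-data analysis.
// Assumption: spark.driver.allowMultipleContexts=true is needed on
// Spark 1.x to avoid the "Only one SparkContext may be running" error.
val localConf = new SparkConf()
  .setAppName("local-context")
  .setMaster("local[*]")
  .set("spark.driver.allowMultipleContexts", "true")
val localSc = new SparkContext(localConf)
```

I am not sure whether the two contexts can safely coexist in the interpreter's JVM, so any pointers are welcome.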

Zhong
