What is the approach to setting custom properties on the SparkConf dynamically
from within a notebook?
I understand Zeppelin already sets up a SparkContext for you, and once a
SparkContext has been created its configuration can't be changed.
I may be missing something that the documentation doesn't mention, perhaps
something to do with the %spark interpreter.
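One way this is commonly handled (a sketch, not from the original post) is to set the properties in Zeppelin's Spark interpreter settings before the context is created, since the elasticsearch-hadoop connector also reads its settings from the SparkConf under a spark.es.* prefix. For example, in the interpreter properties:

# Zeppelin → Interpreter → spark → Properties (a sketch; the spark.es.*
# names assume the elasticsearch-hadoop connector is on the classpath)
spark.es.nodes              192.168.51.50
spark.es.index.auto.create  true

Restarting the interpreter then creates the SparkContext with these properties already applied.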
Specifically, a native Scala Elasticsearch Spark setup would look like:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("sampleapp").setMaster("local")
conf.set("es.index.auto.create", "true")
conf.set("es.nodes", "192.168.51.50")
val sc = new SparkContext(conf)
There may be times when you want to connect to a different IP for a
different cluster.
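For that case, one option (a sketch, assuming the elasticsearch-spark connector is on the classpath) is to override connector settings per operation rather than on the shared SparkContext: the esRDD method accepts a settings map that takes precedence over the context's configuration.

import org.elasticsearch.spark._

// Per-read overrides let one SparkContext talk to different clusters;
// "index/type" and the IP here are illustrative placeholders.
val rdd = sc.esRDD("index/type", Map("es.nodes" -> "192.168.51.60"))

This avoids needing a new SparkContext just to reach a second Elasticsearch cluster.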