That works.
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/possible-bug-user-SparkConf-properties-not-copied-to-worker-process-tp13665p13689.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
Is this through Java system properties? For Java system properties, you can
pass them to the executors using spark.executor.extraJavaOptions.
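As a sketch of that suggestion: spark.executor.extraJavaOptions is a real Spark setting for extra JVM options on executors, and -D flags become Java system properties there. The property name schema.registry.url and the host below are hypothetical examples, not from this thread.

```
# spark-defaults.conf style (illustrative values)
spark.executor.extraJavaOptions  -Dschema.registry.url=http://registry.example.com:8081
```

Executor-side code could then read the value with System.getProperty("schema.registry.url").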
On Thu, Aug 13, 2015 at 2:11 PM, rfarrjr wrote:
> Thanks for the response.
>
> In this particular case we passed a url that would be leveraged when
> configuring some serialization support for Kryo. [...]
Thanks for the response.
In this particular case we passed a URL that would be leveraged when
configuring some serialization support for Kryo. We are using a schema
registry and leveraging it to efficiently serialize Avro objects without the
need to register specific records or schemas up front.
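For context, custom Kryo serialization in Spark is typically wired up through configuration like the fragment below. The keys are real Spark settings; the registrator class name is a hypothetical stand-in for whatever class installs the schema-registry-backed serializers described above.

```
# spark-defaults.conf style (com.example.AvroKryoRegistrator is hypothetical)
spark.serializer        org.apache.spark.serializer.KryoSerializer
spark.kryo.registrator  com.example.AvroKryoRegistrator
```

The registrator runs on each executor's JVM, which is why any URL it needs must itself reach the executors, the crux of this thread.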
That was intentional - what's your use case that requires configs not
starting with "spark."?
On Thu, Aug 13, 2015 at 8:16 AM, rfarrjr wrote:
> Ran into an issue setting a property on the SparkConf that wasn't made
> available on the worker. After some digging[1] I noticed that only
> properties that start with "spark." ...
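The propagation rule discussed above can be illustrated with a toy sketch. This is not Spark's actual source, just an assumption-labeled model of the behavior: only keys prefixed with "spark." are forwarded to the worker process.

```python
# Toy model (illustrative only, not Spark's code) of the rule in this thread:
# only configuration keys starting with "spark." reach the executors.
def executor_visible(conf):
    """Return only the entries whose key starts with the 'spark.' prefix."""
    return {k: v for k, v in conf.items() if k.startswith("spark.")}

driver_conf = {
    "spark.app.name": "demo",
    "spark.executor.memory": "2g",
    "schema.registry.url": "http://registry.example.com:8081",  # hypothetical custom key
}
print(executor_visible(driver_conf))
# → {'spark.app.name': 'demo', 'spark.executor.memory': '2g'}
```

This is why the workaround above moves custom values under spark.executor.extraJavaOptions: that key itself starts with "spark." and so survives the filter.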