Hi,

I'm having trouble figuring out where to pass "--conf" spark-submit
arguments to a Spark job service. At this point I don't particularly care
whether the job service uses the same set of arguments for every job
submitted to it; I just can't find where to send these arguments so that
the service actually picks them up.

For reference, I'm using the portable pipeline options (i.e. the portable
runner, a job endpoint, and the DOCKER environment type).
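
Roughly, this is the shape of my setup in Kotlin (the endpoint address is
just a placeholder, and nothing here shows where the spark-submit --conf
values would go, which is exactly my question):

import org.apache.beam.sdk.Pipeline
import org.apache.beam.sdk.options.PipelineOptionsFactory
import org.apache.beam.sdk.options.PortablePipelineOptions

fun main() {
    // Portable pipeline options; the endpoint value is a placeholder
    // for wherever the Spark job service is actually running.
    val options = PipelineOptionsFactory.fromArgs(
        "--runner=PortableRunner",
        "--jobEndpoint=localhost:8099",      // Spark job service endpoint
        "--defaultEnvironmentType=DOCKER"    // SDK harness runs in a Docker container
    ).create().`as`(PortablePipelineOptions::class.java)

    val pipeline = Pipeline.create(options)
    // ... build the pipeline ...
    pipeline.run().waitUntilFinish()
}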

Has anyone tried this, specifically from Java? I'm actually writing in
Kotlin, but all the portable runner examples I've seen are in Python, which
is significantly different from the Java design.

Thanks!
