Thanks for the guidance! Setting the --driver-java-options in spark-shell
instead of SPARK_MASTER_OPTS made the debugger connect to the right JVM. My
breakpoints get hit now.
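For anyone finding this thread later: a minimal sketch of what worked for me, assuming the standard JDWP agent syntax (the port 5005 is my arbitrary choice):

```shell
# Attach a debugger to the *driver* JVM started by spark-shell.
# suspend=y makes the driver wait until the debugger connects,
# so breakpoints in DAGScheduler/TaskScheduler are not missed at startup.
spark-shell --driver-java-options \
  "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"
```

Then connect a remote-debug session from the IDE to localhost:5005.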
nirandap [via Apache Spark Developers List] <
ml-node+s1001551n18145...@n3.nabble.com> wrote on Fri, Jul 1, 2016 at
04:3
Guys,
Don't the TaskScheduler and DAGScheduler reside in the SparkContext? So
the debug configs need to be set in the JVM where the SparkContext is
running? [1]
But yes, I agree: if you really need to inspect task execution, you need to
set those configs on the executors [2]
[1]
https://jaceklask
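To make the executor case concrete, here is a sketch of how the debug agent can be passed to executor JVMs instead; the port number is an assumption, and with multiple executors per host the fixed port will collide, so this is only practical for a single local executor:

```shell
# Attach the debug agent to *executor* JVMs via spark.executor.extraJavaOptions.
# suspend=n lets executors start without blocking on a debugger connection.
spark-shell --conf \
  "spark.executor.extraJavaOptions=-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5006"
```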
Yes, scheduling is centralized in the driver.
For debugging task execution, I think you'd want to set the executor JVM
flags, not the worker JVM flags.
On Thu, Jun 30, 2016 at 11:36 AM, cbruegg wrote:
> Hello everyone,
>
> I'm a student assistant in research at the University of Paderborn, working
> on integrating