Hi,

I'm trying to create an application that programmatically submits a jar file to a Spark standalone cluster running on my local PC. However, I always get this error:

WARN TaskSetManager:66 - Lost task 1.0 in stage 0.0 (TID 1, 192.168.2.68, executor 0): java.lang.RuntimeException: Stream '/jars

The relevant part of my code is:

    val oneJar = "/samplesparkmaven/target/sample-spark-maven-one-jar.jar"
    sparkConf.setJars(List(oneJar))
    val sc = new SparkContext(sparkConf)

I'm using Spark 2.1.0 in standalone mode with a master and one worker. Does anyone have an idea where the problem might be or how
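For context, a minimal self-contained sketch of the kind of programmatic submission described above. The master URL, app name, and the trivial job are illustrative placeholders (only the jar path comes from the post), and whether the standalone master is actually reachable at that address is an assumption:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ProgrammaticSubmit {
  def main(args: Array[String]): Unit = {
    // Driver-side configuration; the master host/port here are placeholders.
    val sparkConf = new SparkConf()
      .setAppName("sample-spark-maven")           // hypothetical app name
      .setMaster("spark://127.0.0.1:7077")        // standalone master URL (assumed)
      // Ship the assembled application jar to the executors.
      .setJars(List("/samplesparkmaven/target/sample-spark-maven-one-jar.jar"))

    val sc = new SparkContext(sparkConf)
    try {
      // A trivial job to verify that executors can fetch the jar and run tasks.
      val sum = sc.parallelize(1 to 10).reduce(_ + _)
      println(s"sum = $sum")
    } finally {
      sc.stop()
    }
  }
}
```

Note that `setJars` expects paths (or URLs) that the driver can serve to the workers; a path that does not exist on the driver's filesystem is one common cause of executors failing to fetch the `/jars/...` stream.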