nvm, figured it out. I compiled my client jar against Spark 2.0.2 while the
Spark deployed on my machines was 2.0.1. Communication problems between the
dev team and the ops team :)
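
For anyone else who hits this: here is a minimal build.sbt sketch of one way
to keep the client jar aligned with the cluster, assuming an sbt build and
that the app only needs spark-core and spark-sql (adjust the modules and the
sparkVersion value to whatever ops actually has deployed):

    // Pin every Spark module to the version running on the cluster so the
    // client jar is never compiled against a newer API than it will run on.
    val sparkVersion = "2.0.1"  // assumption: this matches the deployed cluster

    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      // "provided" scope: compile against these jars, but rely on the
      // cluster's own Spark jars at runtime instead of bundling them.
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
    )

Keeping sparkVersion in a single val means there is only one place to change
when ops upgrades the cluster.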
On Fri, Jan 20, 2017 at 3:03 PM, kant kodali wrote:
Is this because of a versioning issue? Can't wait for the JDK 9 module system;
I am not sure if Spark plans to leverage it.
On Fri, Jan 20, 2017 at 1:30 PM, kant kodali wrote:
> I get the following exception. I am using Spark 2.0.1 and Scala 2.11.8.
>
> org.apache.spark.SparkException: Job aborted du
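
For completeness, a rough Scala sketch of the version sanity check that would
have surfaced the mismatch sooner. VersionCheck and the app name below are
made-up names, and which version each println reports depends on how the jar
is built and submitted:

    import org.apache.spark.sql.SparkSession

    object VersionCheck {
      def main(args: Array[String]): Unit = {
        // Version baked into the Spark jars on this classpath, i.e. what the
        // client jar was compiled/packaged against.
        println(s"classpath Spark version: ${org.apache.spark.SPARK_VERSION}")

        // Version reported by the session actually running the job.
        val spark = SparkSession.builder().appName("version-check").getOrCreate()
        println(s"running Spark version: ${spark.version}")
        spark.stop()
      }
    }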