On Wed, Jan 20, 2016 at 10:47 PM, Sean Owen <so...@cloudera.com> wrote:
> That's not a Spark problem. Your compiler was not available.
It's Spark that uses zinc. It starts it for a build, but alas doesn't stop it afterwards. If you happen to upgrade your JDK without restarting zinc, you *will* run into the issue.

If you say it's not a Spark problem, whose problem could it be? I personally don't use zinc myself. I can't seem to find anyone more guilty than Spark. Sorry.

Jacek

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
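[Editor's note] The situation above (a zinc server left running across a JDK upgrade) can usually be cleared by shutting the stale server down by hand. A minimal sketch, assuming the zinc launcher that Spark's build downloads into `build/` (the exact path and version are illustrative, not taken from the thread):

```shell
# Shut down a stale zinc server left behind by a Spark build, if one exists.
# The build/zinc-*/ location is an assumption based on where Spark's build
# script typically unpacks zinc; adjust for your checkout.
ZINC=$(ls build/zinc-*/bin/zinc 2>/dev/null | head -n 1)
if [ -n "$ZINC" ]; then
  "$ZINC" -shutdown          # ask the running zinc daemon to stop
else
  echo "no bundled zinc found; nothing to shut down"
fi
```

After the server is stopped, the next build starts a fresh zinc under the new JDK.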