That's not a Spark problem. Your compiler was not available.

On Wed, Jan 20, 2016 at 10:44 PM, Jacek Laskowski <ja...@japila.pl> wrote:
> On Wed, Jan 20, 2016 at 8:48 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
>> On Wed, Jan 20, 2016 at 11:46 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>>> /Users/jacek/dev/oss/spark/tags/target/scala-2.11/classes...
>>> [error] Cannot run program "javac": error=2, No such file or directory
>>
>> That doesn't exactly look like a Spark problem.
>
> It was *not* a Spark problem. The issue was that zinc was up while I
> upgraded the JDK, and afterwards it couldn't find the proper binaries.
> When I killed com.typesafe.zinc.Nailgun, the build went fine.
>
> I remember seeing the issue reported in the past, and just when I had
> given up hope of figuring it out without rebooting the machine, the
> idea that zinc was "misconfigured" came to me. `jps -lm` to the rescue!
>
> Sorry for the noise.
>
> Jacek
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
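
For anyone hitting the same error, a minimal sketch of the cleanup described above, assuming zinc is running as a standalone Nailgun server; the restart step and its path are assumptions and depend on how zinc was launched in your environment:

    # list running JVMs with their main class and arguments;
    # look for com.typesafe.zinc.Nailgun in the output
    jps -lm

    # kill the stale zinc server so the build stops using the old JDK paths
    # (replace <pid> with the PID that jps printed for the Nailgun process)
    kill <pid>

    # optionally start a fresh zinc against the new JDK, e.g. using the
    # launcher that Spark's build downloads (path and -start flag assumed,
    # depending on your zinc version and checkout layout):
    ./build/zinc-*/bin/zinc -start

After that, rerunning the same build command should pick up the new javac.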