Oh, OK. So, out of the box, local execution depends on each
developer's own setup, right? I did change the code to set the master
to "local[*]", but I am still getting NoClassDefFoundError.
If that's the case, then I think I'm OK...
Thanks,
Ron
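
(A sketch of the change described above, assuming the Spark 1.x Scala
API; the app name is a placeholder. Note that setting the master only
chooses where the job runs; if spark-core is still provided-scope, it
is missing from the run classpath, which is what produces the
NoClassDefFoundError.)

    import org.apache.spark.{SparkConf, SparkContext}

    // Run in-process on all local cores instead of on a cluster.
    val conf = new SparkConf()
      .setAppName("MyExample")  // placeholder name
      .setMaster("local[*]")
    val sc = new SparkContext(conf)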
On 08/24/2014 04:21 PM, Sean Owen wrote:
The examples aren't quite runnable like this. They are intended to be
submitted to a cluster with spark-submit, which would, among other
things, provide Spark at runtime.
I think you might get them to run this way if you set the master to
"local[*]" and indeed made a run profile that also included Spark.
Hi,
After getting the code base to compile, I tried running some of the
Scala examples.
They all fail because classes like SparkConf can't be found at runtime
(NoClassDefFoundError).
If I change the .iml file to convert the dependency scope from
PROVIDED to COMPILE, I am able to run them. It's easy to do with the
following in the
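
(To illustrate what runs after such a scope change: a minimal,
self-contained Spark 1.x program in the style of the bundled examples.
This is only a sketch, not the snippet the message above was quoting;
the object and app names are made up.)

    import org.apache.spark.{SparkConf, SparkContext}

    object LocalPi {
      def main(args: Array[String]): Unit = {
        // With spark-core on the compile classpath, SparkConf resolves
        // and local mode works without spark-submit.
        val conf = new SparkConf().setAppName("LocalPi").setMaster("local[*]")
        val sc = new SparkContext(conf)
        val n = 100000
        // Monte Carlo estimate of pi, mirroring the stock SparkPi example.
        val count = sc.parallelize(1 to n).map { _ =>
          val x = math.random * 2 - 1
          val y = math.random * 2 - 1
          if (x * x + y * y < 1) 1 else 0
        }.reduce(_ + _)
        println("Pi is roughly " + 4.0 * count / n)
        sc.stop()
      }
    }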