Jon,

Thanks. I think I've figured it out, actually. It's really simple: one
needs to set spark.executor.extraClassPath to the current value of the
Java class path (the java.class.path system property), and to avoid
using HiveContext, which gives errors about initializing a Derby
database multiple times.
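A minimal sketch of that setup (the configuration keys and the local-cluster master string are Spark's; holding them in a plain Map is purely illustrative — in a real test they would go on a SparkConf, as shown in the comment):

```scala
// Point the executors at the same classpath the test JVM is already
// running with, as described above. On a real SparkConf this would be:
//   new SparkConf()
//     .setMaster("local-cluster[2,1,1024]")
//     .set("spark.executor.extraClassPath", sys.props("java.class.path"))
val executorClassPath = System.getProperty("java.class.path")

val conf = Map(
  "spark.master" -> "local-cluster[2,1,1024]", // 2 workers, 1 core, 1024 MB each
  "spark.executor.extraClassPath" -> executorClassPath
)

// The test JVM always has a non-empty classpath.
println(conf("spark.executor.extraClassPath").nonEmpty)
```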
Take a look at spark-testing-base:
https://github.com/holdenk/spark-testing-base/blob/master/README.md
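A suite using it looks roughly like the following (the SharedSparkContext trait and the provided `sc` are the library's, per its README; the test body and class name here are illustrative — this sketch needs spark-testing-base and ScalaTest on the test classpath):

```scala
import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

// SharedSparkContext supplies a SparkContext as `sc` and handles
// setup/teardown across the suite, so each test can just use it.
class WordCountSpec extends FunSuite with SharedSparkContext {
  test("parallelize and count") {
    val input = List("a", "b", "c")
    val rdd = sc.parallelize(input)
    assert(rdd.count() === input.size)
  }
}
```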
On Apr 17, 2016 10:28 AM, "Evan Chan" wrote:
What I want to find out is how to run local-cluster tests like that
suite, but in your own projects. Has anyone done this?
On Sun, Apr 17, 2016 at 5:37 AM, Takeshi Yamamuro wrote:
Hi,

Is it a bad idea to create a `SparkContext` in `local-cluster` mode
yourself, as in
https://github.com/apache/spark/blob/master/core/src/test/scala/org/apache/spark/ShuffleSuite.scala#L55 ?
// maropu
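For reference, the master URL used in suites like that one encodes the simulated cluster's shape; a small sketch (only the URL format and the commented constructor call are Spark's — the parsing below is just an illustration of the three fields):

```scala
// "local-cluster[numWorkers,coresPerWorker,memoryPerWorkerMB]" is the
// master URL format such tests use; a context is then created as in
// the linked suite:
//   sc = new SparkContext("local-cluster[2,1,1024]", "test", conf)
val master = "local-cluster[2,1,1024]"
val ClusterUrl = """local-cluster\[(\d+),(\d+),(\d+)\]""".r

master match {
  case ClusterUrl(workers, cores, memMb) =>
    println(s"workers=$workers coresPerWorker=$cores memoryPerWorkerMb=$memMb")
  case _ =>
    println(s"not a local-cluster URL: $master")
}
```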
On Sun, Apr 17, 2016 at 9:47 AM, Evan Chan wrote:
Hey folks,
I'd like to use local-cluster mode in my Spark-related projects to
test Spark functionality in an automated way in a simulated local
cluster. The idea is to test multi-process things in a much easier
fashion than setting up a real cluster. However, getting this up and
running in a