Install your custom Spark jar into your local Maven or Ivy repository, then depend on that custom jar from your pom.xml/build.sbt.
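For example, something like this should work (a sketch — the jar path, version, and Scala suffix below are placeholders for whatever your custom build actually produces):

```shell
# Install a locally built Spark jar into the local Maven repo (~/.m2),
# so that Maven/sbt builds can resolve it like any other dependency.
mvn install:install-file \
  -Dfile=assembly/target/spark-core_2.10-1.0.0-custom.jar \
  -DgroupId=org.apache.spark \
  -DartifactId=spark-core_2.10 \
  -Dversion=1.0.0-custom \
  -Dpackaging=jar
```

Then reference the same coordinates in your build, e.g. in sbt: `libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.0.0-custom"`. If you built Spark from source with Maven, running `mvn install` from the Spark source tree should also publish the modules to your local repo directly.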
> On May 15, 2014, at 3:28 AM, Andrei wrote:
>
> (Sorry if you have already seen this message - it seems like there were some
> issues delivering messages to the list yesterday)
>
> We can create a standalone Spark application by simply adding
> "spark-core_2.x" to build.sbt/pom.xml and connecting it to the Spark master.
> We can also compile a custom version of Spark (e.g. compiled against Hadoop
> 2.x) from source and deploy it to the cluster manually.
>
> But what is a proper way to use _custom Spark builds_ in a standalone application?