Luciano,
afaik the spark-package-tool also makes it easy to upload packages to
spark-packages website. You are of course free to include any maven
coordinate in the --packages parameter
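For example, to pull in a package that is also published as a regular
Maven artifact (spark-csv shown purely as an illustration):

  spark-shell --packages com.databricks:spark-csv_2.11:1.4.0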
--jakob
On Fri, Jul 15, 2016 at 1:42 PM, Ismaël Mejía wrote:
> Thanks for the info Burak, I will check the repo you mention. Do you know
> concretely what the 'magic' is that spark-packages need, or is there any
> document with info about it?
Thanks for the info Burak, I will check the repo you mention. Do you know
concretely what the 'magic' is that spark-packages need, or is there any
document with info about it?
On Fri, Jul 15, 2016 at 10:12 PM, Luciano Resende wrote:
On Fri, Jul 15, 2016 at 10:48 AM, Jacek Laskowski wrote:
> +1000
>
> Thanks Ismael for bringing this up! I meant to have sent it earlier too
> since I've been struggling with a sbt-based Scala project for a Spark
> package myself this week and haven't yet found out how to do local
> publishing.
>
Hi Ismael and Jacek,
If you use Maven for building your applications, you may use the
spark-package command line tool (
https://github.com/databricks/spark-package-cmd-tool) to perform packaging.
It requires you to build your jar using Maven first, and then does all the
extra magic that Spark Packages require.
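For reference, a rough sketch of the workflow as I understand it from the
tool's README (the subcommand names come from the repo; the install method
and exact flags are my assumption, so double-check against the README):

  pip install spark-package    # the tool itself appears to be a Python package
  mvn package                  # build your jar with Maven first
  spark-package zip            # assemble the Spark Packages release artifact
  spark-package publish        # optionally upload it to spark-packages.org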
+1000
Thanks Ismael for bringing this up! I meant to have sent it earlier too
since I've been struggling with a sbt-based Scala project for a Spark
package myself this week and haven't yet found out how to do local
publishing.
If such a guide existed for Maven I could use it for sbt easily too :)
Hello, I would like to know if there is an easy way to package a new
spark-package with Maven. I just found this repo, but I am not an sbt user:
https://github.com/databricks/sbt-spark-package
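From its README, the sbt side seems to boil down to something like this
(setting names as I understand them from that repo; the version numbers
below are just placeholders):

  // project/plugins.sbt
  resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"
  addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.5")

  // build.sbt
  spName := "my_org/my_package"   // hypothetical org/package name
  sparkVersion := "1.6.2"
  sparkComponents += "sql"

so I wonder what the Maven equivalent of those settings would be.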
One more question: is there a formal specification or documentation of what
you need to include in a spark-package?