Re: spark-packages with maven

2016-07-19 Thread Jakob Odersky
Luciano, afaik the spark-package-tool also makes it easy to upload packages to the spark-packages website. You are of course free to include any Maven coordinate in the --packages parameter. --jakob On Fri, Jul 15, 2016 at 1:42 PM, Ismaël Mejía wrote: > Thanks for the info Burak, I will check the repo …
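
For reference, the --packages flag Jakob mentions takes ordinary Maven coordinates in groupId:artifactId:version form and resolves them at launch time (from the local repositories, Maven Central, and the spark-packages repository). A minimal sketch; the coordinate, class, and jar names below are placeholders, not taken from the thread:

    # Pull a published dependency by Maven coordinate at submit time;
    # substitute your own groupId:artifactId:version, main class, and jar.
    spark-submit \
      --packages com.databricks:spark-csv_2.10:1.4.0 \
      --class com.example.MyApp \
      target/my-app.jar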

Re: spark-packages with maven

2016-07-15 Thread Ismaël Mejía
Thanks for the info Burak, I will check the repo you mention. Do you know concretely what the 'magic' is that spark-packages needs, or whether there is any document with info about it? On Fri, Jul 15, 2016 at 10:12 PM, Luciano Resende wrote: > > On Fri, Jul 15, 2016 at 10:48 AM, Jacek Laskowski wrote: …

Re: spark-packages with maven

2016-07-15 Thread Luciano Resende
On Fri, Jul 15, 2016 at 10:48 AM, Jacek Laskowski wrote: > +1000 > > Thanks Ismael for bringing this up! I meant to have sent it earlier too > since I've been struggling with an sbt-based Scala project for a Spark > package myself this week and haven't yet found out how to do local > publishing. > …

Re: spark-packages with maven

2016-07-15 Thread Burak Yavuz
Hi Ismael and Jacek, If you use Maven for building your applications, you may use the spark-package command line tool (https://github.com/databricks/spark-package-cmd-tool) to perform packaging. It requires you to build your jar using Maven first, and then does all the extra magic that Spark Packages …
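
In other words, the packaging splits into two steps: Maven produces the jar, and the spark-package tool takes it from there. A rough sketch of that workflow; only the Maven step is spelled out, since the tool's sub-commands and options belong to its own README and are not reproduced here:

    # 1. Build the artifact with Maven; the spark-package tool expects an already-built jar.
    mvn clean package
    # 2. Point the spark-package command-line tool from
    #    https://github.com/databricks/spark-package-cmd-tool at target/<artifact>.jar;
    #    see that repository for its actual sub-commands and flags.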

Re: spark-packages with maven

2016-07-15 Thread Jacek Laskowski
+1000 Thanks Ismael for bringing this up! I meant to have sent it earlier too since I've been struggling with an sbt-based Scala project for a Spark package myself this week and haven't yet found out how to do local publishing. If such a guide existed for Maven I could use it for sbt easily too :-)
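
For what it's worth, one way to do the local publishing Jacek describes is sbt's built-in publish tasks; a minimal sketch, with a made-up coordinate purely for illustration:

    # Publish the package into the local Ivy repository (~/.ivy2/local) ...
    sbt publishLocal
    # ... or into the local Maven repository (~/.m2/repository):
    sbt publishM2
    # The locally published artifact should then be resolvable by coordinate,
    # since --packages resolution also consults the local Maven and Ivy repositories:
    spark-shell --packages com.example:my-spark-package_2.10:0.1.0-SNAPSHOT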

Re: spark packages

2015-05-24 Thread Debasish Das
Yup, netlib-lgpl right now is activated through a profile... if we can reuse the same idea, then csparse can also be added to Spark with an LGPL flag. But again, as Sean said, it's tricky. Better to keep it on Spark Packages for users to try. On May 24, 2015 1:36 AM, "Sean Owen" wrote: > I don't believe …
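
For context, the profile Debasish refers to is Spark's optional netlib-lgpl Maven build profile: the LGPL netlib-java bindings are only pulled into the MLlib build when that profile is explicitly enabled, roughly as sketched below:

    # Build Spark with the opt-in LGPL netlib bindings enabled:
    ./build/mvn -Pnetlib-lgpl -DskipTests clean package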

Re: spark packages

2015-05-24 Thread Sean Owen
I don't believe we are talking about adding things to the Apache project, but incidentally LGPL is not OK in Apache projects either. On May 24, 2015 6:12 AM, "DB Tsai" wrote: > I thought LGPL is okay but GPL is not okay for an Apache project. > > On Saturday, May 23, 2015, Patrick Wendell wrote: > > …

Re: spark packages

2015-05-23 Thread Reynold Xin
That's the nice thing about Spark Packages. It is just a package index for libraries and applications built on top of Spark and not part of the Spark codebase, so it is not restricted to follow only ASF-compatible licenses. On Sat, May 23, 2015 at 10:12 PM, DB Tsai wrote: > I thought LGPL is okay …

Re: spark packages

2015-05-23 Thread DB Tsai
I thought LGPL is okay but GPL is not okay for an Apache project. On Saturday, May 23, 2015, Patrick Wendell wrote: > Yes - Spark packages can include non-ASF licenses. > > On Sat, May 23, 2015 at 6:16 PM, Debasish Das > wrote: > > Hi, > > > > Is it possible to add GPL/LGPL code on Spark Packages …

Re: spark packages

2015-05-23 Thread Patrick Wendell
Yes - Spark packages can include non-ASF licenses. On Sat, May 23, 2015 at 6:16 PM, Debasish Das wrote: > Hi, > > Is it possible to add GPL/LGPL code on Spark Packages, or must it be licensed > under Apache as well? > > I want to expose Professor Tim Davis's LGPL library for sparse algebra and > …