Re: Publishing of the Spectral LDA model on Spark Packages

2016-12-08 Thread François Garillot
This is very cool! Thanks a lot for making this more accessible! Best, -- FG On Wed, Dec 7, 2016 at 11:46 PM Jencir Lee wrote: > Hello, > We just published the Spectral LDA model on Spark Packages. It’s an alternative approach to LDA modelling based on tensor decompositions…

Publishing of the Spectral LDA model on Spark Packages

2016-12-07 Thread Jencir Lee
Hello, We just published the Spectral LDA model on Spark Packages. It’s an alternative approach to LDA modelling based on tensor decompositions. We first build the 2nd- and 3rd-moment tensors from empirical word counts, then orthogonalise them and perform decomposition on the 3rd-moment tensor…
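[For readers new to the method: the construction sketched above matches the standard method-of-moments recipe for spectral LDA (Anandkumar et al., 2014). A minimal sketch in that notation, assuming a Dirichlet prior \alpha with \alpha_0 = \sum_i \alpha_i and topic-word vectors \mu_i; the exact scaling this package uses may differ:

    % Moments built from empirical word counts (x_1, x_2 are distinct words of a document):
    M_1 = \mathbb{E}[x_1], \qquad
    M_2 = \mathbb{E}[x_1 \otimes x_2] - \frac{\alpha_0}{\alpha_0 + 1}\, M_1 \otimes M_1 .
    % Under the LDA model these moments are low-rank in the topics \mu_i:
    M_2 = \sum_{i=1}^{k} \frac{\alpha_i}{(\alpha_0 + 1)\,\alpha_0}\, \mu_i \otimes \mu_i, \qquad
    M_3 = \sum_{i=1}^{k} \frac{2\alpha_i}{(\alpha_0 + 2)(\alpha_0 + 1)\,\alpha_0}\,
          \mu_i \otimes \mu_i \otimes \mu_i .
    % "Orthogonalise": choose a whitening matrix W with W^\top M_2 W = I, then
    % decompose the whitened 3rd-moment tensor to recover the topics:
    T = M_3(W, W, W) = \sum_{i=1}^{k} \lambda_i\, v_i^{\otimes 3} .

The eigenpairs (\lambda_i, v_i) of T, typically found by tensor power iteration, are then un-whitened to give the topic-word distributions.]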

Re: spark-packages with maven

2016-07-19 Thread Jakob Odersky
Luciano, AFAIK the spark-package-tool also makes it easy to upload packages to the spark-packages website. You are of course free to include any Maven coordinate in the --packages parameter. --jakob On Fri, Jul 15, 2016 at 1:42 PM, Ismaël Mejía wrote: > Thanks for the info Burak, I will check…
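[A side note for readers: the --packages flag Jakob mentions also has a configuration-key equivalent, spark.jars.packages, in Spark 2.x. A minimal sketch; the coordinate below is hypothetical and purely illustrative:

    import org.apache.spark.sql.SparkSession

    // Any Maven coordinate resolvable from Maven Central (or from extra
    // repositories) works here; it does not have to be registered on
    // spark-packages.org. The coordinate is a placeholder.
    val spark = SparkSession.builder()
      .appName("packages-demo")
      .config("spark.jars.packages", "com.example:my-spark-lib_2.11:0.1.0")
      .getOrCreate()

On the command line the same thing reads: spark-submit --packages com.example:my-spark-lib_2.11:0.1.0.]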

Re: spark-packages with maven

2016-07-15 Thread Ismaël Mejía
Thanks for the info Burak, I will check the repo you mention. Do you know concretely what the 'magic' is that spark-packages needs, or whether there is any document with info about it? On Fri, Jul 15, 2016 at 10:12 PM, Luciano Resende wrote: > On Fri, Jul 15, 2016 at 10:48 AM,…

Re: spark-packages with maven

2016-07-15 Thread Luciano Resende
…is there a formal specification or documentation of > what you need to include in a spark-package (any special file, manifest, etc.)? > I have not found any doc on the website. > Thanks, > Ismael … I was under the impression that spark-packages was more like…

Re: spark-packages with maven

2016-07-15 Thread Burak Yavuz
Hi Ismael and Jacek, If you use Maven for building your applications, you may use the spark-package command line tool (https://github.com/databricks/spark-package-cmd-tool) to perform packaging. It requires you to build your jar using Maven first, and then does all the extra magic that Spark Packages…

Re: spark-packages with maven

2016-07-15 Thread Jacek Laskowski
+1000 Thanks Ismael for bringing this up! I meant to have sent it earlier too, since I've been struggling with an sbt-based Scala project for a Spark package myself this week and haven't yet found out how to do local publishing. If such a guide existed for Maven I could use it for sbt easily too :-)
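[For anyone landing on this thread with the same question: a minimal build.sbt sketch using the sbt-spark-package plugin linked in the original mail. The setting names follow that plugin's README; the organisation, package name and versions are placeholders:

    // build.sbt -- assumes sbt-spark-package is added in project/plugins.sbt,
    // e.g. addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "<version>").
    // All concrete values below are illustrative.
    spName := "myorg/my-spark-package"        // name on spark-packages.org
    sparkVersion := "1.6.2"                   // Spark version to build against
    sparkComponents ++= Seq("sql", "mllib")   // adds spark-sql/spark-mllib as provided deps

With that in place, sbt spPublishLocal should publish to the local ivy repository under the spark-packages naming convention, and sbt spDist should build the release zip the website expects; check the plugin README for the authoritative task list.]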

spark-packages with maven

2016-07-15 Thread Ismaël Mejía
Hello, I would like to know if there is an easy way to package a new spark-package with Maven. I just found this repo, but I am not an sbt user: https://github.com/databricks/sbt-spark-package One more question: is there a formal specification or documentation of what you need to include in a…

Modify text in spark-packages

2016-02-23 Thread Sergio Ramírez
Hello, I am having problems modifying the description of some of my packages on spark-packages.org; I haven't been able to change anything. I've written to the e-mail address in charge of managing this page, but I got no answer. Any clue? Thanks

Guidelines for writing SPARK packages

2016-02-01 Thread Praveen Devarao
Hi, Are there any guidelines or specs for writing a Spark package? I would like to implement a Spark package and would like to know how it needs to be structured (implement some interfaces, etc.) so that it can plug into Spark for extended functionality. Could anyone help me point…

Re: Insight into Spark Packages

2015-10-16 Thread Jakob Odersky
…"http://spark-packages.org/api/submit-release". Hope this helps with your last question. On 16 October 2015 at 08:43, jeff saremi wrote: > I'm looking for any form of documentation on Spark Packages. > Specifically, what happens when one issues a command like the following: > $SP…

Contributing Receiver based Low Level Kafka Consumer from Spark-Packages to Apache Spark Project

2015-10-14 Thread Dibyendu Bhattacharya
…Kafka consumer has been around for a while on spark-packages (http://spark-packages.org/package/dibbhatt/kafka-spark-consumer) and I see many people have started using it. I am now thinking of contributing it back to the Apache Spark core project so that it can get better support, visibility and adoption. A few…

Re: Jcenter / bintray support for spark packages?

2015-06-10 Thread Patrick Wendell
Hey Hector, It's not a bad idea. I think we'd want to do this by virtue of allowing custom repositories, so users can add bintray or others. - Patrick On Wed, Jun 10, 2015 at 6:23 PM, Hector Yee wrote: > Hi Spark devs, > > Is it possible to add jcenter or bintray support
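[For context, later Spark versions expose this at dependency-resolution time: spark-submit's --repositories flag (configuration key spark.jars.repositories) adds extra resolvers that --packages coordinates are fetched from. A sketch; both the repository URL and the coordinate are illustrative, not Aerosolve's real ones:

    import org.apache.spark.sql.SparkSession

    // Extra Maven repositories searched in addition to Maven Central;
    // both values below are hypothetical placeholders.
    val spark = SparkSession.builder()
      .config("spark.jars.repositories", "https://dl.bintray.com/example/maven")
      .config("spark.jars.packages", "com.example:some-lib_2.11:0.1.0")
      .getOrCreate()]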

Jcenter / bintray support for spark packages?

2015-06-10 Thread Hector Yee
Hi Spark devs, Is it possible to add jcenter or bintray support for Spark packages? I'm trying to add our artifact, which is on jcenter (https://bintray.com/airbnb/aerosolve), but I noticed that Spark packages only accepts Maven coordinates. -- Yee Yang Li Hector

Spark Packages: using sbt-spark-package tool with R

2015-06-04 Thread Chris Freeman
Hey everyone, I’m looking to develop a package for use with SparkR. This package would include custom R and Scala code, and I was wondering if anyone had any insight into how I might be able to use the sbt-spark-package tool to publish something that needs to include an R package as well as a JAR…

Re: spark packages

2015-05-24 Thread Debasish Das
Yup, netlib-lgpl right now is activated through a profile... if we can reuse the same idea then csparse could also be added to Spark behind an lgpl flag. But again, as Sean said, it's tricky. Better to keep it on spark packages for users to try. On May 24, 2015 1:36 AM, "Sean Owen" wrote: > I…

Re: spark packages

2015-05-24 Thread Sean Owen
…Patrick Wendell wrote: >> Yes - spark packages can include non-ASF licenses. >> On Sat, May 23, 2015 at 6:16 PM, Debasish Das wrote: >> > Hi, >> > Is it possible to add GPL/LGPL code on spark packages or must it be licensed…

Re: spark packages

2015-05-23 Thread Reynold Xin
That's the nice thing about Spark packages: it is just a package index for libraries and applications built on top of Spark, not part of the Spark codebase, so it is not restricted to ASF-compatible licenses. On Sat, May 23, 2015 at 10:12 PM, DB Tsai wrote: > I thought…

Re: spark packages

2015-05-23 Thread DB Tsai
I thought LGPL is okay but GPL is not okay for an Apache project. On Saturday, May 23, 2015, Patrick Wendell wrote: > Yes - spark packages can include non-ASF licenses. > On Sat, May 23, 2015 at 6:16 PM, Debasish Das wrote: > > Hi, > > Is it possible to add G…

Re: spark packages

2015-05-23 Thread Patrick Wendell
Yes - spark packages can include non-ASF licenses. On Sat, May 23, 2015 at 6:16 PM, Debasish Das wrote: > Hi, > Is it possible to add GPL/LGPL code on spark packages or must it be licensed > under Apache as well? > I want to expose Professor Tim Davis's LGPL library…

spark packages

2015-05-23 Thread Debasish Das
Hi, Is it possible to add GPL/LGPL code on spark packages, or must it be licensed under Apache as well? I want to expose Professor Tim Davis's LGPL library for sparse algebra and the GPL-licensed ECOS library through the package. Thanks. Deb

Re: Announcing Spark Packages

2014-12-22 Thread Nicholas Chammas
…Brand Management or > designee. > > The title on the packages website is "A community index of packages for > Apache Spark." Furthermore, the footnote of the website reads "Spark > Packages is a community site hosting modules that are not part of Apache Spark…

Re: Announcing Spark Packages

2014-12-22 Thread Patrick Wendell
…through your > website, without written approval of the VP, Apache Brand Management or > designee. > The title on the packages website is "A community index of packages for > Apache Spark." Furthermore, the footnote of the website reads "Spark > Packages is a community…

Re: Announcing Spark Packages

2014-12-22 Thread Nicholas Chammas
…through your website, without written approval of the VP, Apache Brand Management or designee. The title on the packages website is “A community index of packages for Apache Spark.” Furthermore, the footnote of the website reads “Spark Packages is a community site hosting modules that are not part of Apache Spark…

Re: Announcing Spark Packages

2014-12-22 Thread Hitesh Shah
…developers, > I’m happy to announce Spark Packages (http://spark-packages.org), a > community package index to track the growing number of open source > packages and libraries that work with Apache Spark. Spark Packages > makes it easy for users to find, discuss, rate, and install packages…

Re: Announcing Spark Packages

2014-12-22 Thread Patrick Wendell
…mind pinging back out when the page is back up? > Thanks! > Andrew > On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng wrote: >> Dear Spark users and developers, >> I'm happy to announce Spark Packages (http://spark-packages.org), a…

Re: Announcing Spark Packages

2014-12-22 Thread Andrew Ash
Hi Xiangrui, That link is currently returning a 503 Over Quota error message. Would you mind pinging back out when the page is back up? Thanks! Andrew On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng wrote: > Dear Spark users and developers, > > I’m happy to announce Spark Packages…

Announcing Spark Packages

2014-12-22 Thread Xiangrui Meng
Dear Spark users and developers, I’m happy to announce Spark Packages (http://spark-packages.org), a community package index to track the growing number of open source packages and libraries that work with Apache Spark. Spark Packages makes it easy for users to find, discuss, rate, and install packages…