This is very cool! Thanks a lot for making this more accessible!
Best,
--
FG
On Wed, Dec 7, 2016 at 11:46 PM Jencir Lee wrote:
> Hello,
>
> We just published the Spectral LDA model on Spark Packages. It’s an
> alternative approach to LDA modelling based on tensor decompositions…
Hello,
We just published the Spectral LDA model on Spark Packages. It’s an alternative
approach to LDA modelling based on tensor decompositions. We first build
the 2nd- and 3rd-order moment tensors from empirical word counts, then
orthogonalise them and perform a decomposition on the 3rd-order moment tensor.
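For reference, the standard moment construction behind spectral LDA (following
Anandkumar et al., "Tensor Decompositions for Learning Latent Variable Models",
JMLR 2014; the package's exact scaling conventions may differ) is roughly:

  M_1 = \mathbb{E}[x_1]
  M_2 = \mathbb{E}[x_1 \otimes x_2] - \frac{\alpha_0}{\alpha_0 + 1} M_1 \otimes M_1
  M_3 = \mathbb{E}[x_1 \otimes x_2 \otimes x_3]
        - \frac{\alpha_0}{\alpha_0 + 2}\Big(\mathbb{E}[x_1 \otimes x_2 \otimes M_1]
          + \mathbb{E}[x_1 \otimes M_1 \otimes x_2]
          + \mathbb{E}[M_1 \otimes x_1 \otimes x_2]\Big)
        + \frac{2\alpha_0^2}{(\alpha_0 + 1)(\alpha_0 + 2)} M_1 \otimes M_1 \otimes M_1

where x_1, x_2, x_3 are single-word (one-hot) observations from a document and
\alpha_0 is the sum of the Dirichlet hyperparameters. M_2 supplies the whitening
("orthogonalisation") transform, and a CP decomposition of the whitened M_3
recovers the topic-word distributions.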
Luciano,
AFAIK the spark-package tool also makes it easy to upload packages to the
spark-packages website. You are of course free to include any Maven
coordinate in the --packages parameter.
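For example, to pull a package published under a Maven coordinate at launch
time (the coordinate below is only a placeholder):

  spark-shell --packages com.example:my-spark-package_2.11:0.1.0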
--jakob
On Fri, Jul 15, 2016 at 1:42 PM, Ismaël Mejía wrote:
> Thanks for the info Burak, I will check
Thanks for the info, Burak. I will check the repo you mention. Do you know
concretely what the 'magic' is that spark-packages needs, or whether there is
any document with info about it?
On Fri, Jul 15, 2016 at 10:12 PM, Luciano Resende wrote:
>
> On Fri, Jul 15, 2016 at 10:48 AM, …
> …is there a formal specification or documentation of what you need to
> include in a spark-package (any special file, manifest, etc.)? I have
> not found any doc on the website.
>
> Thanks,
> Ismael
>
>
>
I was under the impression that spark-packages was more like
Hi Ismael and Jacek,
If you use Maven for building your applications, you may use the
spark-package command line tool
(https://github.com/databricks/spark-package-cmd-tool) to perform packaging.
It requires you to build your jar with Maven first, and then does all the
extra magic that Spark Packages…
+1000
Thanks Ismael for bringing this up! I meant to send it earlier too,
since I've been struggling with an sbt-based Scala project for a Spark
package myself this week and haven't yet found out how to do local
publishing.
If such a guide existed for Maven I could use it for sbt easily too :-)
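For what it's worth, a minimal sbt sketch assuming the sbt-spark-package
plugin (setting and task names are from memory of the plugin README, so
double-check them against https://github.com/databricks/sbt-spark-package):

  // project/plugins.sbt
  resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
  addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.5")

  // build.sbt
  spName := "myorg/my-spark-package"       // Spark Packages name: <github-org>/<repo>
  sparkVersion := "2.0.0"                  // Spark version to compile against
  sparkComponents ++= Seq("sql", "mllib")  // adds spark-sql / spark-mllib as provided deps

  // then `sbt spPublishLocal` for local publishing, `sbt spDist` for a release zip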
Hello, I would like to know if there is an easy way to package a new
spark-package with Maven. I just found this repo, but I am not an sbt user.
https://github.com/databricks/sbt-spark-package
One more question: is there a formal specification or documentation of what
you need to include in a spark-package (any special file, manifest, etc.)? I
have not found any doc on the website.
Hello,
I have some problems modifying the description of some of my packages
on spark-packages.org; I haven't been able to change anything.
I've written to the e-mail address in charge of managing this page,
but I got no answer.
Any clue?
Thanks
--
Hi,
Are there any guidelines or specs for writing a Spark package? I would
like to implement a Spark package and would like to know the way it needs
to be structured (implement some interfaces, etc.) so that it can plug into
Spark for extended functionality.
Could anyone help me point…
…http://spark-packages.org/api/submit-release.
Hope this helps with your last question.
On 16 October 2015 at 08:43, jeff saremi wrote:
> I'm looking for any form of documentation on Spark Packages.
> Specifically, what happens when one issues a command like the following:
>
>
> $SP
…Kafka consumer has been around for a while on spark-packages (
http://spark-packages.org/package/dibbhatt/kafka-spark-consumer ) and I see
many people have started using it. I am now thinking of contributing it back
to the Apache Spark core project so that it can get better support, visibility
and adoption.
Few…
Hey Hector,
It's not a bad idea. I think we'd want to do this by virtue of
allowing custom repositories, so users can add bintray or others.
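For reference, spark-submit and spark-shell already accept extra resolvers at
runtime via --repositories alongside --packages; a sketch with a placeholder
coordinate:

  spark-submit --repositories https://jcenter.bintray.com \
    --packages com.example:my-artifact_2.10:0.1.0 ...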
- Patrick
On Wed, Jun 10, 2015 at 6:23 PM, Hector Yee wrote:
> Hi Spark devs,
>
> Is it possible to add jcenter or bintray support
Hi Spark devs,
Is it possible to add jcenter or bintray support for Spark packages?
I'm trying to add our artifact, which is on jcenter:
https://bintray.com/airbnb/aerosolve
but I noticed that Spark Packages only accepts Maven coordinates.
--
Yee Yang Li Hector
google.com/+Hect
Hey everyone,
I’m looking to develop a package for use with SparkR. This package would
include custom R and Scala code, and I was wondering if anyone had any insight
into how I might be able to use the sbt-spark-package tool to publish something
that needs to include an R package as well as a JAR…
Yup, netlib-lgpl right now is activated through a profile... if we can reuse
the same idea then csparse can also be added to Spark with an LGPL flag. But
again, as Sean said, it's tricky. Better to keep it on Spark Packages for
users to try.
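For context, that profile is switched on at build time with a Maven profile
flag along these lines (profile name as I recall it; check Spark's pom.xml):

  ./build/mvn -Pnetlib-lgpl -DskipTests clean package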
On May 24, 2015 1:36 AM, "Sean Owen" wrote:
> I
…Patrick Wendell wrote:
>
>> Yes - spark packages can include non ASF licenses.
>>
>> On Sat, May 23, 2015 at 6:16 PM, Debasish Das
>> wrote:
>> > Hi,
>> >
>> > Is it possible to add GPL/LGPL code on spark packages or it must be
>> licensed
That's the nice thing about Spark packages. It is just a package index for
libraries and applications built on top of Spark and not part of the Spark
codebase, so it is not restricted to follow only ASF-compatible licenses.
On Sat, May 23, 2015 at 10:12 PM, DB Tsai wrote:
> I thought
I thought LGPL is okay but GPL is not okay for an Apache project.
On Saturday, May 23, 2015, Patrick Wendell wrote:
> Yes - spark packages can include non ASF licenses.
>
> On Sat, May 23, 2015 at 6:16 PM, Debasish Das wrote:
> > Hi,
> >
> > Is it possible to add G
Yes - spark packages can include non ASF licenses.
On Sat, May 23, 2015 at 6:16 PM, Debasish Das wrote:
> Hi,
>
> Is it possible to add GPL/LGPL code to Spark Packages, or must it be licensed
> under Apache as well?
>
> I want to expose Professor Tim Davis's LGPL library
Hi,
Is it possible to add GPL/LGPL code to Spark Packages, or must it be
licensed under Apache as well?
I want to expose Professor Tim Davis's LGPL library for sparse algebra and
the GPL-licensed ECOS library through the package.
Thanks.
Deb
> > …Brand Management or
> > designee.
> >
> > The title on the packages website is "A community index of packages for
> > Apache Spark." Furthermore, the footnote of the website reads "Spark
> > Packages is a community site hosting modules that are not part of Apache Spark…
> …through your
> website, without written approval of the VP, Apache Brand Management or
> designee.
>
> The title on the packages website is "A community index of packages for
> Apache Spark." Furthermore, the footnote of the website reads "Spark
> Packages is a community…
…through your
website, without written approval of the VP, Apache Brand Management or
designee.
The title on the packages website is “A community index of packages for
Apache Spark.” Furthermore, the footnote of the website reads “Spark
Packages is a community site hosting modules that are not part of Apache Spark…
> Dear Spark users and developers,
>
> I’m happy to announce Spark Packages (http://spark-packages.org), a
> community package index to track the growing number of open source
> packages and libraries that work with Apache Spark. Spark Packages
> makes it easy for users to find, discuss, rate, and install packages
> …Would you mind pinging back out when the page is back up?
>
> Thanks!
> Andrew
>
> On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng wrote:
>>
>> Dear Spark users and developers,
>>
>> I'm happy to announce Spark Packages (http://spark-packages.org), a
>>
Hi Xiangrui,
That link is currently returning a 503 Over Quota error message. Would you
mind pinging back out when the page is back up?
Thanks!
Andrew
On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng wrote:
> Dear Spark users and developers,
>
> I’m happy to announce Spark Packages…
Dear Spark users and developers,
I’m happy to announce Spark Packages (http://spark-packages.org), a
community package index to track the growing number of open source
packages and libraries that work with Apache Spark. Spark Packages
makes it easy for users to find, discuss, rate, and install packages…