Hi Dongjoon,
thanks for the confirmation.
I have added the Apache release repository to my project, so it fetches
the jars from there rather than from Maven Central.
That is a great workaround until Maven Central has resolved the issue.
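For reference, a minimal sketch of that workaround in a project pom, assuming the standard Apache Nexus releases URL (the repository id is arbitrary):
```
<repositories>
  <repository>
    <!-- id is arbitrary; the URL is the Apache Nexus releases repository -->
    <id>apache-releases</id>
    <url>https://repository.apache.org/content/repositories/releases/</url>
  </repository>
</repositories>
```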
Cheers,
Enrico
On 19.04.23 at 03:04, Dongjoon Hyun wrote:
Thank you for reporting, Enrico.
I verified your issue report and also double-checked that both the original
official Apache repository and the Google Maven Mirror work correctly. Given that,
it could be due to some transient issue in how the artifacts are copied from the
Apache repository to Maven Central.
Any suggestions on how to fix or use the Spark 3.2.4 (Scala 2.13) release?
Cheers,
Enrico
On 17.04.23 at 08:19, Enrico Minack wrote:
Hi,
thanks for the Spark 3.2.4 release.
I have found that Maven does not serve the spark-parent_2.13 pom file.
It is listed in the directory:
https://repo1.maven.org/maven2/org/apache/spark/spark-parent_2.13/3.2.4/
But it cannot be downloaded:
https://repo1.maven.org/maven2/org/apache/spark
I think the scala plugin upgrade may be good, but the bouncy castle one is
needed. 1.70 is the most recent, afaik.
On Thu, 8 Dec 2022 at 06:59, Yang,Jie(INF) wrote:
Steve, after some investigation, I think this problem may not be related to
`scala-maven-plugin`. We can add the following two test dependencies to the
`sql/core` module to make the mvn build successful:
```
<dependency>
  <groupId>org.bouncycastle</groupId>
  <artifactId>bcprov-jdk15on</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.bouncycastle</groupId>
  <!-- second artifact truncated in the original message -->
```
I think we can try scala-maven-plugin 4.8.0
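A sketch of what pinning that version would look like in the pom; the coordinates are the plugin's usual ones (they also appear in the error messages quoted further down this digest):
```
<plugin>
  <!-- pin the suggested version of the Scala compiler plugin -->
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.8.0</version>
</plugin>
```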
From: Steve Loughran
Date: Tuesday, December 6, 2022, 18:19
To: "Yang,Jie(INF)"
Cc: Hyukjin Kwon, Apache Spark Dev
Subject: Re: maven build failing in spark sql w/BouncyCastleProvider CNFE
On Tue, 6 Dec 2022 at 04:10, Yang,Jie(INF) wrote:
The 4.8.0 plugin would be worth trying... not something I'll do
this week, as I'm really trying to get the RC0 out rather than anything else.
> From: Hyukjin Kwon
> Date: Tuesday, December 6, 2022, 10:27
> Cc: Apache Spark Dev
> Subject: Re: maven build failing in spark sql
Steve, did the compile failure happen when building Spark master with mvn
against hadoop 3.4.0-SNAPSHOT?
From: Hyukjin Kwon
Date: Tuesday, December 6, 2022, 10:27
Cc: Apache Spark Dev
Subject: Re: maven build failing in spark sql w/BouncyCastleProvider CNFE
Steve, does the lower version of the scala plugin work for you? If that solves
it, we could temporarily downgrade for now.
On Mon, 5 Dec 2022 at 22:23, Steve Loughran wrote:
Trying to build spark master w/ hadoop trunk, and the maven sbt plugin is
failing. This doesn't happen with the 3.3.5 RC0.
I note that the only mention of this anywhere was me in March.
Clearly something in hadoop trunk has changed in a way which is
incompatible.
Has anyone else tried s
Hi, team.
I ran the maven command to run the unit tests, and got an NPE.
Command: ./build/mvn test
(see
https://spark.apache.org/docs/latest/building-spark.html#running-tests)
The NPE is as follows:
22/05/20 16:32:45.450 main WARN AbstractChannelHandlerContext: Failed to
mark a promise as failure
I've found the problem!
It was indeed a local thingy!
$ cat ~/.mavenrc
MAVEN_OPTS='-XX:+TieredCompilation -XX:TieredStopAtLevel=1'
I added this some time ago; it optimizes the build time. But it seems it
also overrides the MAVEN_OPTS env var...
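If the intent is to keep both, appending instead of overwriting in ~/.mavenrc should work, e.g. MAVEN_OPTS="$MAVEN_OPTS -XX:+TieredCompilation -XX:TieredStopAtLevel=1" (just the usual shell idiom, sketched).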
Now it fails with a different error.
at java.lang.invoke.CallSite.makeSite (CallSite.java:324)
at java.lang.invoke.MethodHandleNatives.linkCallSiteImpl (MethodHandleNatives.java:307)
at java.lang.invoke.MethodHandleNatives.linkCallSite (MethodHandleNatives.java:297)
at scala.tools.nsc.typechecker.Typers$Typer.typedBlock (Typers.scala:2504)
at scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103 (Typers.scala:5711)
at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1 (Typers.scala:500)
at scala.tools.nsc.typechecker.Typers$Typer.typed (Typers.scala:5746)
at scala.tools.nsc.typechecker.Typers$Typer.typed (Typers.scala:5781)
I have played a lot with the scala-maven-plugin jvmArg settings at [1] but
so far nothing helps.
Same error for Scala 2.12 and 2.13.
The command I use is: ./build/mvn install -Pkubernetes -DskipTests
I need to
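For context, jvmArg settings of the kind referenced at [1] go under the scala-maven-plugin configuration in the pom; a sketch with purely illustrative values, not the actual settings tried above:
```
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <configuration>
    <jvmArgs>
      <!-- example values only; tune stack/heap for scalac -->
      <jvmArg>-Xss128m</jvmArg>
      <jvmArg>-Xmx4g</jvmArg>
    </jvmArgs>
  </configuration>
</plugin>
```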
taFile()), and the need for those shims goes away.
What the module also does is import the relevant hadoop-aws and hadoop-azure
modules etc. and strip out anything which complicates life. When published
to the maven repo, apps can import it downstream and get a consistent
set of hadoop-* artifacts, and the AWS artifacts which th
Hi, All.
The Apache Spark community has started publishing Scala 2.13 Maven artifacts daily:
https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-core_2.13/3.2.0-SNAPSHOT/
It aims to encourage more tests on Scala 2.13 (and Scala 3) and to identify
issues in advance.
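To consume the nightlies, a downstream pom needs the Apache snapshots repository plus a -SNAPSHOT version; a minimal sketch, with the repository URL and artifact taken from the link above:
```
<!-- under <repositories> -->
<repository>
  <id>apache-snapshots</id>
  <url>https://repository.apache.org/content/repositories/snapshots/</url>
  <snapshots><enabled>true</enabled></snapshots>
</repository>
<!-- under <dependencies> -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.13</artifactId>
  <version>3.2.0-SNAPSHOT</version>
</dependency>
```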
+1 for the proposal.
Tom
On Tuesday, January 21, 2020, 04:37:04 PM CST, Sean Owen wrote:
See https://github.com/apache/spark/pull/27307 for some context. We've
had to add, in at least one place, some settings to resolve artifacts
from a mirror besides Maven Central to work around some build
problems.
Now, we find it might be simpler to just use this mirror as the
primary repo in the build, falling back to Central if needed.
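For illustration, "mirror as primary, Central as fallback" is mostly repository ordering in the pom; a sketch with a placeholder mirror URL (the actual mirror is in the PR above):
```
<repositories>
  <repository>
    <id>mirror</id>
    <!-- placeholder URL; see the linked PR for the real mirror -->
    <url>https://example-mirror.example.org/maven2/</url>
  </repository>
  <repository>
    <id>central</id>
    <url>https://repo.maven.apache.org/maven2/</url>
  </repository>
</repositories>
```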
Hi, Saurabh.
It seems that you are hitting
https://issues.apache.org/jira/browse/SPARK-26095 .
And, we disabled the parallel build via
https://github.com/apache/spark/pull/23061 at 3.0.0.
According to the stack trace in JIRA and PR description,
`maven-shade-plugin` seems to be the root cause.
For now, I'd like to recommend you to disable it, because `Maven` itself warns
that the plugin is not thread-safe in parallel builds.
Hi Sean,
Thanks for checking this.
I am able to see the parallel build info in the readme file
https://github.com/apache/spark#building-spark :
"You can build Spark using more than one thread by using the -T option with
Maven, see "Parallel builds in Maven 3"
<https://cwiki.apache.org/confluence/display/MAVEN/Parallel+builds+in+Maven+3>"
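(For the record, that option is e.g. ./build/mvn -T 4 -DskipTests package for four threads, though note the reply below about the Spark build not tolerating it.)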
I don't believe you can use a parallel build indeed. Some things
collide with each other. Some of the suites are run in parallel inside
the build though already.
On Fri, Jan 17, 2020 at 1:23 PM Saurabh Chawla wrote:
Hi All,
Spark master build hangs when using the parallel build option in maven. When
running the build sequentially on spark master using maven, the build did not hang.
This issue occurs when giving the hadoop-provided (-Phadoop-provided
-Dhadoop.version=2.8.5) option. The same command works fine to build
spark-2.4.3
ng an hour.
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-3.2/lastStableBuild/testReport/org.apache.spark.sql/SQLQueryTestSuite/
Well, at least it seems like we legitimately need these full
integration tests to run that long, and that's just life right now, so
8 hours
Just chatted w/ Sean privately and I'm going to up the test timeouts to
480 mins (8 hours).
I still don't like this but at least it should hopefully get things green again.
On Mon, Oct 7, 2019 at 11:31 AM Shane Knapp wrote:
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-2.7/
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-2.7-ubuntu-testing/
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-3.2/
https://amplab.cs.berkeley.edu/jenkins/job
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-maven-snapshots/
https://amplab.cs.berkeley.edu/jenkins/job/spark-branch-2.4-maven-snapshots/
I created dry-run test builds and everything looked great. Please
file a JIRA if anything published by these jobs looks fishy.
shane
--
Shane
(Hmm, what is spark-...@apache.org?)
From: Sean Owen
Sent: Tuesday, September 3, 2019 11:58:30 AM
To: Xiao Li
Cc: Tom Graves ; spark-...@apache.org
Subject: Re: maven 3.6.1 removed from apache maven repo
It's because build/mvn only queries ASF mirrors
Hi, Tom,
To unblock the build, I merged the upgrade to master.
https://github.com/apache/spark/pull/25665
Thanks!
Xiao
On Tue, Sep 3, 2019 at 10:58 AM Tom Graves wrote:
It looks like maven 3.6.1 was removed from the repo - see SPARK-28960. It
looks like they pushed 3.6.2, but I don't see any release notes on the maven
page for 3.6.2.
Seems like we had this happen before, can't remember if it was maven or
something else, anyone remember or know i
Thanks! Yuming and Gengliang are working on this.
On Thu, May 30, 2019 at 8:21 AM Sean Owen wrote:
I might need some help figuring this out. The master Maven build has
been failing for almost a week, and I'm having trouble diagnosing why.
Of course, the PR builder has been fine.
First one seems to be:
https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/
This might be useful to do.
BTW, based on my experience with different build systems in the past few years
(extensively SBT/Maven/Bazel, and to a lesser extent Gradle/Cargo), I think the
longer-term solution is to move to Bazel. It is so much easier to understand
and use, and also much more
Sounds interesting; would it be able to handle R and Python modules built
by this project ? The home grown solution here does I think and that is
helpful.
On Sat, Jan 26, 2019, 6:57 AM vaclavkosar wrote:
I think it would be a good idea to use the gitflow-incremental-builder maven
plugin for Spark builds. It saves resources by building only the modules
that are impacted by changes relative to the git master branch.
For example, if there is only a
change introduced
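For context, gitflow-incremental-builder is wired in as a Maven core extension; a sketch of .mvn/extensions.xml, with coordinates and version as assumptions to be checked against the project's README:
```
<extensions>
  <extension>
    <!-- assumed coordinates/version; verify against the gitflow-incremental-builder README -->
    <groupId>com.vackosar.gitflowincrementalbuilder</groupId>
    <artifactId>gitflow-incremental-builder</artifactId>
    <version>3.8</version>
  </extension>
</extensions>
```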
Hello,
is there any way to use my local custom Spark build as a dependency while
I am using maven to compile my applications?
Thanks for your reply,
--Iacovos
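One common approach, sketched here rather than taken from this thread: install the custom build into the local repository with ./build/mvn -DskipTests install, then depend on whatever version that build declares, e.g.:
```
<dependency>
  <groupId>org.apache.spark</groupId>
  <!-- hypothetical artifact and version; use what your local build publishes -->
  <artifactId>spark-core_2.12</artifactId>
  <version>3.0.0-SNAPSHOT</version>
</dependency>
```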
A lot of this was possible in maven/gradle but I did not want to go
through the hackery required to get it working.
On Mon, 5 Mar 2018 at 09:49 Sean Owen wrote:
Spark uses Maven as the primary build, but SBT works as well. It reads the
Maven build to some extent.
Zinc incremental compilation works with Maven (with the Scala plugin for
Maven).
Myself, I prefer Maven, for some of the reasons it is the main build in
Spark: declarative builds end up being a
I think most of the scala development in Spark happens with sbt - in the open
source world.
However, you can do it with Gradle and Maven as well. It depends on your
organization etc. and what your standard is.
Some things might be more cumbersome to reach in non-sbt scala scenarios, but
this is
at scala.reflect.internal.MissingRequirementError$.notFound (MissingRequirementError.scala:18)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass (Mirrors.scala:53)

is it perhaps a compatibility issue? Versions I use are as follows:

➜ spark git:(master) ✗ ./build/mvn --version
Using `mvn` from path:
/Users/tdudek/Programming/spark/build/apache-maven-3.3.9/bin/mvn
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5;
2015-11-10T17:41:47+01:00)
Maven home: /Users/tdudek/Programming/spark/build/apache-maven-3.3.9
Java version: 1.8.0_152, vendor: Oracle Corporation
Java home
I think the reason you're seeing this (and it then disappearing in Sean's
case) is likely that there was a change in another module that required a
recompile of a module dependency.
Maven doesn't do this automatically by default. So it eventually goes away
when you do a full build eithe
I saw this too yesterday but not today. It may have been fixed by some
recent commits.
On Mon, Feb 20, 2017 at 6:52 PM ron8hu wrote:
I am using Intellij IDEA 15.0.6. I used to use Maven to compile the Spark
Catalyst project inside Intellij without a problem.
A couple of days ago, I fetched the latest Spark code from its master
repository. There was a change in CreateJacksonParser.scala. So I used
Maven to compile the Catalyst project
Was there any error prior to 'LifecycleExecutionException' ?
On Fri, Sep 30, 2016 at 2:43 PM, satyajit vegesna <satyajit.apas...@gmail.com> wrote:
sifier, don't you think?
For now I've overridden them myself using the dependency versions defined in the
pom.xml of spark.
So it's not a blocker issue; it may be useful to document it, but a blog post
would be sufficient I think.
The problem here is that it's not directly someth
Hi ALL,
I am trying to compile code using maven, which was working with spark
1.6.2, but when I try with spark 2.0.0 I get the below error:
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default) on
project
No, I think that's what dependencyManagement (or equivalent) is definitely for.
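For illustration, managing the hadoop version up in an application pom looks roughly like this; the 2.7.3 version is an arbitrary example:
```
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <!-- example: force the desired hadoop version on the transitive dependency -->
      <version>2.7.3</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```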
On Thu, Sep 29, 2016 at 5:37 AM, Olivier Girardot
wrote:
> I know that the code itself would not be the same, but it would be useful to
> at least have the pom/build.sbt transitive dependencies different when
> fetching
be binary-compatible with anything 2.2+. So you merely manage the dependency
versions up to the desired version in your pom.
Hi, when we fetch Spark 2.0.0 as a maven dependency we automatically end up
with hadoop 2.2 as a transitive dependency. I know multiple profiles are used to
generate the different tar.gz bundles that we can download. Is there by any
chance publications of Spark 2.0.0 with different classifier
Thank you all for the quick fix! :D
Dongjoon.
On Tuesday, August 30, 2016, Michael Gummelt wrote:
https://github.com/apache/spark/pull/14885
Thanks
On Tue, Aug 30, 2016 at 11:36 AM, Marcelo Vanzin wrote:
YARN is currently
Ah, I helped miss that. We don't enable -Pyarn for YARN because it's
already always set? I wonder if it makes sense to make that optional
in order to speed up builds, or, maybe I'm missing a reason it's
always essential.
I think it's not setting -Pmesos indeed because no Mesos code was
changed but
is that it uses the pluggable interface for TaskScheduler and SchedulerBackend (as
introduced by YARN). Think Standalone should follow the steps. WDYT?
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
This is great!
On Fri, Aug 26, 2016 at 1:20 PM, Michael Gummelt wrote:
Hello devs,
Much like YARN, Mesos has been refactored into a Maven module. So when
building, you must add "-Pmesos" to enable Mesos support.
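(I.e., something like ./build/mvn -Pmesos -DskipTests package.)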
The pre-built distributions from Apache will continue to enable Mesos.
PR: https://github.com/apache/spark/pull/14637
Cheers
--
Michael
Thanks Sean. That reflects my sentiments so well!
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
On Mon, Aug 15, 2016 at 1:08 AM, Sean Owen wrote:
> I believe Ch
As well as the legal issue 'nightly builds haven't been through the strict
review and license check process for ASF releases', and the engineering issue
'release off a nightly and your users will hate you', there's an ASF community
one: ASF projects want to build a dev community as well as a us
I believe Chris was being a bit facetious.
The ASF guidance is right, that it's important people don't consume
non-blessed snapshot builds like other releases. The intended
audience is developers and so the easiest default policy is to only
advertise the snapshots where only developers are like
Luciano,
afaik the spark-package-tool also makes it easy to upload packages to the
spark-packages website. You are of course free to include any maven
coordinate in the --packages parameter.
--jakob
On Fri, Jul 15, 2016 at 1:42 PM, Ismaël Mejía wrote:
> Thanks for the info Burak, I will check
Jacek Laskowski wrote:
> +1000
> Thanks Ismael for bringing this up! I meant to have sent it earlier too
> since I've been struggling with an sbt-based Scala project for a Spark
> package myself this week and haven't yet found out how to do local
> publishing.
> If such a guide existed for Maven I could use it for sbt easily too :-)
> Ping me Ismael if you don't hear back from the group so I feel invited for
> digging into the plugin's sources.
> Best,
> Jacek
Hi Ismael and Jacek,
If you use Maven for building your applications, you may use the
spark-package command line tool (
https://github.com/databricks/spark-package-cmd-tool) to perform packaging.
It requires you to build your jar using maven first, and then does all the
extra magic that Spark
Hello, I would like to know if there is an easy way to package a new
spark-package
with maven, I just found this repo, but I am not an sbt user.
https://github.com/databricks/sbt-spark-package
One more question: is there a formal specification or documentation of what
you need to include in a
As far as I know the process is just to copy docs/_site from the build
to the appropriate location in the SVN repo (i.e.
site/docs/2.0.0-preview).
Thanks
Shivaram
On Tue, Jun 7, 2016 at 8:14 AM, Sean Owen wrote:
Thanks Sean, you were right, hard refresh made it show up.
Seems like we should at least link to the preview docs from
http://spark.apache.org/documentation.html.
Tom
On Tuesday, June 7, 2016 10:04 AM, Sean Owen wrote:
As a stop-gap, I can edit that page to have a small section about
preview releases and point to the nightly docs.
Not sure who has the power to push 2.0.0-preview to site/docs, but, if
that's done then we can symlink "preview" in that dir to it and be
done, and update this section about preview do
It's there (refresh maybe?). See the end of the downloads dropdown.
For the moment you can see the docs in the nightly docs build:
https://home.apache.org/~pwendell/spark-nightly/spark-branch-2.0-docs/latest/
I don't know what the best way is to put this into the main site.
Under a /preview root?
be as easy to
use/try out as any other spark release.
Tom
On Monday, June 6, 2016 3:00 PM, Imran Rashid wrote:
I've been a bit on the fence on this, but I agree that Luciano makes a
compelling case for why we really should publish things to maven
central. Sure, we slightly increase the risk somebody refers to the preview
release too late, but really that is their own fault.
And I also agree
On Mon, Jun 6, 2016 at 12:05 PM, Reynold Xin wrote:
Thank You !!!
The bahir one was a good argument actually. I just clicked the button to
push it into Maven central.
On Mon, Jun 6, 2016 at 12:00 PM, Mark Hamstra
wrote:
Fine. I don't feel strongly enough about it to continue to argue against
putting the artifacts on Maven Central.
On Mon, Jun 6, 2016 at 11:48 AM, Sean Owen wrote:
Artifacts can't be removed from Maven in any normal circumstance, but,
it's no problem.
The argument that people might keep using it goes for any older
release. Why would anyone use 1.6.0 when 1.6.1 exists? Yet we keep
1.6.0 just for the record and to not break builds. It may be that