On Mon, Jul 17, 2023 at 4:14 PM Varun Shah
wrote:
> Resending this message with a proper Subject line
>
> Hi Spark Community,
>
> I am trying to set up my forked apache/spark project locally for my 1st
> Open Source Contribution, by building and creating a package as mentioned
The error is right there. Just read the output more carefully.
On Wed, Feb 24, 2016 at 11:37 AM, Minudika Malshan
wrote:
> [INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @
> spark-parent_2.11 ---
> [WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireMavenVersion
> failed
Here is the full stack trace.
@Yin: yeah, it seems like a problem with the maven version. I am going to
update maven.
@Marcelo: Yes, I couldn't decide what was wrong at first :)
Thanks for your help!
[INFO] Scanning for projects...
[INFO]
-
I encountered similar warning recently.
Please check the version of maven you're using: it should be 3.3.9
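For what it's worth, the enforcer check boils down to a version comparison. A minimal sketch of why an older maven trips the RequireMavenVersion rule (the 3.3.9 minimum is from this thread; the 3.0.5 install and plain-numeric version format are assumptions for illustration):

```python
def parse_version(v: str) -> tuple:
    """Split a dotted version string into an integer tuple for comparison.
    Handles plain numeric versions only, not qualifiers like -SNAPSHOT."""
    return tuple(int(part) for part in v.split("."))

required = "3.3.9"   # minimum enforced by the RequireMavenVersion rule
installed = "3.0.5"  # hypothetical older system maven
# Tuple comparison orders versions component-wise, like the enforcer does.
print(parse_version(installed) >= parse_version(required))  # False -> rule fails
```

Running `mvn -version` (or just using the bundled `build/mvn`, which downloads a known-good maven) avoids the problem.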
On Wed, Feb 24, 2016 at 11:29 AM, Marcelo Vanzin
wrote:
Well, did you do what the message instructed you to do and looked
above the message you copied for more specific messages for why the
build failed?
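The habit Marcelo describes (scan upward for the specific [WARNING]/[ERROR] lines rather than stopping at the final failure) can even be mechanized. A rough sketch; the sample log text is made up to resemble the output quoted in this thread:

```python
# Pull the [WARNING]/[ERROR] lines out of a maven log so the specific
# enforcer message is not lost in the [INFO] noise. Sample log is invented.
sample_log = """\
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-parent_2.11 ---
[WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireMavenVersion failed
[INFO] BUILD FAILURE
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.4.1:enforce
"""

def problems(log: str) -> list:
    """Return only the warning/error lines, in original order."""
    return [line for line in log.splitlines()
            if line.startswith(("[WARNING]", "[ERROR]"))]

for line in problems(sample_log):
    print(line)
```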
On Wed, Feb 24, 2016 at 11:28 AM, Minudika Malshan
wrote:
Hi,
I am trying to build from spark source code which was cloned from
https://github.com/apache/spark.git.
But it fails with following error.
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-enforcer-plugin:1.4.1:enforce
(enforce-versions) on project spark-parent_2.11: Some Enforcer rules
have failed. Look above for specific messages explaining why the rule
failed.
Worked for me. Thanks!
Pozdrawiam,
Jacek
--
Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
On Sat, Nov 7, 2015 at 1:56 PM, Ted Yu wrote:
Created a PR for the compilation error:
https://github.com/apache/spark/pull/9538
Cheers
On Sat, Nov 7, 2015 at 4:41 AM, Jacek Laskowski wrote:
Hi,
Checked out the latest sources and the build failed:
[error]
/Users/jacek/dev/oss/spark/core/src/main/scala/org/apache/spark/storage/RDDInfo.scala:25:
in class RDDInfo, multiple overloaded alternatives of constructor
RDDInfo define default arguments.
[error] class RDDInfo(
[error] ^
On Fri, Nov 6, 2015 at 2:21 AM, Steve Loughran wrote:
> Maven's closest-first policy has a different flaw, namely that it's not always
> obvious why a guava 14.0 that is two hops of transitiveness should take
> priority over a 16.0 version three hops away. Especially when that 14.0
> version sho
Since maven is the preferred build vehicle, an ivy-style dependency policy
would produce surprising results compared to today's behavior.
I would suggest staying with the current dependency policy.
My two cents.
On Fri, Nov 6, 2015 at 6:25 AM, Koert Kuipers wrote:
if there is no strong preference for one dependencies policy over another,
but consistency between the 2 systems is desired, then i believe maven can
be made to behave like ivy pretty easily with a setting in the pom
On Fri, Nov 6, 2015 at 5:21 AM, Steve Loughran
wrote:
> On 5 Nov 2015, at 20:07, Marcelo Vanzin wrote:
>
> Man that command is slow. Anyway, it seems guava 16 is being brought
> transitively by curator 2.6.0 which should have been overridden by the
> explicit dependency on curator 2.4.0, but apparently, as Steve
> mentioned, sbt/ivy decided to brea
> One other thing, I was able to build fine with the above command up until
> recently. I think I have started to have problems after SPARK-11073, where
> the HashCodes import was added.
>
> Regards,
> Dilip Biswal
> Tel: 408-463-4980
> dbis...@us.ibm.com
Date: 11/05/2015 10:46 AM
Subject: Re: Master build fails ?
Dilip:
Can you give the command you used ?
Which release were you building ?
What OS did you build on ?
Cheers
On Thu, Nov 5, 2015 at 10:21 AM, Dilip Biswal wrote:
Hello,
I am getting the same build error about not being ab
SBT/ivy pulls in the most recent version of a JAR, whereas maven pulls in
the "closest", where closest is the lowest distance/depth from the root.
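The nearest-vs-newest contrast can be sketched with a toy resolver. The two policies below are simplified models of ivy's latest-wins and maven's nearest-wins behavior; the versions and depths mirror the guava example from this thread, but the graph itself is invented:

```python
# Toy model of the two conflict-resolution policies discussed here.
# Each candidate is (version, depth-from-root); graph is illustrative only.
candidates = [
    ("14.0.1", 2),  # e.g. guava declared two hops from the root
    ("16.0.1", 3),  # e.g. guava via curator, three hops away
]

def version_key(v: str) -> tuple:
    # Compare dotted versions component-wise as integer tuples.
    return tuple(int(p) for p in v.split("."))

# sbt/ivy default: the newest version wins, regardless of depth.
newest = max(candidates, key=lambda c: version_key(c[0]))[0]

# maven: the "nearest" declaration wins, i.e. lowest depth from the root.
nearest = min(candidates, key=lambda c: c[1])[0]

print(newest, nearest)  # 16.0.1 14.0.1
```

So the same dependency graph yields guava 16.0.1 under sbt but 14.0.1 under maven, which is exactly the kind of split-classpath surprise reported above.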
> On 5 Nov 2015, at 18:53, Marcelo Vanzin wrote:
Seems like it's an sbt issue, not a maven one, so "dependency:tree"
might not help. Still, the command line would be helpful. I use sbt
and don't see this.
On Thu, Nov 5, 2015 at 10:44 AM, Marcelo Vanzin wrote:
> Is there a solution to this ?
>
> Regards,
> Dilip Biswal
Hi Jeff,
On Tue, Nov 3, 2015 at 2:50 AM, Jeff Zhang wrote:
> Looks like it's due to guava version conflicts, I see both guava 14.0.1 and
> 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
What command line are you using to build? Can you run "mvn
dependency:tree" (with all the othe
Date: 11/03/2015 07:20 AM
Subject: Re: Master build fails ?
Hi Ted,
thanks for the update. The build with sbt is in progress on my box.
Regards
JB
On 11/03/2015 03:31 PM, Ted Yu wrote:
Hi,
It appears it's time to switch to my lovely sbt then!
Pozdrawiam,
Jacek
Interesting, Sbt builds were not all failing:
https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/
FYI
On Tue, Nov 3, 2015 at 5:58 AM, Jean-Baptiste Onofré
wrote:
Hi Jacek,
it works fine with mvn: the problem is with sbt.
I suspect a different reactor order in sbt compared to mvn.
Regards
JB
On 11/03/2015 02:44 PM, Jacek Laskowski wrote:
Hi,
Just built the sources using the following command and it worked fine.
➜ spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
-Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
-DskipTests clean install
...
[INFO] --
Thanks for the update, I used mvn to build but without hive profile.
Let me try with mvn with the same options as you and sbt also.
I keep you posted.
Regards
JB
On 11/03/2015 12:55 PM, Jeff Zhang wrote:
Yeah, I also met this problem, just curious why jenkins test is OK.
On Tue, Nov 3, 2015 at 7:55 PM, Jeff Zhang wrote:
I found it is due to SPARK-11073.
Here's the command I used to build
build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
-Psparkr
On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré
wrote:
Hi Jeff,
it works for me (with skipping the tests).
Let me try again, just to be sure.
Regards
JB
On 11/03/2015 11:50 AM, Jeff Zhang wrote:
Looks like it's due to guava version conflicts, I see both guava 14.0.1 and
16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
[error]
/Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
object HashCodes is not a member of package com.google
On my MacBook with 2.6 GHz Intel i7 CPU, I run zinc.
Here is the tail of mvn build output:
[INFO] Spark Project External Flume .. SUCCESS [7.368s]
[INFO] Spark Project External ZeroMQ . SUCCESS [9.153s]
[INFO] Spark Project External MQTT ...
Ah, found it:
https://github.com/apache/spark/blob/master/docs/building-spark.md#building-with-sbt
This version of the docs should be published once 1.2.0 is released.
Nick
On Tue, Nov 4, 2014 at 8:53 PM, Alessandro Baretta
wrote:
Nicholas,
Indeed, I was trying to use sbt to speed up the build. My initial
experiments with the maven process took over 50 minutes, which on a 4-core
2014 MacBook Pro seems obscene. Then again, after the failed attempt with
sbt, mvn clean package took only 13 minutes, leading me to think that most
Zinc, I believe, is something you can install and run to speed up your
Maven builds. It's not required.
I get a bunch of warnings when compiling with Maven, too. Dunno if they are
expected or not, but things work fine from there on.
Many people do indeed use sbt. I don't know where we have docume
Nicholas,
Yes, I saw them, but they refer to maven, and I'm under the impression that
sbt is the preferred way of building spark. Is indeed maven the "right
way"? Anyway, as per your advice I ctrl-d'ed my sbt shell and have ran `mvn
-DskipTests clean package`, which completed successfully. So, ind
I have seen this on sbt sometimes. I usually do an sbt clean and that fixes it.
Thanks,
Hari
On Tue, Nov 4, 2014 at 3:13 PM, Nicholas Chammas
wrote:
FWIW, the "official" build instructions are here:
https://github.com/apache/spark#building-spark
On Tue, Nov 4, 2014 at 5:11 PM, Ted Yu wrote:
I built based on this commit today and the build was successful.
What command did you use ?
Cheers
On Tue, Nov 4, 2014 at 2:08 PM, Alessandro Baretta
wrote:
Fellow Sparkers,
I am new here and still trying to learn to crawl. Please, bear with me.
I just pulled f90ad5d from https://github.com/apache/spark.git and am
running the compile command in the sbt shell. This is the error I'm seeing:
[error]
/home/alex/git/spark/mllib/src/main/scala/org/apache/