Re: [DISCUSS] Build error message guideline

2021-04-15 Thread Karen
I've created a PR to add the error message guidelines to the Spark contributing guide. Would appreciate some eyes on it! https://github.com/apache/spark-website/pull/332

Re: [DISCUSS] Build error message guideline

2021-04-14 Thread Yuming Wang
+1 LGTM.

Re: [DISCUSS] Build error message guideline

2021-04-14 Thread Karen
That makes sense to me - given that an assert failure throws an AssertionError, I would say that the same guidelines should apply for asserts.
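As a hypothetical sketch of what that could look like (the function and message here are invented for illustration, and shown in Python rather than Spark's Scala), an assert can carry the same kind of actionable message an exception would:

```python
def select_columns(requested, available):
    """Check a requested column count against what the data file provides."""
    # The assert message states what went wrong and what to check, so the
    # resulting AssertionError is as actionable as a thrown exception.
    assert requested <= available, (
        f"Requested {requested} columns but only {available} are available; "
        "verify that the requested schema matches the data file."
    )
    return requested
```

With Python's default settings (asserts enabled), a failing call surfaces the full explanatory message rather than a bare AssertionError.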

Re: [DISCUSS] Build error message guideline

2021-04-13 Thread Yuming Wang
Do we have plans to apply these guidelines to assert? For example: https://github.com/apache/spark/blob/5b478416f8e3fe2f015af1b6c8faa7fe9f15c05d/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcUtils.scala#L136-L138 https://github.com/apache/spark/blob/053dd858d38e6107bc71

Re: [DISCUSS] Build error message guideline

2021-04-13 Thread Hyukjin Kwon
I would just go ahead and create a PR for that. Nothing written there looks unreasonable. But it would probably be best to wait a couple of days to make sure people are happy with it.

Re: [DISCUSS] Build error message guideline

2021-04-13 Thread Karen
If the proposed guidelines look good, it would be useful to share these guidelines with the wider community. A good landing page for contributors could be https://spark.apache.org/contributing.html. What do you think? Thank you, Karen Feng

Re: [DISCUSS] Build error message guideline

2021-04-07 Thread Hyukjin Kwon
LGTM (I took a look, and had some offline discussions with some corrections before it came out).

[DISCUSS] Build error message guideline

2021-04-07 Thread Karen
Hi all, As discussed in SPIP: Standardize Exception Messages in Spark ( https://docs.google.com/document/d/1XGj1o3xAFh8BA7RCn3DtwIPC6--hIFOaNUNSlpaOIZs/edit?usp=sharing ), improving error message quality in Apache Spark involves establishing an error message guideline for developers.
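The guideline details live in the linked SPIP, but the core idea can be sketched as follows (a hypothetical example invented here, not code from Spark): an error message should say what went wrong, which setting or input caused it, and how the user can fix it.

```python
def set_shuffle_partitions(n):
    """Validate a partition count, raising an actionable error on bad input."""
    if not isinstance(n, int) or n <= 0:
        # Vague:      raise ValueError("bad value")
        # Actionable: name the offending value, the setting it belongs to,
        # and the fix.
        raise ValueError(
            f"Invalid number of shuffle partitions: {n!r}. "
            "spark.sql.shuffle.partitions must be a positive integer; "
            "set it to a value greater than 0."
        )
    return n
```

The second message tells the user exactly which configuration to change and what a valid value looks like, which is the quality bar the SPIP argues for.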

Re: Build error: python/lib/pyspark.zip is not a ZIP archive

2020-01-10 Thread Jeff Evans
Actually, there is a really trivial fix for that (an existing file not being deleted when packaging). Opened SPARK-30489 for it.

Re: Build error: python/lib/pyspark.zip is not a ZIP archive

2020-01-10 Thread Jeff Evans
Thanks for the tip. Fixed by simply removing python/lib/pyspark.zip (since it's apparently generated), and rebuilding. I guess clean does not remove it.

Re: Build error: python/lib/pyspark.zip is not a ZIP archive

2020-01-10 Thread Sean Owen
Sounds like you might have some corrupted file locally. I don't see any of the automated test builders failing. Nuke your local assembly build and try again?

Build error: python/lib/pyspark.zip is not a ZIP archive

2020-01-10 Thread Jeff Evans
Greetings, I'm getting an error when building, on latest master (2bd873181 as of this writing). The full build command I'm running is: ./build/mvn -DskipTests clean package
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.8:run (create-tmp-dir) on project spark-assembly_
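As the replies above note, the cause was a stale python/lib/pyspark.zip rather than the build itself. A rough sketch (the helper name is invented here) of a check that would flag such a file before Maven trips over it:

```python
import os
import zipfile

def is_valid_zip(path):
    """Return True if path exists and is a readable ZIP archive."""
    return os.path.isfile(path) and zipfile.is_zipfile(path)

# A file whose contents are not ZIP data is exactly what produces the
# "is not a ZIP archive" class of failure during packaging; deleting
# the stale file and rebuilding lets the build regenerate it.
```

Since `mvn clean` apparently does not remove the generated archive, a check like this (or simply deleting the file) is a reasonable first step when this error appears.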

Re: Build error

2016-07-22 Thread Jacek Laskowski
Hi, Fixed now. git pull and start over. https://github.com/apache/spark/commit/e1bd70f44b11141b000821e9754efeabc14f24a5 Regards, Jacek Laskowski

Build error

2016-07-22 Thread Mikael Ståldal
I get this error when trying to build from Git master branch: [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) on project spark-catalyst_2.11: MavenReportException: Error while creating archive: wrap: Process exited with an error: 1 (Exit value:

Re: build error: code too big: specialStateTransition(int, IntStream)

2016-01-28 Thread Herman van Hövell tot Westerflier
Hi, I have only encountered 'code too large' errors when changing grammars. I am using SBT/IDEA, no Eclipse. The size of an ANTLR Parser/Lexer depends on the rules inside the source grammar and the rules it depends on. So we should take a look at IdentifiersParser.g/ExpressionParser.g.

Re: build error: code too big: specialStateTransition(int, IntStream)

2016-01-28 Thread Iulian Dragoș
Thanks for the pointer. It seems to be really a pathological case, since the file that's in error is part of the splinter file (the smaller one, IdentifiersParser). I'll see if I can work around it by splitting it some more. iulian

Re: build error: code too big: specialStateTransition(int, IntStream)

2016-01-28 Thread Ted Yu
After this change: [SPARK-12681] [SQL] split IdentifiersParser.g into two files, the biggest file under sql/catalyst/src/main/antlr3/org/apache/spark/sql/catalyst/parser is SparkSqlParser.g. Maybe split SparkSqlParser.g up as well?

build error: code too big: specialStateTransition(int, IntStream)

2016-01-28 Thread Iulian Dragoș
Hi, Has anyone seen this error? The code of method specialStateTransition(int, IntStream) is exceeding the 65535 bytes limit (SparkSqlParser_IdentifiersParser.java:39907). The error is in ANTLR generated files and it's (according to Stack Overflow) due to state explosion in the parser (or lexer).

Re: latest Spark build error

2015-12-25 Thread Allen Zhang

Re: latest Spark build error

2015-12-25 Thread salexln
One more question: is there a way to build only MLlib from the command line? -- View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/latest-Spark-build-error-tp15782p15794.html Sent from the Apache Spark Developers List mailing list archive at Nabble.com

Re: latest Spark build error

2015-12-24 Thread salexln
Updating the Maven version to 3.3.9 solved the issue. Thanks everyone!

Re: latest Spark build error

2015-12-24 Thread Kazuaki Ishizaki
This is because building Spark requires Maven 3.3.3 or later. http://spark.apache.org/docs/latest/building-spark.html Regards, Kazuaki Ishizaki
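The requirement can also be checked mechanically. A small sketch (the helper is invented here; the 3.3.3 minimum comes from the build docs linked above):

```python
def meets_minimum(version, minimum=(3, 3, 3)):
    """Compare a dotted version string such as '3.3.9' against a minimum."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    # Tuple comparison is lexicographic, which matches version ordering
    # for plain numeric components.
    return parts >= minimum
```

With this, Maven 3.2.x would be rejected while 3.3.9 (the version that resolved the build failure in this thread) passes.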

latest Spark build error

2015-12-24 Thread salexln
Hi all, I'm getting a build error when trying to build a clean version of the latest Spark. I did the following:
1) git clone https://github.com/apache/spark.git
2) build/mvn -DskipTests clean package
But I get the following error: Spark Project Parent POM .. FAILURE [2

Re: 1.3 Build Error with Scala-2.11

2015-04-07 Thread Marty Bower
> could not be resolved: org.apache.spark:spark-network-common_2.10:jar:1.3.2-SNAPSHOT, org.apache.spark:spark-network-shuffle_2.10:jar:1.3.2-SNAPSHOT: Could not find artifact org.apache.spark:spark-network-common_2.10:jar:1.3.2-SNAPSHOT in

Re: 1.3 Build Error with Scala-2.11

2015-04-07 Thread Imran Rashid
> org.apache.spark:spark-network-common_2.10:jar:1.3.2-SNAPSHOT in apache.snapshots (http://repository.apache.org/snapshots)

Re: 1.3 Build Error with Scala-2.11

2015-04-06 Thread Patrick Wendell
> org.apache.spark:spark-network-common_2.10:jar:1.3.2-SNAPSHOT, org.apache.spark:spark-network-shuffle_2.10:jar:1.3.2-SNAPSHOT: Could not find artifact org.apache.spark:spark-network-common_2.10:jar:1.3.2-SNAPSHOT in apache.snapshots (http://repository.apache.org/snapshots)

Re: 1.3 Build Error with Scala-2.11

2015-04-06 Thread mjhb

Re: 1.3 Build Error with Scala-2.11

2015-04-06 Thread Patrick Wendell
> then re-cloning from github and switching to the 1.2 or 1.3 branch. Does anything persist outside of the spark directory? Are you able to build either 1.2 or 1.3 w/ Scala-2.11?

Re: 1.3 Build Error with Scala-2.11

2015-04-06 Thread mjhb

Re: 1.3 Build Error with Scala-2.11

2015-04-06 Thread Patrick Wendell
> could not be resolved: org.apache.spark:spark-network-common_2.10:jar:1.2.3-SNAPSHOT, org.apache.spark:spark-network-shuffle_2.10:jar:1.2.3-SNAPSHOT: Failure to find org.apache.spark:spark-ne

Re: 1.3 Build Error with Scala-2.11

2015-04-06 Thread Patrick Wendell
> find org.apache.spark:spark-network-common_2.10:jar:1.2.3-SNAPSHOT in http://repository.apache.org/snapshots was cached in the local repository, resolution will not be reattempted until the update interval of apache.snapshots has elapsed

Re: 1.3 Build Error with Scala-2.11

2015-04-06 Thread Marty Bower
> cached in the local repository, resolution will not be reattempted until the update interval of apache.snapshots has elapsed or updates are forced

Re: 1.3 Build Error with Scala-2.11

2015-04-06 Thread Patrick Wendell
> not be reattempted until the update interval of apache.snapshots has elapsed or updates are forced

Re: 1.3 Build Error with Scala-2.11

2015-04-06 Thread mjhb
not be reattempted until the update interval of apache.snapshots has elapsed or updates are forced

1.3 Build Error with Scala-2.11

2015-04-06 Thread mjhb
204) at org.sonatype.aether.impl.internal.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:427) ... 26 more

Re: Spark 1.2.0 build error

2014-12-29 Thread Naveen Madhire
I am getting a "The command is too long" error. Is there anything that needs to be done? For the time being, I followed the "sbt" way of building Spark in IntelliJ.

Re: Spark 1.2.0 build error

2014-12-29 Thread Sean Owen
It means a test failed but you have not shown the test failure. This would have been logged earlier. You would need to say how you ran tests too. The tests for 1.2.0 pass for me on several common permutations.

Spark 1.2.0 build error

2014-12-28 Thread Naveen Madhire
Hi, I am following the below link for building Spark 1.2.0: https://spark.apache.org/docs/1.2.0/building-spark.html I am getting the below error during the Maven build. I am using the IntelliJ IDE. The build is failing in the scalatest plugin. [INFO] Reactor Summary: [INFO] [INFO] Spark Project Parent

[GitHub] spark pull request: SPARK-1125: The maven build error for Spark Ex...

2014-03-10 Thread witgo
Github user witgo commented on the pull request: https://github.com/apache/spark/pull/25#issuecomment-37162150 The same compile error also occurs now when not using a proxy.

[GitHub] spark pull request: SPARK-1125: The maven build error for Spark Ex...

2014-03-10 Thread srowen
Github user srowen commented on the pull request: https://github.com/apache/spark/pull/25#issuecomment-37161832 What is new there? You say your environment requires proxy settings and you successfully identified them. Here you fail to set them.