I've created a PR to add the error message guidelines to the Spark
contributing guide. Would appreciate some eyes on it!
https://github.com/apache/spark-website/pull/332
On Wed, Apr 14, 2021 at 5:34 PM Yuming Wang wrote:
> +1 LGTM.
+1 LGTM.
On Thu, Apr 15, 2021 at 1:50 AM Karen wrote:
> That makes sense to me - given that an assert failure throws an
> AssertionError, I would say that the same guidelines should apply to
> asserts.
That makes sense to me - given that an assert failure throws an
AssertionError, I would say that the same guidelines should apply to
asserts.
On Tue, Apr 13, 2021 at 7:41 PM Yuming Wang wrote:
> Do we have plans to apply these guidelines to assert? For example:
Do we have plans to apply these guidelines to assert? For example:
https://github.com/apache/spark/blob/5b478416f8e3fe2f015af1b6c8faa7fe9f15c05d/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcUtils.scala#L136-L138
https://github.com/apache/spark/blob/053dd858d38e6107bc71
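For concreteness, here is a minimal sketch of what applying the guidelines to
an assert could look like. The names and the check are hypothetical (this is
not the actual OrcUtils code), but the pattern is the same: state what was
expected, what was found, and what to do about it.

// Hypothetical sketch, not the actual OrcUtils code: a bare assert versus
// one whose AssertionError message follows the proposed guidelines.
object AssertMessageExample {
  def requireMatchingArity(requestedColIds: Seq[Int], fieldNames: Seq[String]): Unit = {
    // Before: a bare assert; the AssertionError carries no context.
    //   assert(requestedColIds.length == fieldNames.length)

    // After: the same check, with an actionable message.
    assert(requestedColIds.length == fieldNames.length,
      s"Expected ${fieldNames.length} requested column ids to match the " +
      s"data schema, but got ${requestedColIds.length}; the requested " +
      "schema must have the same number of fields as the file schema.")
  }

  def main(args: Array[String]): Unit = {
    requireMatchingArity(Seq(0, 1), Seq("a", "b")) // passes silently
    requireMatchingArity(Seq(0), Seq("a", "b"))    // throws AssertionError with the message above
  }
}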
I would just go ahead and create a PR for that. Nothing written there looks
unreasonable.
But it is probably best to wait a couple of days to make sure people
are happy with it.
On Wed, Apr 14, 2021 at 6:38 AM Karen wrote:
If the proposed guidelines look good, it would be useful to share these
guidelines with the wider community. A good landing page for contributors
could be https://spark.apache.org/contributing.html. What do you think?
Thank you,
Karen Feng
On Wed, Apr 7, 2021 at 8:19 PM Hyukjin Kwon wrote:
LGTM (I took a look, and had some offline discussions w/ some corrections
before it came out)
On Thu, Apr 8, 2021 at 5:28 AM Karen wrote:
Hi all,
As discussed in SPIP: Standardize Exception Messages in Spark (
https://docs.google.com/document/d/1XGj1o3xAFh8BA7RCn3DtwIPC6--hIFOaNUNSlpaOIZs/edit?usp=sharing),
improving error message quality in Apache Spark involves establishing an
error message guideline for developers. Error message
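For illustration, a minimal before/after sketch of the kind of rewrite such a
guideline encourages; the function and the message below are hypothetical,
not taken from the SPIP:

// Hypothetical example: the revised message states what was expected,
// what was actually received, and how to fix it.
object ErrorMessageExample {
  def setBatchSize(n: Int): Unit = {
    // Before: throw new IllegalArgumentException("bad batch size")
    if (n <= 0) {
      throw new IllegalArgumentException(
        s"Batch size must be a positive integer, but got $n; " +
        "set it to a value greater than 0.")
    }
    // ... proceed with the validated value ...
  }

  def main(args: Array[String]): Unit = {
    setBatchSize(128) // fine
    setBatchSize(0)   // throws with the actionable message above
  }
}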
Actually, there is a really trivial fix for that (an existing file not
being deleted when packaging). Opened SPARK-30489 for it.
On Fri, Jan 10, 2020 at 3:52 PM Jeff Evans
wrote:
Thanks for the tip. Fixed by simply removing python/lib/pyspark.zip (since
it's apparently generated), and rebuilding. I guess clean does not remove
it.
On Fri, Jan 10, 2020 at 3:50 PM Sean Owen wrote:
Sounds like you might have some corrupted file locally. I don't see
any of the automated test builders failing. Nuke your local assembly
build and try again?
On Fri, Jan 10, 2020 at 3:49 PM Jeff Evans
wrote:
Greetings,
I'm getting an error when building, on latest master (2bd873181 as of this
writing). Full build command I'm running is: ./build/mvn -DskipTests clean
package
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-antrun-plugin:1.8:run (create-tmp-dir) on
project spark-assembly_
Hi,
Fixed now. git pull and start over.
https://github.com/apache/spark/commit/e1bd70f44b11141b000821e9754efeabc14f24a5
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowsk
I get this error when trying to build from Git master branch:
[ERROR] Failed to execute goal
net.alchim31.maven:scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) on
project spark-catalyst_2.11: MavenReportException: Error while creating
archive: wrap: Process exited with an error: 1 (Exit value:
Hi,
I have only encountered 'code too large' errors when changing grammars. I
am using SBT/Idea, no Eclipse.
The size of an ANTLR Parser/Lexer depends on the rules inside the
source grammar and the rules it depends on. So we should take a look at the
IdentifiersParser.g/ExpressionParser.g; t
Thanks for the pointer. It seems to be a really pathological case, since
the file that's in error is part of the split-off file (the smaller one,
IdentifiersParser). I'll see if I can work around it by splitting it some more.
iulian
On Thu, Jan 28, 2016 at 4:43 PM, Ted Yu wrote:
After this change:
[SPARK-12681] [SQL] split IdentifiersParser.g into two files
the biggest file under
sql/catalyst/src/main/antlr3/org/apache/spark/sql/catalyst/parser is
SparkSqlParser.g
Maybe split SparkSqlParser.g up as well?
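If it helps whoever picks this up, here is a rough throwaway helper (not part
of Spark; it assumes the antlr3 source layout above) to rank the grammar
files by size, as a first approximation of what to split next:

// Throwaway helper: list .g grammar files under the parser directory,
// largest first. File size is only a proxy; the generated code size
// ultimately depends on the rules, as noted above.
import java.io.File

object GrammarSizes {
  def main(args: Array[String]): Unit = {
    val dir = new File(
      "sql/catalyst/src/main/antlr3/org/apache/spark/sql/catalyst/parser")
    Option(dir.listFiles()).getOrElse(Array.empty[File])
      .filter(_.getName.endsWith(".g"))
      .sortBy(f => -f.length())
      .foreach(f => println(f"${f.length()}%10d  ${f.getName}"))
  }
}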
On Thu, Jan 28, 2016 at 5:21 AM, Iulian Dragoș
wrote:
Hi,
Has anyone seen this error?
The code of method specialStateTransition(int, IntStream) is exceeding
the 65535 bytes limit (SparkSqlParser_IdentifiersParser.java:39907).
The error is in ANTLR-generated files and it's (according to Stack
Overflow) due to state explosion in the parser (or lexer). Th
One more question:
Is there a way to build only MLlib from the command line?
Updating the Maven version to 3.3.9 solved the issue.
Thanks everyone!
This is because building Spark requires Maven 3.3.3 or later.
http://spark.apache.org/docs/latest/building-spark.html
Regards,
Kazuaki Ishizaki
From: salexln
To: dev@spark.apache.org
Date: 2015/12/25 15:52
Subject: latest Spark build error
Hi all,
I'm getting a build error when trying to build a clean version of the latest
Spark. I did the following:
1) git clone https://github.com/apache/spark.git
2) build/mvn -DskipTests clean package
But I get the following error:
Spark Project Parent POM .. FAILURE [2
> could not be resolved:
> org.apache.spark:spark-network-common_2.10:jar:1.3.2-SNAPSHOT,
> org.apache.spark:spark-network-shuffle_2.10:jar:1.3.2-SNAPSHOT: Could not
> find artifact org.apache.spark:spark-network-common_2.10:jar:1.3.2-SNAPSHOT
> in apache.snapshots (http://repository.apache.org/snapshots)
> when re-cloning from github and switching
> to the 1.2 or 1.3 branch.
>
> Does anything persist outside of the spark directory?
>
> Are you able to build either 1.2 or 1.3 w/ Scala-2.11?
could not be resolved:
org.apache.spark:spark-network-common_2.10:jar:1.2.3-SNAPSHOT,
org.apache.spark:spark-network-shuffle_2.10:jar:1.2.3-SNAPSHOT: Failure
to find org.apache.spark:spark-network-common_2.10:jar:1.2.3-SNAPSHOT in
http://repository.apache.org/snapshots was cached in the local repository,
resolution will not be reattempted until the update interval of
apache.snapshots has elapsed or updates are forced
at org.sonatype.aether.impl.internal.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:427)
... 26 more
I am getting a "The command is too long" error.
Is there anything that needs to be done?
However, for the time being I followed the "sbt" way of building Spark in
IntelliJ.
On Mon, Dec 29, 2014 at 3:52 AM, Sean Owen wrote:
It means a test failed but you have not shown the test failure. This would
have been logged earlier. You would need to say how you ran tests too. The
tests for 1.2.0 pass for me on several common permutations.
On Dec 29, 2014 3:22 AM, "Naveen Madhire" wrote:
Hi,
I am following the below link for building Spark 1.2.0:
https://spark.apache.org/docs/1.2.0/building-spark.html
I am getting the below error during the Maven build. I am using the IntelliJ
IDE.
The build is failing in the scalatest plugin.
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent
Github user witgo commented on the pull request:
https://github.com/apache/spark/pull/25#issuecomment-37162150
Not using a proxy now also gives the same compiler error.
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/25#issuecomment-37161832
What is new there? You say your environment requires proxy settings and
you successfully identified them. Here you fail to set them.