I built Spark with build/mvn in the terminal, and it downloaded Maven at
the right version automatically. After switching IntelliJ to use that
Maven version, everything worked perfectly again.

Thanks for the info. I will enhance the documentation a bit later. I hope
it helps the community.
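For context, the enforcer failure quoted further down in this thread boils
down to a simple version comparison. Here is a minimal shell sketch of that
check; the version numbers come from the log messages in the thread, but the
script itself is only illustrative, not the enforcer plugin's actual
implementation:

```shell
# Illustrative re-creation of the Maven version check that failed below:
# "Detected Maven Version: 3.3.9 is not in the allowed range 3.6.0."
required=3.6.0
detected=3.3.9   # IntelliJ's bundled Maven (from the thread's log)

# sort -V orders version strings numerically; if the smaller of the two
# is the required version, the detected version satisfies the minimum.
lowest=$(printf '%s\n%s\n' "$required" "$detected" | sort -V | head -n1)
if [ "$lowest" = "$required" ]; then
  echo "OK: Maven $detected is at least $required"
else
  echo "FAIL: Maven $detected is not in the allowed range $required"
fi
```

Since 3.3.9 sorts below 3.6.0, the check fails, which is exactly why pointing
IntelliJ at a locally installed Maven 3.6.0 fixes the build.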

On Mon, 15 Apr 2019 at 1:57 AM, Sean Owen <sro...@gmail.com> wrote:

> That's right, Spark needs Maven 3.6.0. Just install it locally and
> then configure IntelliJ to use the local Maven, not built-in 3.3.9.
> The docs you're looking at are actually in
> github.com/apache/spark-website as they aren't version-specific and
> tied to a release.
>
> On Sun, Apr 14, 2019 at 12:00 PM William Wong <william1...@gmail.com>
> wrote:
> >
> > Hi Sean,
> >
> > I would like to open a PR to update the documentation. However, I
> cannot find the file for 'http://spark.apache.org/developer-tools.html'.
> It seems that this page is not part of the documentation (under the
> docs folder)...
> >
> > Thanks and regards,
> > William
> >
> > On Sun, Apr 14, 2019 at 11:58 PM William Wong <william1...@gmail.com>
> wrote:
> >>
> >> Hi Sean,
> >>
> >> I tried the button, but the antlr4 sources were not generated as
> expected. I checked the IntelliJ log and found error messages like:
> >>
> >> 2019-04-14 16:05:24,796 [ 314609]   INFO -
> #org.jetbrains.idea.maven - [WARNING] Rule 0:
> org.apache.maven.plugins.enforcer.RequireMavenVersion failed with message:
> >> Detected Maven Version: 3.3.9 is not in the allowed range 3.6.0.
> >> 2019-04-14 16:05:24,813 [ 314626]   INFO -
> #org.jetbrains.idea.maven -
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
> goal org.apache.maven.plugins:maven-enforcer-plugin:3.0.0-M2:enforce
> (enforce-versions) on project spark-parent_2.12: Some Enforcer rules have
> failed. Look above for specific messages explaining why the rule failed.
> >> java.lang.RuntimeException:
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
> goal org.apache.maven.plugins:maven-enforcer-plugin:3.0.0-M2:enforce
> (enforce-versions) on project spark-parent_2.12: Some Enforcer rules have
> failed. Look above for specific messages explaining why the rule failed.
> >>
> >>
> >> It seems that the source generation process failed silently due to an
> incorrect Maven version. My IntelliJ bundles Maven 3.3.9, while Spark's
> master branch requires Maven 3.6.0. To be honest, failing an action
> silently seems like an IntelliJ bug, but a note guiding Spark developers
> to set the Maven version in IntelliJ would be helpful. I just created a
> JIRA (https://issues.apache.org/jira/browse/SPARK-27458) for this.
> >>
> >> Thanks and regards,
> >> William
> >>
> >> On Sun, 14 Apr 2019 at 8:57 PM, Sean Owen <sro...@gmail.com> wrote:
> >>>
> >>> For IntelliJ, in the Maven pane, there's a button to generate all
> >>> sources and resources that the build creates. That's the easier
> >>> option. You can open a PR to add a note about it along with other docs
> >>> for IntelliJ users.
> >>>
> >>> On Sun, Apr 14, 2019 at 4:24 AM William Wong <william1...@gmail.com>
> wrote:
> >>> >
> >>> > Dear all,
> >>> >
> >>> > I tried to follow the guide at
> 'http://spark.apache.org/developer-tools.html' to set up an IntelliJ
> project for Spark. However, the project failed to build due to missing
> classes that are generated via ANTLR in the sql/catalyst project.
> >>> >
> >>> > I would like to enhance the document to hint to other newcomers to
> run 'build/mvn antlr4 -f sql/catalyst/pom.xml' when they hit missing
> ANTLR4 class files. However, Spark's project structure is very new to me.
> I hope I did not miss any guideline in Spark's documentation about this
> issue; if so, please let me know. Thanks in advance.
> >>> >
> >>> > Regards,
> >>> > William
> >>> >
> >>> >
>