Unfortunately, it's been the same for me recently. On top of that, I also
hit a Metaspace OOM.
I ended up with MAVEN_OPTS like the following:

-Xms12g -Xmx12g -Xss128M -XX:MaxMetaspaceSize=4g ...
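For anyone who wants to try the same thing, here is a minimal sketch. The sizes are just what worked for me, not recommendations; tune them for your machine, and the build command in the comment is only illustrative:

```shell
# Sketch: export the enlarged JVM options before invoking Maven.
# Heap/stack/metaspace sizes are examples, not recommendations.
export MAVEN_OPTS="-Xms12g -Xmx12g -Xss128M -XX:MaxMetaspaceSize=4g"
# Then build as usual, e.g.:
#   ./build/mvn -DskipTests clean package
echo "$MAVEN_OPTS"
```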

Dongjoon.


On Mon, Sep 27, 2021 at 12:18 PM Sean Owen <sro...@gmail.com> wrote:

> Has anyone seen a StackOverflowError when running tests? It happens in
> compilation. I heard from another user who hit this earlier; I hadn't,
> until testing this just today:
>
> [ERROR] ## Exception when compiling 495 sources to
> /mnt/data/testing/spark-3.2.0/sql/catalyst/target/scala-2.12/classes
> java.lang.StackOverflowError
>
> scala.tools.nsc.transform.TypingTransformers$TypingTransformer.atOwner(TypingTransformers.scala:38)
> scala.reflect.internal.Trees.itransform(Trees.scala:1420)
> scala.reflect.internal.Trees.itransform$(Trees.scala:1400)
> scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:28)
> ...
>
> Upping the JVM thread stack size in the pom.xml file from 4m to, say, 16m
> made it work. I presume this could somehow be env-specific, as clearly the
> CI/CD tests and release process built successfully. Just checking whether
> it's "just me".
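
(For reference, in case others hit this: the knob Sean describes lives in the Scala compiler plugin's JVM arguments in pom.xml. A sketch of what the change might look like; the exact plugin block in Spark's pom.xml may differ:)

```xml
<!-- Sketch only: the surrounding configuration in Spark's pom.xml may
     differ. The idea is to raise -Xss for the compiler's JVM. -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <configuration>
    <jvmArgs>
      <jvmArg>-Xss16m</jvmArg>
    </jvmArgs>
  </configuration>
</plugin>
```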
>
>
> On Mon, Sep 27, 2021 at 7:56 AM Gengliang Wang <ltn...@gmail.com> wrote:
>
>> Please vote on releasing the following candidate as
>> Apache Spark version 3.2.0.
>>
>> The vote is open until 11:59pm Pacific time September 29 and passes if a
>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.0-rc5 (commit
>> 49aea14c5afd93ae1b9d19b661cc273a557853f5):
>> https://github.com/apache/spark/tree/v3.2.0-rc5
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1392
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/
>>
>> The list of bug fixes going into 3.2.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>>
>> This release is using the release script of the tag v3.2.0-rc5.
>>
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env, install
>> the current RC, and see if anything important breaks. On the Java/Scala
>> side, you can add the staging repository to your project's resolvers and
>> test with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
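
(As a concrete illustration of the resolver step, a sketch only, not from the original email; the repository id below is arbitrary, and the URL is the staging repo linked above:)

```xml
<!-- Sketch: add the staging repository to your project's pom.xml.
     The <id> is arbitrary; the URL is this RC's staging repository. -->
<repositories>
  <repository>
    <id>spark-3.2.0-rc5-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1392/</url>
  </repository>
</repositories>
```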
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.2.0?
>> ===========================================
>> The current list of open tickets targeted at 3.2.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.2.0
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>
