The PermGen space error is controlled by the -XX:MaxPermSize JVM
parameter. I run with the plugin configuration below in my pom, copied
pretty literally from Spark's own tests. I don't know the exact sbt
equivalent offhand, but you should be able to pass the same flag,
possibly via SBT_OPTS; see the sketch after the Maven snippet.


  <plugin>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest-maven-plugin</artifactId>
      <version>1.0</version>
      <configuration>
          <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
          <parallel>false</parallel>
          <junitxml>.</junitxml>
          <filereports>SparkTestSuite.txt</filereports>
          <argLine>-Xmx3g -XX:MaxPermSize=256m -XX:ReservedCodeCacheSize=512m</argLine>
          <stderr/>
          <systemProperties>
              <java.awt.headless>true</java.awt.headless>
              <spark.testing>1</spark.testing>
              <spark.ui.enabled>false</spark.ui.enabled>
              <spark.driver.allowMultipleContexts>true</spark.driver.allowMultipleContexts>
          </systemProperties>
      </configuration>
      <executions>
          <execution>
              <id>test</id>
              <goals>
                  <goal>test</goal>
              </goals>
          </execution>
      </executions>
  </plugin>
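
For sbt, a minimal sketch of the equivalent in build.sbt (untested,
assuming sbt 0.13-style syntax) would be something like:

    // Fork a separate JVM for the test run so the options below apply to it.
    fork in Test := true

    // Same JVM flags as the Maven argLine above.
    javaOptions in Test ++= Seq(
      "-Xmx3g",
      "-XX:MaxPermSize=256m",
      "-XX:ReservedCodeCacheSize=512m"
    )

If you don't fork, tests run inside the sbt JVM itself; in that case
setting SBT_OPTS before launching sbt (e.g.
SBT_OPTS="-XX:MaxPermSize=256m") should raise the limit for that JVM
instead, assuming your sbt launcher script reads SBT_OPTS.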


On Tue, Aug 25, 2015 at 2:10 PM, Mike Trienis <mike.trie...@orcsol.com>
wrote:

> Hello,
>
> I am using sbt and created a unit test where I create a `HiveContext`,
> execute a query, and then return. Each time I run the unit test, the
> JVM's memory usage increases until I get the error:
>
> Internal error when running tests: java.lang.OutOfMemoryError: PermGen
> space
> Exception in thread "Thread-2" java.io.EOFException
>
> As a work-around, I can fork a new JVM each time I run the unit test;
> however, that seems like a bad solution, as it takes a while to run
> the unit tests.
>
> By the way, I tried importing the TestHiveContext:
>
>    - import org.apache.spark.sql.hive.test.TestHiveContext
>
> However, it suffers from the same memory issue. Has anyone else run
> into this problem? Note that I am running these unit tests on my Mac.
>
> Cheers, Mike.
>
>
