Hi all, I notice that Jenkins may also throw this error when running tests (https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18688/consoleFull).
This is because in Utils.executeAndGetOutput our process exit code is not 0; maybe we should logWarning here rather than throw an exception? In Utils.executeAndGetOutput:

    val exitCode = process.waitFor()
    stdoutThread.join()   // Wait for it to finish reading output
    if (exitCode != 0) {
      throw new SparkException("Process " + command + " exited with code " + exitCode)
    }

Any ideas?

On 2014/8/15 11:01, scwf wrote:
env: ubuntu 14.04 + spark master branch

mvn -Pyarn -Phive -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
mvn -Pyarn -Phadoop-2.4 -Phive test

test errors:

DriverSuite:
Spark assembly has been built with Hive, including Datanucleus jars on classpath
- driver should exit after finishing *** FAILED ***
  SparkException was thrown during property evaluation. (DriverSuite.scala:40)
  Message: Process List(./bin/spark-class, org.apache.spark.DriverWithoutCleanup, local) exited with code 1
  Occurred at table row 0 (zero based, not counting headings), which had values ( master = local )

SparkSubmitSuite:
Spark assembly has been built with Hive, including Datanucleus jars on classpath
- launch simple application with spark-submit *** FAILED ***
  org.apache.spark.SparkException: Process List(./bin/spark-submit, --class, org.apache.spark.deploy.SimpleApplicationTest, --name, testApp, --master, local, file:/tmp/1408015655220-0/testJar-1408015655220.jar) exited with code 1
  at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:810)
  at org.apache.spark.deploy.SparkSubmitSuite.runSparkSubmit(SparkSubmitSuite.scala:311)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$14.apply$mcV$sp(SparkSubmitSuite.scala:291)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$14.apply(SparkSubmitSuite.scala:284)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$14.apply(SparkSubmitSuite.scala:284)
  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
  at org.scalatest.Transformer.apply(Transformer.scala:22)
  ...
Spark assembly has been built with Hive, including Datanucleus jars on classpath
- spark submit includes jars passed in through --jar *** FAILED ***
  org.apache.spark.SparkException: Process List(./bin/spark-submit, --class, org.apache.spark.deploy.JarCreationTest, --name, testApp, --master, local-cluster[2,1,512], --jars, file:/tmp/1408015659416-0/testJar-1408015659471.jar,file:/tmp/1408015659472-0/testJar-1408015659513.jar, file:/tmp/1408015659415-0/testJar-1408015659416.jar) exited with code 1
  at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:810)
  at org.apache.spark.deploy.SparkSubmitSuite.runSparkSubmit(SparkSubmitSuite.scala:311)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$15.apply$mcV$sp(SparkSubmitSuite.scala:305)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$15.apply(SparkSubmitSuite.scala:294)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$15.apply(SparkSubmitSuite.scala:294)
  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
  at org.scalatest.Transformer.apply(Transformer.scala:22)
  ...

But running only the specific suite as follows is OK:

mvn -Pyarn -Phadoop-2.4 -Phive -DwildcardSuites=org.apache.spark.DriverSuite test

It seems that when run with "mvn -Pyarn -Phadoop-2.4 -Phive test", the process started with Utils.executeAndGetOutput cannot exit successfully (its exit code is not zero). Does anyone have an idea about this?
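The lenient behavior I'm proposing above could look roughly like the following. This is only a sketch, not Spark's actual code: the method name and the warning via System.err are illustrative stand-ins for Utils.executeAndGetOutput and logWarning.

```scala
import scala.sys.process._

// Hypothetical lenient variant of Utils.executeAndGetOutput (names are
// illustrative): capture stdout, and warn instead of throwing a
// SparkException when the exit code is nonzero.
def executeAndGetOutputLenient(command: Seq[String]): (Int, String) = {
  val output = new StringBuilder
  // Collect stdout lines; ignore stderr in this sketch.
  val logger = ProcessLogger(line => output.append(line).append('\n'), _ => ())
  val exitCode = Process(command).!(logger) // blocks until the process exits
  if (exitCode != 0) {
    // Where Utils.executeAndGetOutput throws, this variant only warns.
    System.err.println(s"WARN: Process $command exited with code $exitCode")
  }
  (exitCode, output.toString)
}

val (code, out) = executeAndGetOutputLenient(Seq("echo", "hello"))
println(s"exit=$code, output=${out.trim}")
```

The trade-off is that callers then have to check the returned exit code themselves, whereas today the exception makes the test failure loud.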
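As for why the child exits nonzero only in the full test run: one way to investigate is to merge the child's stderr into the captured output so the failure reason is visible. A sketch of such a diagnostic helper (hypothetical, not Spark's API):

```scala
import java.io.{BufferedReader, InputStreamReader}

// Diagnostic sketch: merge the child's stderr into its stdout with
// ProcessBuilder.redirectErrorStream, so a nonzero exit code comes with
// the child's own error message in the captured output.
def runWithMergedOutput(command: Seq[String]): (Int, String) = {
  val pb = new ProcessBuilder(command: _*)
  pb.redirectErrorStream(true) // interleave stderr with stdout
  val process = pb.start()
  val reader = new BufferedReader(new InputStreamReader(process.getInputStream))
  val output = Iterator.continually(reader.readLine()).takeWhile(_ != null).mkString("\n")
  val exitCode = process.waitFor() // same waitFor as in Utils.executeAndGetOutput
  (exitCode, output)
}

// Example: 'ls' on a missing path exits nonzero and explains itself on
// (merged) stderr.
val (code, out) = runWithMergedOutput(Seq("ls", "/path-that-does-not-exist"))
println(s"exit=$code")
```

Running the failing spark-submit command through something like this might show what the child process is actually complaining about.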
--
Best Regards
Fei Wang