Just FYI, thought this might be helpful: I'm refactoring the Hive Thrift server
test suites. These suites also fork new processes and suffer from similar
issues. In the new version of the test suites, stdout and stderr of forked
processes are logged with utilities from the scala.sys.process package:
https://github.c
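The logging approach described above might look roughly like the following sketch. This is an illustrative example only, assuming the scala.sys.process API; `runAndLog` and `ForkedProcessLogging` are hypothetical names, not the actual Spark test utilities:

```scala
import scala.collection.mutable.ArrayBuffer
import scala.sys.process._

// Hypothetical sketch: fork a process and capture its stdout/stderr lines
// so they can be logged for diagnosis instead of being lost.
object ForkedProcessLogging {
  def runAndLog(command: Seq[String]): Int = {
    val out = ArrayBuffer.empty[String]
    val err = ArrayBuffer.empty[String]
    // ProcessLogger routes each stdout line to the first callback and
    // each stderr line to the second.
    val logger = ProcessLogger(line => out += line, line => err += line)
    // `!` runs the process, blocks until it exits, and returns the exit code.
    val exitCode = Process(command).!(logger)
    out.foreach(line => println(s"[stdout] $line"))
    err.foreach(line => println(s"[stderr] $line"))
    exitCode
  }

  def main(args: Array[String]): Unit = {
    val code = runAndLog(Seq("echo", "hello"))
    println(s"exit code: $code")
  }
}
```

With this in place, a failing forked test process leaves its output in the suite's log rather than disappearing silently.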
Hi, Cheng Lian
Thanks, printing the stdout/stderr of the forked process is more reasonable.
On 2014/8/19 13:35, Cheng Lian wrote:
The exception indicates that the forked process didn't execute as
expected, thus the test case *should* fail.
Instead of replacing the exception with a logWarning, capturing and
printing stdout/stderr of the forked process can be helpful for diagnosis.
Currently the only information we have at h
Hi, all
I notice that Jenkins may also throw this error when running
tests (https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18688/consoleFull).
This is because in Utils.executeAndGetOutput our process's exit code is not 0;
maybe we should logWarning here rather than throw an exception
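For context, the pattern under discussion might be sketched like this. This is a hypothetical minimal helper, not Spark's actual Utils.executeAndGetOutput; the point is that throwing keeps the failure visible to the test, while a logWarning would swallow it:

```scala
import scala.sys.process._

// Hypothetical sketch: run a command, collect its stdout, and check the
// exit code. A nonzero exit code raises an exception so the caller (and
// the test suite) sees the failure.
object ExecuteAndGetOutputSketch {
  def executeAndGetOutput(command: Seq[String]): String = {
    val output = new StringBuilder
    // Capture stdout line by line; discard stderr for brevity here.
    val logger = ProcessLogger(line => output.append(line).append('\n'), _ => ())
    val exitCode = Process(command).!(logger)
    if (exitCode != 0) {
      // The alternative raised in the thread would be a logWarning here,
      // but that would hide real failures from the test suite.
      throw new RuntimeException(s"Process $command exited with code $exitCode")
    }
    output.toString
  }

  def main(args: Array[String]): Unit = {
    print(executeAndGetOutput(Seq("echo", "hi")))
  }
}
```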
env: Ubuntu 14.04 + Spark master branch
mvn -Pyarn -Phive -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
mvn -Pyarn -Phadoop-2.4 -Phive test
test error:
DriverSuite:
Spark assembly has been built with Hive, including Datanucleus jars on classpath
- driver should exit after fini