If I remember correctly, the same or similar errors happened with other
Hadoop versions as well. I need to rebuild with those versions and compare
the logs.
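
For reference, here's roughly what I plan to run, assuming the same sbt
layout as before and Hadoop 1.0.4 as suggested:

$ sbt/sbt clean
$ SPARK_HADOOP_VERSION=1.0.4 sbt/sbt assembly
$ sbt/sbt test

To iterate faster I may also try rerunning just the failing suites with
something like the following (I'm not certain of the project name or the
test-only syntax on my setup, so treat this as a sketch):

$ sbt/sbt "core/test-only org.apache.spark.ShuffleSuite org.apache.spark.DistributedSuite"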


On Tue, Jul 1, 2014 at 1:04 AM, Patrick Wendell <pwend...@gmail.com> wrote:

> Do those also happen if you run other Hadoop versions (e.g. try 1.0.4)?
>
> On Tue, Jul 1, 2014 at 1:00 AM, Taka Shinagawa <taka.epsi...@gmail.com>
> wrote:
> > Since Spark 1.0.0, I've been seeing multiple errors when running sbt
> > test.
> >
> > I ran the following commands from Spark 1.0.1 RC1 on Mac OSX 10.9.2.
> >
> > $ sbt/sbt clean
> > $ SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
> > $ sbt/sbt test
> >
> >
> > I'm attaching the log file generated by the sbt test run.
> >
> > Here's the summary section of the test output.
> >
> > [info] Run completed in 30 minutes, 57 seconds.
> > [info] Total number of tests run: 605
> > [info] Suites: completed 83, aborted 0
> > [info] Tests: succeeded 600, failed 5, canceled 0, ignored 5, pending 0
> > [info] *** 5 TESTS FAILED ***
> > [error] Failed: Total 653, Failed 5, Errors 0, Passed 648, Ignored 5
> > [error] Failed tests:
> > [error] org.apache.spark.ShuffleNettySuite
> > [error] org.apache.spark.ShuffleSuite
> > [error] org.apache.spark.FileServerSuite
> > [error] org.apache.spark.DistributedSuite
> > [error] (core/test:test) sbt.TestsFailedException: Tests unsuccessful
> > [error] Total time: 2033 s, completed Jul 1, 2014 12:08:03 AM
> >
> > Is anyone else seeing errors like this?
> >
> >
> > Thanks,
> > Taka
>
