Fantastic. As it happens, I just fixed up Mahout's tests for Java 8
and observed a lot of the same types of failures.

I'm about to submit PRs for the two issues I identified. AFAICT these
three JIRAs then cover the failures I mentioned:

https://issues.apache.org/jira/browse/SPARK-3329
https://issues.apache.org/jira/browse/SPARK-3330
https://issues.apache.org/jira/browse/SPARK-3331

I'd argue that none of these necessarily blocks a release: they concern
test-only code under Java 8, the test-only context of Jenkins with
multiple profiles, and a trivial configuration issue in a Python style
check. They should be fixed, but none indicates a bug in the release
itself.
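
Side note on the HiveQuerySuite failure quoted below: since the order of
the pairs returned by SET isn't guaranteed, one way to make the assertion
robust is to compare the results as an unordered collection. Here's a
rough ScalaTest sketch (made-up suite name, and not necessarily what the
actual PR does):

import org.scalatest.FunSuite

// Hypothetical standalone example; the real test lives in HiveQuerySuite.
class SetCommandOrderSuite extends FunSuite {
  test("SET output compared without assuming ordering") {
    // Stand-in for the rows a SET query returns; their order may vary by platform/JDK.
    val actual = Seq(
      "spark.sql.key.usedfortestonlyspark.sql.key.usedfortestonly=test.val.0test.val.0",
      "spark.sql.key.usedfortestonly=test.val.0")
    val expected = Seq(
      "spark.sql.key.usedfortestonly=test.val.0",
      "spark.sql.key.usedfortestonlyspark.sql.key.usedfortestonly=test.val.0test.val.0")

    // Comparing as sets removes the ordering assumption the original assertion made.
    assert(actual.toSet === expected.toSet)
  }
}

Comparing sets is enough here because the keys are distinct; it keeps the
check meaningful while dropping the ordering assumption.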

On Sun, Aug 31, 2014 at 6:11 PM, Will Benton <wi...@redhat.com> wrote:
> ----- Original Message -----
>
>> dev/run-tests fails two tests (1 Hive, 1 Kafka Streaming) for me
>> locally on 1.1.0-rc3. Does anyone else see that? It may be my env.
>> Although I still see the Hive failure on Debian too:
>>
>> [info] - SET commands semantics for a HiveContext *** FAILED ***
>> [info]   Expected Array("spark.sql.key.usedfortestonly=test.val.0",
>> "spark.sql.key.usedfortestonlyspark.sql.key.usedfortestonly=test.val.0test.val.0"),
>> but got
>> Array("spark.sql.key.usedfortestonlyspark.sql.key.usedfortestonly=test.val.0test.val.0",
>> "spark.sql.key.usedfortestonly=test.val.0") (HiveQuerySuite.scala:541)
>
> I've seen this error before.  (In particular, I've seen it on my OS X machine 
> using Oracle JDK 8 but not on Fedora using OpenJDK.)  I've also seen similar 
> errors in topic branches (but not on master) suggesting that the tests
> depend on the set of pairs arriving from Hive in a particular order,
> which isn't a safe assumption.
>
> I just submitted a (trivial) PR to fix this spurious failure:  
> https://github.com/apache/spark/pull/2220
>
>
> best,
> wb
