Github user liancheng commented on the pull request:
https://github.com/apache/spark/pull/2685#issuecomment-60937978
A missing piece from my last comment: while running all the local tests I
mentioned above, I was following [this
change](https://github.com/marmbrus/spark/commit/8f6b09a7813cec22480a23f0301c5d5988090d02#commitcomment-8299290),
which builds the assembly jar with `-Phive,hive-0.12.0` but runs the tests with
only `-Phive`. Done this way, all Spark core tests pass.
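For context, the workflow from the linked commit looks roughly like the sketch below. This is an illustration of the profile mismatch being discussed, not the exact commands from the commit; the precise flags and build tool (Maven vs. sbt) may differ from what was actually used.

```shell
# Build the assembly jar with the Hive 0.12.0 profile enabled
# (illustrative; actual invocation may vary)
mvn -Phive,hive-0.12.0 -DskipTests clean package

# ...but run the test suites with only -Phive, so the test classpath
# is compiled against the default Hive version
mvn -Phive test
```

The inconsistency is that the external processes spawned by the suites pick up the 0.12.0 assembly jar, while the suites themselves are built against the default Hive profile.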
But this PR tries to fix this inconsistency by dropping the `hive-0.12.0`
profile when building the assembly jar. Although we haven't figured out why,
this breaks some Spark core test suites.
Building the assembly jar with the Hive 0.12.0 dependency while testing against
0.13.1 doesn't work for `HiveThriftServer2`, since `HiveThriftServer2Suite` and
`CliSuite` both rely on the assembly jar and spawn external processes for
testing purposes.
Basically, #2241 somehow breaks Spark core tests when compiled with Hive
0.13.1 dependencies. @scwf and I are investigating the cause.