Github user liancheng commented on the pull request:
https://github.com/apache/spark/pull/3103#issuecomment-61759135
Hey @scwf, I'm also working on this. As you've mentioned, the cause of this
error is that the assembly jar built by Maven somehow lacks a proper
MANIFEST.MF, while HiveThriftServer2 relies on the `Implementation-Version`
field in the manifest to inspect the Spark version.
Of course, the MANIFEST.MF issue should be fixed. On the other hand, while
writing the [version inspection
code](https://github.com/apache/spark/blob/515abb9afa2d6b58947af6bb079a493b49d315ca/core/src/main/scala/org/apache/spark/util/Utils.scala#L1777-L1783),
I failed to notice the existing `SparkContext.version` method, which simply
returns the correct version string. So I'm going to switch to
`SparkContext.version` instead of inspecting MANIFEST.MF. I realized that
relying on MANIFEST.MF has two drawbacks:
1. The assembly/shading tricks make MANIFEST.MF unreliable (this still needs
to be fixed, and I'm not quite sure whether this PR is enough)
2. It makes testing harder, because you have to build the assembly jar first
and disable `SPARK_PREPEND_CLASSES`. Otherwise, classes are loaded from
separate `.class` files rather than the assembly jar, and no MANIFEST.MF can
be found.
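To make the manifest-based approach (and drawback 2) concrete, here is a minimal Java sketch of that kind of version inspection. The class and method names are hypothetical, not the actual Spark code (which is linked above): it locates the MANIFEST.MF of the jar a class was loaded from and reads its `Implementation-Version` attribute.

```java
import java.io.InputStream;
import java.net.URL;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

public class ManifestVersion {
    // Hypothetical helper sketching manifest-based version inspection:
    // find the jar that `cls` was loaded from and read the
    // Implementation-Version attribute from its MANIFEST.MF.
    static String implementationVersion(Class<?> cls) throws Exception {
        ClassLoader loader = cls.getClassLoader();
        if (loader == null) loader = ClassLoader.getSystemClassLoader();
        URL url = loader.getResource(cls.getName().replace('.', '/') + ".class");
        // Drawback 2 in action: classes loaded from plain .class files (or
        // from the JDK's jrt: runtime image) have no enclosing jar, hence no
        // MANIFEST.MF to read.
        if (url == null || !"jar".equals(url.getProtocol())) {
            return null;
        }
        // A jar resource URL looks like "jar:file:/path/to.jar!/pkg/Cls.class";
        // keep everything up to the "!" and point at the manifest entry.
        String manifestUrl =
            url.toString().substring(0, url.toString().lastIndexOf('!') + 1)
            + "/META-INF/MANIFEST.MF";
        try (InputStream in = new URL(manifestUrl).openStream()) {
            return new Manifest(in).getMainAttributes()
                .getValue(Attributes.Name.IMPLEMENTATION_VERSION);
        }
    }

    public static void main(String[] args) throws Exception {
        // java.lang.Object comes from the JDK runtime image rather than a
        // jar on modern JDKs, so no manifest version is found here.
        System.out.println(implementationVersion(Object.class));
    }
}
```

By contrast, `SparkContext.version` just returns a version string baked into the code, so it works the same whether classes come from the assembly jar or from separate `.class` files under `SPARK_PREPEND_CLASSES`.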