I have no strong attachment to the Hadoop 2.0.0-alpha version, and it seems
that most users go through YARN anyway.

Just to understand: the solution would be to not share protobuf from the fat
Hadoop jar at all? Wouldn't that be a problem in other situations, e.g. for
users with an earlier protobuf version?
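For concreteness, here is a minimal sketch of the kind of relocation I would expect this to involve, using the maven-shade-plugin; the shaded package name below is hypothetical, not what FLINK-1605 actually proposes:

```xml
<!-- Sketch: relocate Hadoop's protobuf classes into a private namespace
     so they cannot clash with the akka/protobuf versions on the user's
     classpath. The shadedPattern is an illustrative placeholder. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>org.apache.flink.hadoop.shaded.com.google.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With a relocation like this, the fat jar carries its own renamed copy of protobuf, so in principle neither an older nor a newer protobuf on the user side should conflict with it.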

On Thu, Feb 26, 2015 at 5:57 PM, Robert Metzger <rmetz...@apache.org> wrote:

> Hi,
>
> I'm currently working on https://issues.apache.org/jira/browse/FLINK-1605
> and it's a hell of a mess.
>
> I got almost everything working, except for the hadoop 2.0.0-alpha profile.
> The profile exists because google protobuf has a different version in that
> Hadoop release.
> Since Maven sets the protobuf version for the entire project to the older
> one, we have to use an older akka version, which is causing issues.
>
> The logical conclusion from that would be to shade Hadoop's protobuf
> version into the Hadoop jars. That by itself is working, however it's not
> working for the "flink-yarn-tests".
>
> I think I can also solve the issue with the flink-yarn-tests, but it would
> be a very dirty hack (either injecting shaded code into the failsafe
> tests-classpath or putting test code into src/main).
>
> But the general question remains: Are we willing to continue spending a lot
> of time on maintaining the profile?
> Till has spent a lot of time recently fixing failing test cases for that
> old akka version, I've spent almost two days now on getting the
> shading/dependencies right, and I'm sure we'll keep having trouble with
> the profile.
>
>
> Therefore, I was wondering if this is the right time to drop support for
> CDH4 / Hadoop 2.0.0-alpha.
>
>
> Best,
> Robert
>