Github user jfrazee commented on the issue:
https://github.com/apache/nifi/pull/2293
@baank So we still wouldn't need to update the Hadoop version everywhere
yet, because in principle you can just build NiFi with the hadoop.version
property overridden and use 2.8.x. For example:
```sh
$ mvn -T 2.0C clean install -Dhadoop.version=2.8.2 \
    -Dhadoop.guava.version=12.0.1 -Dhadoop.http.client.version=4.5.2 \
    -Dhadoop.http.core.version=4.4.4 -DskipTests
```
That said, this is a little bit of a lie, because in later versions of
HttpComponents, HttpClient and HttpCore aren't versioned identically, and we
currently use only a single property, hadoop.http.client.version, for both;
i.e., the hadoop.http.core.version property used in the command above doesn't
exist yet. See [NIFI-4650](https://issues.apache.org/jira/browse/NIFI-4650)
though.
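For reference, the split would presumably look something like the following in the root pom. This is just a sketch: the version values are illustrative, and hadoop.http.core.version is the proposed new property from NIFI-4650, not something that exists in the codebase today.

```xml
<!-- Sketch only: hadoop.http.client.version exists today;
     hadoop.http.core.version is the proposed addition so the two
     HttpComponents artifacts can be versioned independently. -->
<properties>
  <hadoop.http.client.version>4.5.2</hadoop.http.client.version>
  <hadoop.http.core.version>4.4.4</hadoop.http.core.version>
</properties>

<!-- The Hadoop-related modules would then reference them separately: -->
<dependency>
  <groupId>org.apache.httpcomponents</groupId>
  <artifactId>httpclient</artifactId>
  <version>${hadoop.http.client.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.httpcomponents</groupId>
  <artifactId>httpcore</artifactId>
  <version>${hadoop.http.core.version}</version>
</dependency>
```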
So I did the build above with the new property in place and tested with the
following jars, and things seem to work:
```
aws-java-sdk-core-1.11.68.jar
aws-java-sdk-kms-1.11.68.jar
aws-java-sdk-s3-1.11.68.jar
hadoop-aws-2.8.2.jar
hadoop-common-2.8.2.jar
httpclient-4.5.2.jar
httpcore-4.4.4.jar
jackson-annotations-2.6.0.jar
jackson-core-2.6.1.jar
jackson-databind-2.6.1.jar
joda-time-2.8.2.jar
```
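For anyone wanting to double-check which HttpComponents jars actually ended up in their build, something like the following can be run against the assembly's lib directory. The helper name and the throwaway demo directory are just for illustration; point it at your real lib/ instead.

```shell
# Sketch: print the httpclient/httpcore jars found in a directory, so the
# bundled versions can be confirmed after a build with overridden properties.
list_http_jars() {
  ls "$1" | grep -E '^http(client|core)-' | sort
}

# Demo against a throwaway directory; in practice, pass the lib/ directory
# of the NiFi assembly you built.
demo=$(mktemp -d)
touch "$demo/httpclient-4.5.2.jar" "$demo/httpcore-4.4.4.jar" \
      "$demo/jackson-core-2.6.1.jar"
list_http_jars "$demo"
rm -rf "$demo"
```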
We're trying to be very cautious about updating the default to the next
major version of Hadoop, so it might be best for this to remain a property
override for now.
---