zentol commented on a change in pull request #11983:
URL: https://github.com/apache/flink/pull/11983#discussion_r420172348



##########
File path: flink-connectors/flink-connector-hive/pom.xml
##########
@@ -126,12 +211,18 @@ under the License.
                        thus override the default hadoop version from 2.4.1 to 2.7.5
                -->
                <dependency>
-                       <groupId>org.apache.flink</groupId>
-                       <artifactId>flink-shaded-hadoop-2-uber</artifactId>
-                       <version>${hivemetastore.hadoop.version}-${flink.shaded.version}</version>
+                       <groupId>org.apache.hadoop</groupId>
+                       <artifactId>hadoop-common</artifactId>
+                       <version>${hivemetastore.hadoop.version}</version>

Review comment:
       redundant due to dependency management

##########
File path: flink-connectors/flink-connector-hive/pom.xml
##########
@@ -126,12 +211,18 @@ under the License.
                        thus override the default hadoop version from 2.4.1 to 2.7.5
                -->
                <dependency>
-                       <groupId>org.apache.flink</groupId>
-                       <artifactId>flink-shaded-hadoop-2-uber</artifactId>
-                       <version>${hivemetastore.hadoop.version}-${flink.shaded.version}</version>
+                       <groupId>org.apache.hadoop</groupId>
+                       <artifactId>hadoop-common</artifactId>
+                       <version>${hivemetastore.hadoop.version}</version>
                        <scope>provided</scope>
                </dependency>
 
+               <dependency>
+                       <groupId>org.apache.hadoop</groupId>
+                       <artifactId>hadoop-mapreduce-client-core</artifactId>
+                       <version>${hivemetastore.hadoop.version}</version>

Review comment:
       redundant due to dependency management

##########
File path: flink-formats/flink-orc-nohive/pom.xml
##########
@@ -98,6 +103,22 @@ under the License.
 
        </dependencies>
 
+       <profiles>
+               <profile>
+                       <!-- This profile adds dependencies needed to execute the tests
+                       with Hadoop 3 -->
+                       <id>hadoop3-tests</id>
+                       <dependencies>
+                               <dependency>
+                                       <groupId>org.apache.hadoop</groupId>
+                                       <artifactId>hadoop-hdfs-client</artifactId>

Review comment:
       how come this isn't required at runtime?

##########
File path: flink-filesystems/flink-s3-fs-hadoop/pom.xml
##########
@@ -32,6 +32,17 @@ under the License.
 
        <packaging>jar</packaging>
 
+       <!-- Override the flink-parent dependencyManagement defintion for hadoop-common to ensure

Review comment:
       ```suggestion
        <!-- Override the flink-parent dependencyManagement definition for hadoop-common to ensure
   ```

##########
File path: flink-end-to-end-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/SQLClientKafkaITCase.java
##########
@@ -106,11 +112,16 @@ public SQLClientKafkaITCase(String kafkaVersion, String kafkaSQLVersion, String
        }
 
        @Before
-       public void before() {
+       public void before() throws Exception {
+               downloadCache.before();
                Path tmpPath = tmp.getRoot().toPath();
                LOG.info("The current temporary path: {}", tmpPath);
                this.sqlClientSessionConf = tmpPath.resolve("sql-client-session.conf");
                this.result = tmpPath.resolve("result");
+
+               apacheAvroJars.add(downloadCache.getOrDownload("https://repo1.maven.org/maven2/org/apache/avro/avro/1.8.2/avro-1.8.2.jar", tmpPath));

Review comment:
       So this was only working by chance, since the missing classes were provided by flink-shaded-hadoop? If so, why does that no longer work?
   Related (and likely the correct fix): https://issues.apache.org/jira/browse/FLINK-17417

##########
File path: flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/flink/SQLJobSubmission.java
##########
@@ -90,6 +91,11 @@ public SQLJobSubmissionBuilder addJar(Path jarFile) {
                        return this;
                }
 
+               public SQLJobSubmissionBuilder addJars(List<Path> jarFiles) {
+                       this.jars.addAll(jarFiles.stream().map(path -> path.toAbsolutePath().toString()).collect(Collectors.toList()));

Review comment:
       ```suggestion
                        jarFiles.forEach(this::addJar);
   ```
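   For context, a minimal sketch of how the builder would read with this suggestion applied. This is a stand-in class, not the actual Flink code: `addJar(Path)` matches the signature in the diff header, but its body and the `jars` field type are assumptions.

```java
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Stand-in for the builder from the diff; only addJar's signature is confirmed there.
class SQLJobSubmissionBuilder {
	private final List<String> jars = new ArrayList<>();

	public SQLJobSubmissionBuilder addJar(Path jarFile) {
		// Assumed body: store the jar's absolute path as a string.
		this.jars.add(jarFile.toAbsolutePath().toString());
		return this;
	}

	public SQLJobSubmissionBuilder addJars(List<Path> jarFiles) {
		// Delegating to addJar keeps the Path-to-String conversion in one place.
		jarFiles.forEach(this::addJar);
		return this;
	}
}
```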

##########
File path: flink-connectors/flink-hbase/src/test/java/org/apache/flink/addons/hbase/util/HBaseTestingClusterAutoStarter.java
##########
@@ -142,6 +144,11 @@ private static Configuration initialize(Configuration conf) {
 
        @BeforeClass
        public static void setUp() throws Exception {
+               // HBase 1.4 does not work with Hadoop 3
+               // because it uses Guava 12.0.1, Hadoop 3 uses Guava 27.0-jre.
+               // There is no Guava version in between that works with both.
+               Assume.assumeTrue("This test is skipped for Hadoop versions above 3", VersionUtil.compareVersions(System.getProperty("hadoop.version"), "3.0.0") <= 0);

Review comment:
       ```suggestion
                Assume.assumeTrue("This test is skipped for Hadoop versions above 3", VersionUtil.compareVersions(System.getProperty("hadoop.version"), "3.0.0") < 0);
   ```
   ?
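
   For reference, a small sketch of the boundary this suggestion changes, assuming `VersionUtil` here is `org.apache.hadoop.util.VersionUtil`, whose `compareVersions` returns a negative, zero, or positive int like a `Comparator`. With `<= 0` the assumption still passes when `hadoop.version` is exactly `3.0.0`, so the test would run there; `< 0` skips it as well.

```java
import org.apache.hadoop.util.VersionUtil;

public class VersionBoundaryCheck {
	public static void main(String[] args) {
		// Hypothetical value of System.getProperty("hadoop.version").
		String hadoopVersion = "3.0.0";
		// compareVersions(a, b): negative if a < b, zero if equal, positive if a > b.
		System.out.println(VersionUtil.compareVersions(hadoopVersion, "3.0.0") <= 0); // true  -> test runs on 3.0.0
		System.out.println(VersionUtil.compareVersions(hadoopVersion, "3.0.0") < 0);  // false -> test skipped on 3.0.0
	}
}
```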

##########
File path: flink-yarn-tests/src/test/java/org/apache/flink/yarn/YarnTestBase.java
##########
@@ -636,8 +652,6 @@ private static void start(YarnConfiguration conf, String principal, String keyta
                Assert.assertNotNull("Flink uberjar not found", flinkUberjar);
                String flinkDistRootDir = flinkUberjar.getParentFile().getParent();
                flinkLibFolder = flinkUberjar.getParentFile(); // the uberjar is located in lib/
-               // the hadoop jar was copied into the target/shaded-hadoop directory during the build

Review comment:
       the pom should contain something to remove

##########
File path: flink-yarn/pom.xml
##########
@@ -227,6 +227,27 @@ under the License.
                                        </execution>
                                </executions>
                        </plugin>
+                       <!-- Write classpath of flink-yarn to a file, so that the yarn tests can use it as their classpath

Review comment:
       why aren't we using the classpath of flink-yarn-tests?

##########
File path: flink-yarn/src/test/java/org/apache/flink/yarn/YarnFileStageTestS3ITCase.java
##########
@@ -174,6 +176,9 @@ private void testRecursiveUploadForYarn(String scheme, String pathSuffix) throws
        @Test
        @RetryOnFailure(times = 3)
        public void testRecursiveUploadForYarnS3n() throws Exception {
+               // skip test on Hadoop 3: https://issues.apache.org/jira/browse/HADOOP-14738
+               Assume.assumeTrue("This test is skipped for Hadoop versions above 3", VersionUtil.compareVersions(System.getProperty("hadoop.version"), "3.0.0") <= 0);

Review comment:
       ```suggestion
                Assume.assumeTrue("This test is skipped for Hadoop versions above 3", VersionUtil.compareVersions(System.getProperty("hadoop.version"), "3.0.0") < 0);
   ```




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

