Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-21 Thread shashank agarwal
For now, I have solved this issue by adding the following to the Flink config: classloader.resolve-order: parent-first. With that, the duplicate classes from the uber jar are ignored. I will work on the dependencies. One quick question: I am using SBT for the build. Do you have any example sbt file for dep
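For reference, the workaround described above is a one-line change to flink-conf.yaml. This is a sketch; note that Flink 1.4's default for this option is child-first, so setting it is a deliberate trade-off, not a general recommendation:

```yaml
# flink-conf.yaml excerpt (sketch). Flink 1.4 defaults to child-first
# classloading; parent-first makes classes shipped in flink-dist/lib win
# over duplicates bundled into the user (uber) jar.
classloader.resolve-order: parent-first
```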

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-20 Thread Timo Walther
Libraries such as CEP or the Table API should have the "compile" scope and should be in both the fat and non-fat jar. The non-fat jar should contain everything that is not in flink-dist or your lib directory. Regards, Timo On 12/20/17 at 3:07 PM, shashank agarwal wrote: Hi, In that case,
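In SBT terms, Timo's scoping advice might look like the following build.sbt sketch (artifact names are real Flink 1.4 modules; the exact version and module selection are assumptions to adapt to your project):

```scala
// build.sbt sketch: modules already shipped in flink-dist are "provided"
// (excluded from the fat jar by sbt-assembly); libraries such as CEP stay
// at the default compile scope so they are packaged into the fat jar.
val flinkVersion = "1.4.0"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"           % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-cep-scala"       % flinkVersion  // compile scope
)
```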

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-20 Thread shashank agarwal
Hi, In that case, it won't find the dependencies, because I have other dependencies as well. And what about CEP, etc.? That is not part of flink-dist. Best, Shashank On Wed, Dec 20, 2017 at 3:16 PM, Aljoscha Krettek wrote: > Hi, > > That jar file looks like it has too much stuff in there tha

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-20 Thread Aljoscha Krettek
Hi, That jar file looks like it has too much stuff in it that shouldn't be there. This could explain the errors you're seeing, due to classloading conflicts. Could you try not building a fat jar and have only your code in your jar? Best, Aljoscha > On 20. Dec 2017, at 10:15, shashank agarwal

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-20 Thread shashank agarwal
One more thing: when I submit the job or start the YARN session, it prints the following logs: Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-20 Thread Aljoscha Krettek
Hi, Could you please list what exactly is in your submitted jar file, for example using "jar tf my-jar-file.jar"? And also what files exactly are in your Flink lib directory. Best, Aljoscha > On 19. Dec 2017, at 20:10, shashank agarwal wrote: > > Hi Timo, > > I am using Rocksdbstatebackend

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-19 Thread shashank agarwal
Hi Timo, I am using the RocksDB state backend with an HDFS path. I have the following Flink dependencies in my sbt: "org.slf4j" % "slf4j-log4j12" % "1.7.21", "org.apache.flink" %% "flink-scala" % flinkVersion % "provided", "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided", "org.

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-19 Thread shashank agarwal
I have tried adding this to both the lib folder of Flink and the assembly jar as a dependency, but I am getting the same error. On Tue, Dec 19, 2017 at 11:28 PM, Jörn Franke wrote: > You need to put flink-hadoop-compatibility*.jar in the lib folder of your > flink distribution or in the class path of yo

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-19 Thread Jörn Franke
You need to put flink-hadoop-compatibility*.jar in the lib folder of your Flink distribution or in the classpath of your cluster nodes. > On 19. Dec 2017, at 12:38, shashank agarwal wrote: > > yes, it's working fine. Now I'm not getting a compile-time error. > > But when I try to run this on cluster
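Operationally, "put the jar in the lib folder" is a plain file copy into the Flink distribution, followed by a cluster restart so the new jar is picked up. The sketch below simulates the copy with placeholder paths and an empty stand-in file so it is self-contained; in practice you would use your real FLINK_HOME and the actual jar downloaded for your Scala/Flink version:

```shell
# Sketch: install flink-hadoop-compatibility*.jar into Flink's lib/ directory.
# FLINK_HOME and the jar are placeholders -- substitute your real installation
# and the real artifact; the empty file below only simulates the jar.
FLINK_HOME=$(mktemp -d)/flink-1.4.0
mkdir -p "$FLINK_HOME/lib"
touch flink-hadoop-compatibility_2.11-1.4.0.jar   # stand-in for the real jar
cp flink-hadoop-compatibility_2.11-1.4.0.jar "$FLINK_HOME/lib/"
ls "$FLINK_HOME/lib"
```

After copying the real jar, restart the cluster (or resubmit the YARN session) so the new classpath takes effect.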

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-19 Thread Timo Walther
Hi Shashank, it seems that HDFS is still not in the classpath. Could you quickly explain how I can reproduce the error? Regards, Timo On 12/19/17 at 12:38 PM, shashank agarwal wrote: yes, it's working fine. Now I'm not getting a compile-time error. But when I try to run this on the cluster or YARN,

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-19 Thread shashank agarwal
yes, it's working fine. Now I'm not getting a compile-time error. But when I try to run this on the cluster or YARN, I get the following runtime error: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly
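This exception typically means Flink 1.4 could not find the Hadoop classes that back the 'hdfs' scheme at runtime. One common fix, consistent with the "Using the result of 'hadoop classpath'" log seen later in this thread, is to export HADOOP_CLASSPATH before starting the cluster or submitting the job. The snippet is guarded so it also runs on machines without the hadoop launcher:

```shell
# Sketch: make the Hadoop jars visible to Flink via HADOOP_CLASSPATH.
# Assumes the `hadoop` launcher is installed on the cluster nodes; the
# guard below only keeps the snippet harmless where it is not.
if command -v hadoop >/dev/null 2>&1; then
  export HADOOP_CLASSPATH=$(hadoop classpath)
  echo "HADOOP_CLASSPATH set"
else
  echo "hadoop launcher not on PATH"
fi
```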

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-08 Thread shashank agarwal
Sure, I'll try that. Thanks. On Fri, 8 Dec 2017 at 9:18 PM, Stephan Ewen wrote: > I would recommend adding "flink-shaded-hadoop2". That is a bundle of all > Hadoop dependencies used by Flink. > > > On Fri, Dec 8, 2017 at 3:44 PM, Aljoscha Krettek > wrote: > >> I see, thanks for letting us know! >

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-08 Thread Stephan Ewen
I would recommend adding "flink-shaded-hadoop2". That is a bundle of all the Hadoop dependencies used by Flink. On Fri, Dec 8, 2017 at 3:44 PM, Aljoscha Krettek wrote: > I see, thanks for letting us know! > > > On 8. Dec 2017, at 15:42, shashank agarwal wrote: > > I had to include two dependencies
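In SBT, Stephan's suggestion would look roughly like this build.sbt line. The artifact is published without a Scala suffix (hence `%` rather than `%%`); the version shown is an assumption to match to your Flink release:

```scala
// build.sbt sketch: Flink's shaded Hadoop bundle, as suggested above.
// Version is a placeholder -- align it with your Flink distribution.
libraryDependencies += "org.apache.flink" % "flink-shaded-hadoop2" % "1.4.0"
```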

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-08 Thread Aljoscha Krettek
I see, thanks for letting us know! > On 8. Dec 2017, at 15:42, shashank agarwal wrote: > > I had to include two dependencies. > > hadoop-hdfs (this for HDFS configuration) > hadoop-common (this for Path) > > On Fri, Dec 8, 2017 at 7:38 PM, Aljoscha Krettek

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-08 Thread shashank agarwal
I had to include two dependencies: hadoop-hdfs (for the HDFS configuration) and hadoop-common (for Path). On Fri, Dec 8, 2017 at 7:38 PM, Aljoscha Krettek wrote: > I think hadoop-hdfs might be sufficient. > > > On 8. Dec 2017, at 14:48, shashank agarwal wrote: > > Can you specifically gui
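The two dependencies named above could be declared in build.sbt roughly as follows. The Hadoop version and the "provided" scope are assumptions: match the version to your cluster, and drop "provided" if the classes must travel in your fat jar rather than come from the cluster classpath:

```scala
// build.sbt sketch: the two Hadoop modules mentioned above.
// Version and scope are placeholders -- adjust to your cluster setup.
libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-common" % "2.7.3" % "provided", // org.apache.hadoop.fs.Path
  "org.apache.hadoop" % "hadoop-hdfs"   % "2.7.3" % "provided"  // HDFS client and config
)
```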

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-08 Thread Aljoscha Krettek
I think hadoop-hdfs might be sufficient. > On 8. Dec 2017, at 14:48, shashank agarwal wrote: > > Can you specifically guide which dependencies should I add to extend this : > > https://github.com/apache/flink/blob/release-1.4.0-rc3/flink-connectors/flink-connector-filesystem/src/main/java/org/a

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-08 Thread shashank agarwal
Can you specifically guide me on which dependencies I should add to extend this: https://github.com/apache/flink/blob/release-1.4.0-rc3/flink-connectors/flink-connector-filesystem/src/main/java/org/apache/flink/streaming/connectors/fs/Bucketer.java Is hadoop-core sufficient? On Fri, Dec 8,

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-08 Thread shashank agarwal
It's a compilation error. I think I have to include the Hadoop dependencies. On Fri, Dec 8, 2017 at 6:54 PM, Aljoscha Krettek wrote: > Hi, > > Is this a compilation error or a runtime error? In general, yes, you have to > include the Hadoop dependencies if they're not there. > > Best, > Aljos

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-08 Thread Aljoscha Krettek
Hi, Is this a compilation error or a runtime error? In general, yes, you have to include the Hadoop dependencies if they're not there. Best, Aljoscha > On 8. Dec 2017, at 14:10, shashank agarwal wrote: > > Hello, > > I am trying to test 1.4.0-RC3; the Hadoop libraries were removed in this version. > A

Can not resolve org.apache.hadoop.fs.Path in 1.4.0

2017-12-08 Thread shashank agarwal
Hello, I am trying to test 1.4.0-RC3; the Hadoop libraries were removed in this version. Actually, I have created a custom Bucketer for the bucketing sink. I am extending org.apache.flink.streaming.connectors.fs.bucketing.Bucketer; in the class, I have to use org.apache.hadoop.fs.Path, but as the hadoop libra
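For context, a custom Bucketer of the kind described might look like the sketch below. It compiles only with flink-connector-filesystem and hadoop-common on the classpath (the crux of this thread), and the bucket layout by first letter is purely illustrative, not the original poster's implementation:

```scala
// Sketch of a custom Bucketer against Flink 1.4's bucketing-sink API.
// Requires flink-connector-filesystem and hadoop-common as dependencies;
// the "first letter" bucketing scheme is a made-up example.
import org.apache.flink.streaming.connectors.fs.Clock
import org.apache.flink.streaming.connectors.fs.bucketing.Bucketer
import org.apache.hadoop.fs.Path

class FirstLetterBucketer extends Bucketer[String] {
  // Each element goes into a subdirectory named after its first character.
  override def getBucketPath(clock: Clock, basePath: Path, element: String): Path =
    new Path(basePath, element.take(1).toLowerCase)
}
```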