Re: flink 1.9 conflict jackson version

2020-04-07 Thread Fanbin Bu
Hi Aj, I got a workaround: put my app jar inside the /usr/lib/flink/lib directory. On Mon, Apr 6, 2020 at 11:27 PM aj wrote: > Hi Fanbin, > > I am facing a similar kind of issue. If you are able to > resolve it, please let me know and help me too > > > https://stackoverflow.com/question

Re: flink 1.9 conflict jackson version

2020-04-06 Thread aj
Hi Fanbin, I am facing a similar kind of issue. If you are able to resolve it, please let me know and help me too: https://stackoverflow.com/questions/61012350/flink-reading-a-s3-file-causing-jackson-dependency-issue On Tue, Dec 17, 2019 at 7:50 AM ouywl wrote: > Hi Bu > I think I

Re: Flink 1.9 SQL Kafka Connector,Json format,how to deal with not json message?

2019-12-25 Thread Jark Wu
Hi LakeShen, I'm sorry, there is no such configuration for the json format currently. I think it makes sense to add such a configuration, like the 'format.ignore-parse-errors' option in the csv format. I created FLINK-15396 [1] to track this. Best, Jark [1]: https://issues.apache.org/jira/browse/FLINK-15396 On Thu, 26
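
For context, a minimal sketch of how such an option reads once FLINK-15396 is available; the DDL below uses the legacy descriptor-style Kafka properties, and the table name, topic, broker address, and the 'format.ignore-parse-errors' key are assumptions based on the proposal, not something that works in Flink 1.9 as released.

    // Hypothetical sketch only: Flink 1.9 does not have this option; it assumes a
    // release where FLINK-15396 has landed. Table name, topic, and broker are placeholders.
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.java.StreamTableEnvironment;

    public class IgnoreParseErrorsSketch {
        public static void main(String[] args) {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            EnvironmentSettings settings =
                    EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
            StreamTableEnvironment tEnv = StreamTableEnvironment.create(env, settings);

            // Rows that fail JSON parsing would be skipped instead of failing the job.
            tEnv.sqlUpdate(
                "CREATE TABLE events (" +
                "  user_id STRING," +
                "  payload STRING" +
                ") WITH (" +
                "  'connector.type' = 'kafka'," +
                "  'connector.version' = 'universal'," +
                "  'connector.topic' = 'events'," +
                "  'connector.properties.bootstrap.servers' = 'localhost:9092'," +
                "  'format.type' = 'json'," +
                "  'format.ignore-parse-errors' = 'true'" +   // assumed key from FLINK-15396
                ")");
        }
    }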

Re: Flink 1.9 Sql Rowtime Error

2019-11-01 Thread OpenInx
Hi Polarisary. I checked the Flink codebase and your stack traces; it seems you need to format the timestamp as "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'". The code is here: https://github.com/apache/flink/blob/38e4e2b8f9bc63a793a2bddef5a578e3f80b7376/flink-formats/flink-json/src/main/java/org/apache/flink/forma
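
For illustration, a small sketch of producing timestamps in that pattern with java.time; the formatter usage below is an assumption about the producer side, not code taken from the thread.

    // Minimal sketch: emit event timestamps in the pattern Flink's JSON format expects.
    import java.time.Instant;
    import java.time.ZoneOffset;
    import java.time.format.DateTimeFormatter;

    public class RowtimeFormatSketch {
        private static final DateTimeFormatter RFC3339_MILLIS =
                DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").withZone(ZoneOffset.UTC);

        public static void main(String[] args) {
            // Prints something like 2019-11-01T09:30:15.123Z, which the JSON format
            // can map to a SQL TIMESTAMP / rowtime attribute.
            System.out.println(RFC3339_MILLIS.format(Instant.now()));
        }
    }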

Re: Flink 1.9 measuring time taken by each operator in DataStream API

2019-10-25 Thread Fabian Hueske
Hi Komal, Measuring latency is always a challenge. The problem here is that your functions are chained, meaning that the result of a function is directly passed on to the next function, and the first function is only called with a new record once the last function has emitted its result. This makes meas
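
As an illustration of two common workarounds (not taken from Fabian's reply): breaking the chain so each operator runs as its own task, and enabling Flink's built-in latency markers. Host, port, and interval below are placeholders.

    // Sketch only: both techniques add overhead, so use them for measurement runs.
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class LatencyMeasurementSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Emit latency markers every 5 seconds; they surface as per-operator latency metrics.
            env.getConfig().setLatencyTrackingInterval(5000L);

            DataStream<String> lines = env.socketTextStream("localhost", 9999);

            lines
                .map(String::toUpperCase)
                .disableChaining()          // give this map its own task so its time is visible
                .filter(s -> !s.isEmpty())
                .print();

            env.execute("latency-measurement-sketch");
        }
    }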

Re: flink 1.9

2019-10-18 Thread Gezim Sejdiu
Hi Chesnay, I see. Many thanks for your prompt reply. We will make use of the flink-shaded-hadoop-uber jar when deploying Flink with Docker, starting from Flink v1.8.0. Best regards, On Fri, Oct 18, 2019 at 1:30 PM Chesnay Schepler wrote: > We will not release Flink versions bundling Hadoop. > > The v

Re: flink 1.9

2019-10-18 Thread Chesnay Schepler
We will not release Flink versions bundling Hadoop. The versioning for flink-shaded-hadoop-uber is entirely decoupled from the Flink version. You can just use the flink-shaded-hadoop-uber jar linked on the downloads page with any Flink version. On 18/10/2019 13:25, GezimSejdiu wrote: Hi Flink com

Re: flink 1.9

2019-10-18 Thread GezimSejdiu
Hi Flink community, I'm aware of the split of Flink's binary distribution starting from the Flink 1.8.0 release, i.e. there are no Hadoop-shaded binaries available on the Apache dist archive: https://archive.apache.org/dist/flink/flink-1.8.0/. Are there any plans to move the pre-built Hadoop binaries t

Re: flink 1.9

2019-10-09 Thread Vishal Santoshi
Thanks a lot. On Wed, Oct 9, 2019, 8:55 AM Chesnay Schepler wrote: > Java 11 support will be part of Flink 1.10 (FLINK-10725). You can take the > current master and compile & run it on Java 11. > > We have not investigated later Java versions yet. > On 09/10/2019 14:14, Vishal Santoshi wrote: > >

Re: flink 1.9

2019-10-09 Thread Chesnay Schepler
Java 11 support will be part of Flink 1.10 (FLINK-10725). You can take the current master and compile & run it on Java 11. We have not investigated later Java versions yet. On 09/10/2019 14:14, Vishal Santoshi wrote: Thank you. A related question: has Flink been tested with JDK 11 or above? O

Re: flink 1.9

2019-10-09 Thread Vishal Santoshi
Thank you. A related question: has Flink been tested with JDK 11 or above? On Tue, Oct 8, 2019, 5:18 PM Steven Nelson wrote: > > https://flink.apache.org/downloads.html#apache-flink-190 > > > Sent from my iPhone > > On Oct 8, 2019, at 3:47 PM, Vishal Santoshi > wrote: > > where do I get the c

Re: flink 1.9

2019-10-08 Thread Steven Nelson
https://flink.apache.org/downloads.html#apache-flink-190 Sent from my iPhone > On Oct 8, 2019, at 3:47 PM, Vishal Santoshi wrote: > > where do I get the corresponding jar for 1.9 ? > > flink-shaded-hadoop2-uber-2.7.5-1.8.0.jar > > Thanks..

Re: Flink 1.9, MapR secure cluster, high availability

2019-09-19 Thread Stephan Ewen
Hi! Not sure what is happening here. - I cannot understand why MapR FS should use Flink's relocated ZK dependency - It might be that it doesn't and that all the logging we see probably comes from Flink's HA services. Maybe the MapR stuff uses a different logging framework and the logs do not

Re: Flink 1.9, MapR secure cluster, high availability

2019-09-16 Thread Maxim Parkachov
Hi Stephan, sorry for the late answer, I didn't have access to the cluster. Here are the log and stack trace. Hope this helps, Maxim. - 2019-09-16 18:00:31,804 INFO org.apache.fli

Re: [flink-1.9] how to read local json file through Flink SQL

2019-09-08 Thread Anyang Hu
Hi Wesley, This is not the way I want; I want to read local JSON data in Flink SQL by defining a DDL. Best regards, Anyang. Wesley Peng wrote on Sun, Sep 8, 2019 at 6:14 PM: > On 2019/9/8 5:40 PM, Anyang Hu wrote: > > In Flink 1.9, is there a way to read a local json file in Flink SQL like > > the reading of csv f
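
For what it's worth, here is a sketch of what such a DDL looks like in later Flink releases where the filesystem connector supports the json format (Flink 1.9's filesystem connector does not ship one); the table schema and path are placeholders.

    // Hypothetical sketch: assumes a Flink release where the filesystem connector
    // accepts 'format' = 'json'. Not possible with Flink 1.9 out of the box.
    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class LocalJsonDdlSketch {
        public static void main(String[] args) {
            TableEnvironment tEnv = TableEnvironment.create(
                    EnvironmentSettings.newInstance().inBatchMode().build());

            tEnv.executeSql(
                "CREATE TABLE local_json (" +
                "  id BIGINT," +
                "  name STRING" +
                ") WITH (" +
                "  'connector' = 'filesystem'," +
                "  'path' = 'file:///tmp/input.json'," +
                "  'format' = 'json'" +
                ")");

            tEnv.executeSql("SELECT * FROM local_json").print();
        }
    }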

Re: [flink-1.9] how to read local json file through Flink SQL

2019-09-08 Thread Wesley Peng
On 2019/9/8 5:40 PM, Anyang Hu wrote: In Flink 1.9, is there a way to read a local json file in Flink SQL like the reading of a csv file? Hi, might this thread help you? http://mail-archives.apache.org/mod_mbox/flink-dev/201604.mbox/%3cCAK+0a_o5=c1_p3sylrhtznqbhplexpb7jg_oq-sptre2neo...@mail.gmail.

Re: Flink 1.9, MapR secure cluster, high availability

2019-08-30 Thread Stephan Ewen
Could you share the stack trace where the failure occurs, so we can see why the Flink ZK is used during MapR FS access? /CC Till and Tison - just FYI On Fri, Aug 30, 2019 at 9:40 AM Maxim Parkachov wrote: > Hi Stephan, > > With previous versions, I tried around 1.7, I always had to compile MapR

Re: Flink 1.9, MapR secure cluster, high availability

2019-08-30 Thread Maxim Parkachov
Hi Stephan, With previous versions (I tried around 1.7), I always had to compile MapR Hadoop to get it working. With 1.9 I took the Hadoop-less Flink, which worked with MapR FS until I switched on HA. So it is hard to say if this is a regression or not. The error happens when Flink tries to initialize B

Re: Flink 1.9, MapR secure cluster, high availability

2019-08-29 Thread Stephan Ewen
Hi Maxim! The change of the MapR dependency should not have an impact on that. Do you know if the same thing worked in prior Flink versions? Is that a regression in 1.9? The exception that you report: is that from Flink's HA services trying to connect to ZK, or from the MapR FS client trying to c

Re: Flink 1.9 build failed

2019-08-26 Thread Eliza
Hi, on 2019/8/27 11:35, Simon Su wrote: Could not resolve dependencies for project org.apache.flink:flink-s3-fs-hadoop:jar:1.9-SNAPSHOT: Could not find artifact org.apache.flink:flink-fs-hadoop-shaded:jar:tests:1.9-SNAPSHOT in maven-ali (http://maven.aliyun.com/nexus/content/groups/public/) A