while job
spec is parsed when starting the job application.
On Tue, Nov 8, 2022 at 3:27 AM liuxiangcao wrote:
> Hi Gyula,
>
> Thanks for getting back. Could you share how to submit a job to
> the flink-kubernetes-operator in JSON format?
>
> We use the Java Fabric8 Kubernetes client, whi
our yaml?
>>>
>>> It’s possible that this is not an operator problem. Did you try submitting
>>> the deployment in JSON format instead?
>>>
>>> If it still doesn't work please open a JIRA ticket with the details to
>>> reproduce and what you have tr
Hi,
We have a job that contains `#` as part of mainArgs, and it used to work on
Ververica. Now that we are switching to our own control plane to deploy to the
flink-operator, the job started to fail because the main-args string is
getting truncated at the `#` character when passed to the Flink application. I
believ
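If the deployment spec goes through a YAML parser at some point, an unquoted `#` starts a comment, which would explain the truncation; quoting the argument is the usual workaround. A minimal sketch, assuming the flink-kubernetes-operator's FlinkDeployment resource (the jar path, keys, and values here are illustrative, not taken from the original report):

```yaml
# Hypothetical FlinkDeployment excerpt. Quoting the argument keeps `#`
# inside the string instead of letting YAML treat it as a comment start.
spec:
  job:
    jarURI: local:///opt/flink/usrlib/my-job.jar   # illustrative path
    args:
      - "--config"
      - "key=value#not-a-comment"   # quoted, so `#` survives YAML parsing
```

Whether this applies depends on where in the submission path the spec is parsed as YAML; submitting the same spec as JSON (as suggested above) is one way to rule the YAML layer in or out.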
Hi Flink community,
According to the Flink docs, avro-confluent ([1]) is only supported by the
Kafka SQL connector and the Upsert Kafka SQL connector.
I'm wondering whether there is any reason this format is not supported by the
Filesystem SQL connector ([2])?
We are looking to use FileSystem sink to write to s3 i
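For reference, the supported pairing looks roughly like the sketch below. The table schema is made up, and the exact option keys (in particular the schema-registry URL key) have varied across Flink releases, so treat them as assumptions and check the docs for your version:

```sql
-- Hypothetical table using the avro-confluent format with the Kafka
-- connector, the combination the docs describe as supported.
CREATE TABLE events (
  id STRING,
  payload STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'events',
  'properties.bootstrap.servers' = 'broker:9092',
  'format' = 'avro-confluent',
  'avro-confluent.url' = 'http://schema-registry:8081'
);
```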
/FLINK-22804
> Maybe you can open a JIRA ticket again to discuss the behavior you expect.
>
> On 2022-04-29 13:30:34, "liuxiangcao" wrote:
>
> Hi Shengkai,
>
> Thank you for the reply.
>
> The UDF getEventTimeInNS uses timestamps of both streamA and streamB to
> calcu
IEW. But Flink doesn't support it now.
>
> Best,
> Shengkai
>
>
>
>
> liuxiangcao wrote on Sat, Apr 16, 2022 at 03:07:
>
>> Hi Flink community,
>>
>> *Here is the context: *
>> Theoretically, I would like to write the following query, but it won't work
Hi Flink community,
*Here is the context: *
Theoretically, I would like to write the following query, but it won't work
since we can only define the WATERMARK in a table DDL:
INSERT INTO tableC
SELECT tableA.field1,
SUM(1) AS `count`,
time_ltz AS getEventTimeInNS(tableA.timestamp, tab
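For context, the only place Flink SQL currently accepts a WATERMARK declaration is a CREATE TABLE DDL, roughly like the sketch below (the column names and watermark delay are illustrative, and the built-in TO_TIMESTAMP_LTZ stands in for the custom getEventTimeInNS UDF, which is not possible here precisely because the DDL sees only one table):

```sql
-- Supported form: watermark declared on a computed column in the DDL.
CREATE TABLE tableA (
  field1 STRING,
  `timestamp` BIGINT,
  time_ltz AS TO_TIMESTAMP_LTZ(`timestamp`, 3),
  WATERMARK FOR time_ltz AS time_ltz - INTERVAL '5' SECOND
) WITH (
  ...
);
```

The limitation in the question is exactly that this declaration cannot reference a second table (tableB) or appear in a SELECT or VIEW.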
events will always be dropped.
>>
>> You can see this link as reference:
>> https://stackoverflow.com/questions/60218235/using-event-time-with-lateness-in-flink-sql-windows
>>
>>
>>
>> > On 31 Mar 2022, at 5:38 AM, liuxiangcao
>> wro
Hi Flink community,
In the Flink DataStream Java API, users can get the data that was discarded as
late using WindowedStream.sideOutputLateData(OutputTag) (see [1]). I'm
wondering what the best way is for users to achieve this in Flink SQL?
For background, we are providing a pure SQL deployment to our