Yes, as mentioned in the first email, what I want is something like Spark's
transform function.
But I found that lambda functions are not supported in Calcite
(https://issues.apache.org/jira/browse/CALCITE-3679), so it may be hard to do
the same thing in Flink.
I'll write a customized UDF for my requirement.
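Since CALCITE-3679 means a lambda cannot be passed through SQL, the workaround described here is to bake the per-element logic into the UDF body itself. A minimal stand-alone sketch of that idea in plain Java (the class name and the +1 logic are invented for illustration; no Flink dependency is involved):

```java
import java.util.Arrays;

public class HardcodedArrayTransform {
    // The per-element logic is fixed at compile time instead of being
    // supplied as a SQL lambda (which Calcite cannot represent).
    static long[] eval(long[] a) {
        return Arrays.stream(a).map(e -> e + 1).toArray();
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(eval(new long[]{1, 2, 3})));
        // prints [2, 3, 4]
    }
}
```

The obvious drawback, raised later in the thread, is that each new transformation needs its own UDF.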
>> Original Email
>>
>> Sender: "Shammon FY" <zjur...@gmail.com>
>> To: "yuxia" <luoyu...@alumni.sjtu.edu.cn>
>> Cc: "Xuekui" <baixue...@foxmail.com>; "fskmine" <fskm...@gmail.com>; "Caizhi Weng" <tsreape...@gmail.com>; "User" <user@flink.apache.org>
Hi Yuxia and Shammon,
Thanks for your reply.
The requirement is dynamic in my case. If I move the logic into a UDF, it's not
flexible.
For example, there's one users column in my table whose type is Row
https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/functions/udfs/#type-inference
apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:660)
> Original Email
>
> Sender: "yuxia" <luoyu...@alumni.sjtu.edu.cn>
>
> Sent Time: 2023/2/20 10:00
>
> To: "Xuekui" <baixue...@foxmail.com>

To: "yuxia"
Cc: "fskmine", "Caizhi Weng", "User"
Sent: Tuesday, February 21, 2023, 11:25:48 AM
Subject: Re:Re: Flink SQL support array transform function
Hi YuXia,
Thanks for your advice.
By adding the hint, the type validation can pass.
But I still can't pass the f
Sender: "yuxia" <luoyu...@alumni.sjtu.edu.cn>
Sent Time: 2023/2/20 10:00
To: "Xuekui" <baixue...@foxmail.com>
Cc: "fskmine" <fskm...@gmail.com>; "Caizhi Weng" <tsreape...@gmail.com>; "User" <user@flink.apache.org>
https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/functions/udfs/#type-inference
Best regards,
Yuxia
From: "Xuekui"
To: "fskmine", "Caizhi Weng"
Cc: "User"
Sent: Thursday, February 16, 2023, 10:54:05 AM
Subject: Re: Flink SQL support array transform function
Hi Caizhi,
I've tried to write a UDF to support this function, but I found I can't pass the
function parameter to the UDF because the data type of a function is not supported.
An exception is thrown in SQL validation.
My UDF code:

import org.apache.flink.table.functions.ScalarFunction

class ArrayTransformFunction extends ScalarFunction {
  // The second parameter is the lambda that SQL validation rejects:
  // Flink SQL has no data type for a function-valued argument.
  def eval(a: Array[Long], function: Long => Long): Array[Long] = {
    a.map(e => function(e))
  }
}
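Outside of SQL, the signature this UDF wants is an ordinary higher-order function and is trivial to call. A plain-Java sketch (names are mine, no Flink involved) of the call shape that the Flink SQL layer cannot produce, since there is no SQL literal for the function-typed second argument:

```java
import java.util.Arrays;
import java.util.function.LongUnaryOperator;

public class LambdaTransform {
    // The second argument is exactly the "function parameter" the thread
    // is about: easy to pass in Java, not expressible in Flink SQL.
    static long[] eval(long[] a, LongUnaryOperator f) {
        return Arrays.stream(a).map(f).toArray();
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(eval(new long[]{1, 2, 3}, x -> x * 10)));
        // prints [10, 20, 30]
    }
}
```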
Hi Caizhi, do you think we should support this? Maybe we can open a JIRA for
this, or align with Spark to support more useful built-in functions.
Caizhi Weng wrote on Tue, Aug 3, 2021 at 3:42 PM:
Hi!
Currently there is no such built-in function in Flink SQL. You can try to
write your own user-defined function[1] to achieve this.
[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/functions/udfs/
Xuekui wrote on Tue, Aug 3, 2021 at 3:22 PM:
Hi,
I'm using Flink SQL and need to do some transformation on one array column,
just like the Spark SQL transform function.
https://spark.apache.org/docs/latest/api/sql/index.html#transform
I found it's not supported by Flink SQL. Is there any plan for it?
Thank you
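For context, Spark's transform(arr, x -> f(x)) simply maps the lambda over each element and returns the new array, e.g. transform(array('a', 'b'), x -> upper(x)) yields ['A', 'B']. That semantics restated as a small Java sketch (illustrative only, not Spark or Flink code):

```java
import java.util.Arrays;
import java.util.function.UnaryOperator;

public class SparkTransformSemantics {
    // Mirrors Spark SQL's transform(array, lambda): apply the lambda to
    // each element and collect the results into a new array.
    static String[] transform(String[] arr, UnaryOperator<String> f) {
        return Arrays.stream(arr).map(f).toArray(String[]::new);
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(transform(new String[]{"a", "b"}, String::toUpperCase)));
        // prints [A, B]
    }
}
```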