Thank you for the tips. Unfortunately, the guides don't cover my case, since
I'm not using SQL DDL and I don't convert from DataStream to Table; I
register the DataStream as a table in the SQL API.

I have a custom DataStream source that produces a DataStream<Row>. The time
characteristic is normalized time. I call
inputStream.assignTimestampsAndWatermarks(watermarkStrategy) and then
tableEnv.registerDataStream(). Nowhere in this process do I see where I could
declare proctime as the time attribute.
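
My best guess at the 1.13 registration is the sketch below (the class name,
field names, and the placeholder source are made up; only the last two calls
are the part I'm asking about). Is appending a $("proctime").proctime()
expression the intended way to declare the attribute?

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

import static org.apache.flink.table.api.Expressions.$;

public class ProctimeRegistrationSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Stand-in for my custom source; the real job produces DataStream<Row>.
        DataStream<Row> inputStream = env
            .fromElements(Row.of("a", 1L), Row.of("b", 2L))
            .returns(Types.ROW(Types.STRING, Types.LONG));

        // Convert the stream and append a processing-time attribute column.
        Table events = tableEnv.fromDataStream(
            inputStream, $("f0"), $("f1"), $("proctime").proctime());
        tableEnv.createTemporaryView("events", events);

        // Should print f0, f1, plus proctime as a *PROCTIME* attribute.
        events.printSchema();
    }
}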

Also, I'm currently using TimeIndicatorTypeInfo.ROWTIME_INDICATOR as the
type, so I'm unsure how to cast it to TIMESTAMP(3).
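
If the internal ROWTIME_INDICATOR type is no longer the way to go, my best
guess for event time is the Schema-based registration below (again a sketch
with made-up names and a placeholder source); I'm not sure whether the
TIMESTAMP_LTZ(3) metadata column plus SOURCE_WATERMARK() is the intended
replacement, or whether a cast to TIMESTAMP(3) is still needed somewhere.

import java.time.Duration;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class RowtimeRegistrationSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Stand-in for my custom DataStream<Row> source; field f1 carries an
        // epoch-millis timestamp.
        DataStream<Row> inputStream = env
            .fromElements(Row.of("a", 1_000L), Row.of("b", 2_000L))
            .returns(Types.ROW(Types.STRING, Types.LONG))
            .assignTimestampsAndWatermarks(
                WatermarkStrategy.<Row>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                    .withTimestampAssigner((row, ts) -> (Long) row.getField(1)));

        // Expose the stream-record timestamp as a rowtime column and reuse the
        // DataStream watermarks, instead of typing a field as ROWTIME_INDICATOR.
        Table events = tableEnv.fromDataStream(
            inputStream,
            Schema.newBuilder()
                .columnByMetadata("rowtime", "TIMESTAMP_LTZ(3)")
                .watermark("rowtime", "SOURCE_WATERMARK()")
                .build());
        tableEnv.createTemporaryView("events", events);

        // Should print f0, f1, plus rowtime as a *ROWTIME* attribute.
        events.printSchema();
    }
}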



On Mon, Aug 2, 2021 at 7:18 PM JING ZHANG <beyond1...@gmail.com> wrote:

> Hi Pranav,
> Yes, the root cause is that `timecol` is not a time attribute column.
> If you use processing time as the time attribute, please refer to [1] for
> more information.
> If you use event time as the time attribute, please refer to [2] for more
> information. `assignTimestampsAndWatermarks` is only needed if you choose
> event time.
> If the problem still exists after you update the program based on the demo
> in the document, please let me know and provide your program.
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/concepts/time_attributes/#processing-time
> [2]
> https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/concepts/time_attributes/#event-time
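>
> A minimal sketch of the two DDL patterns in [1] and [2] (the table names,
> columns, and the datagen connector are only placeholders):
>
> TableEnvironment tableEnv =
>     TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());
>
> // Processing-time attribute: declared as a computed column, see [1].
> tableEnv.executeSql(
>     "CREATE TABLE proc_events ("
>         + "  user_name STRING,"
>         + "  proctime AS PROCTIME()"
>         + ") WITH ('connector' = 'datagen')");
>
> // Event-time attribute: a WATERMARK declaration marks the column, see [2].
> tableEnv.executeSql(
>     "CREATE TABLE row_events ("
>         + "  user_name STRING,"
>         + "  ts AS localtimestamp,"
>         + "  WATERMARK FOR ts AS ts - INTERVAL '5' SECOND"
>         + ") WITH ('connector' = 'datagen')");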
>
> Best,
> JING ZHANG
>
>
> Pranav Patil <pranav.pa...@salesforce.com> wrote on Tue, Aug 3, 2021 at 8:51 AM:
>
>> Hi,
>>
>> I'm upgrading a repository from Flink 1.11 to Flink 1.13. I have a Flink
>> SQL command that used to compute tumbling windows using the following in the
>> GROUP BY clause:
>>
>> SELECT ... FROM ... GROUP BY TUMBLE(proctime, INTERVAL '1' MINUTE)
>>
>> However now, it gives me the error:
>>
>> org.apache.flink.table.api.TableException: Window aggregate can only be 
>> defined over a time attribute column, but TIMESTAMP(9) encountered.
>>
>> I'm not sure why this is no longer a time attribute column. Thinking it 
>> might be a syntax change, I tried:
>>
>> TABLE(TUMBLE(TABLE inputStream, DESCRIPTOR(proctime), INTERVAL '1' MINUTE))
>>
>> This doesn't work either; it fails with the error:
>>
>> org.apache.flink.table.api.ValidationException: The window function 
>> TUMBLE(TABLE table_name, DESCRIPTOR(timecol), datetime interval) requires 
>> the timecol is a time attribute type, but is TIMESTAMP(9).
>>
>> So the real problem is that the column isn't a time attribute. However, I'm 
>> confused, because this wasn't a problem in previous versions. In the 
>> DataStream API file source, assignTimestampsAndWatermarks is called on the 
>> source stream, which I believed was enough. I then call registerDataStream 
>> to access the stream from Flink SQL. Are there additional steps required in 
>> Flink 1.13?
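>>
>> For reference, the relevant wiring is roughly the sketch below (the names, 
>> field index, and bounded-out-of-orderness delay are placeholders):
>>
>> // Timestamps and watermarks are assigned on the file source stream ...
>> WatermarkStrategy<Row> watermarkStrategy =
>>     WatermarkStrategy.<Row>forBoundedOutOfOrderness(Duration.ofSeconds(5))
>>         .withTimestampAssigner((row, ts) -> (Long) row.getField(1));
>> DataStream<Row> withTimestamps =
>>     fileSourceStream.assignTimestampsAndWatermarks(watermarkStrategy);
>>
>> // ... and the stream is registered for SQL as-is; no time attribute is
>> // declared explicitly anywhere.
>> tableEnv.registerDataStream("events", withTimestamps);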
>>
>> Thank you.
>>
>>
