Just adding user@flink.apache.org

On Fri, 21 Mar 2025 at 19:08, Kirill Lyashko <kirill.v.lyas...@gmail.com>
wrote:

> Ok, I'll ask them as well, thanks.
>
> On Fri, 21 Mar 2025 at 18:16, Alexey Novakov <ale...@ververica.com> wrote:
>
>> This is really weird.
>>
>> I would also recommend asking on the Flink dev list. The documentation
>> might be outdated and show something that was only supported in older
>> versions.
>>
>> On Fri, Mar 21, 2025 at 4:35 PM Kirill Lyashko <
>> kirill.v.lyas...@gmail.com> wrote:
>>
>>> This documentation shows such an example:
>>>
>>> https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/dev/table/sql/delete/#delete-statements
>>>
>>> No, a Statement Set does not solve the issue either; I get this error:
>>>
>>> Caused by: org.apache.flink.table.api.TableException: Unsupported SQL 
>>> query! Only accept a single SQL statement of type DELETE.
>>> at 
>>> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:856)
>>>  ~[flink-table-api-java-uber-1.19.1.jar:1.19.1]
>>> at 
>>> org.apache.flink.table.api.internal.StatementSetImpl.execute(StatementSetImpl.java:109)
>>>  ~[flink-table-api-java-uber-1.19.1.jar:1.19.1]
>>>
>>> That is similar to what I get with multiple queries joined with ';' in
>>> one string.
>>>
>>> Ok, it looks like it's not supported. Thanks for the help.
>>>
>>> On Fri, 21 Mar 2025 at 12:33, Alexey Novakov <ale...@ververica.com>
>>> wrote:
>>>
>>>> I think the documentation is talking more about the SQL Client / SQL
>>>> Gateway, where DML statements can be run one by one.
>>>> I have not found a clear example in the Table API documentation of
>>>> executing more than one query within the same Flink application.
>>>>
>>>> Not sure about DELETEs, that would need testing.
>>>>
>>>> Best regards,
>>>> Alexey
>>>>
>>>>
>>>> On Fri, Mar 21, 2025 at 11:35 AM Kirill Lyashko <
>>>> kirill.v.lyas...@gmail.com> wrote:
>>>>
>>>>> Ok, but in that case the documentation is saying the opposite, no?
>>>>>
>>>>> Thanks for the recommendation, I will take a look, but at first glance
>>>>> it looks like it supports only INSERTs, or not?
>>>>>
>>>>> On Fri, 21 Mar 2025 at 11:23, Alexey Novakov <ale...@ververica.com>
>>>>> wrote:
>>>>>
>>>>>> Hi Kirill,
>>>>>>
>>>>>> As far as I know, multiple statements cannot be run by calling an
>>>>>> execute() method on each of them, so you are not missing anything.
>>>>>> Take a look at the Statement Set, which allows you to submit multiple
>>>>>> SQL statements within one Flink app.
>>>>>>
>>>>>> For example:
>>>>>> https://github.com/novakov-alexey/flink-ml-sandbox/blob/main/src/main/scala/com/example/customerChurn.scala#L223
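>>>>>>
>>>>>> A minimal sketch of the idea (assuming tableEnv is your
>>>>>> TableEnvironment; the source/sink table names are placeholders):
>>>>>>
>>>>>> // All statements added to the set are optimized together and
>>>>>> // submitted as a single Flink job when execute() is called.
>>>>>> val stmtSet = tableEnv.createStatementSet()
>>>>>> stmtSet.addInsertSql("INSERT INTO sink_a SELECT id, name FROM source_a")
>>>>>> stmtSet.addInsertSql("INSERT INTO sink_b SELECT id, name FROM source_b")
>>>>>> stmtSet.execute().await()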
>>>>>>
>>>>>> Best regards,
>>>>>> Alexey
>>>>>>
>>>>>> On Thu, Mar 20, 2025 at 4:57 PM Kirill Lyashko <
>>>>>> kirill.v.lyas...@gmail.com> wrote:
>>>>>>
>>>>>>> I'm trying to submit multiple DELETE statements to the
>>>>>>> TableEnvironment like this:
>>>>>>>
>>>>>>> val settings = EnvironmentSettings.newInstance.inBatchMode.build()
>>>>>>> val env = TableEnvironment.create(settings)
>>>>>>> createSchema(env)
>>>>>>> val queries = getDeleteQueries
>>>>>>> queries.foreach(q => env.executeSql(q).await())
>>>>>>>
>>>>>>> which, according to the delete-statements documentation
>>>>>>> <https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/dev/table/sql/delete/#delete-statements>,
>>>>>>> should just work. But in reality I get the following exception after
>>>>>>> the first query finishes:
>>>>>>>
>>>>>>> Caused by: org.apache.flink.util.FlinkRuntimeException: Cannot have 
>>>>>>> more than one execute() or executeAsync() call in a single environment.
>>>>>>>     at 
>>>>>>> org.apache.flink.client.program.StreamContextEnvironment.validateAllowedExecution(StreamContextEnvironment.java:199)
>>>>>>>  ~[flink-dist-1.19.1.jar:1.19.1]
>>>>>>>     at 
>>>>>>> org.apache.flink.client.program.StreamContextEnvironment.executeAsync(StreamContextEnvironment.java:187)
>>>>>>>  ~[flink-dist-1.19.1.jar:1.19.1]
>>>>>>>     at 
>>>>>>> org.apache.flink.table.planner.delegation.DefaultExecutor.executeAsync(DefaultExecutor.java:110)
>>>>>>>  ~[?:?]
>>>>>>>     at 
>>>>>>> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1032)
>>>>>>>  ~[flink-table-api-java-uber-1.19.1.jar:1.19.1]
>>>>>>>     at 
>>>>>>> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:876)
>>>>>>>  ~[flink-table-api-java-uber-1.19.1.jar:1.19.1]
>>>>>>>     at 
>>>>>>> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1112)
>>>>>>>  ~[flink-table-api-java-uber-1.19.1.jar:1.19.1]
>>>>>>>     at 
>>>>>>> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:735)
>>>>>>>  ~[flink-table-api-java-uber-1.19.1.jar:1.19.1]
>>>>>>>
>>>>>>> When I check the code, I see that there is indeed a validation that
>>>>>>> no more than one job can be submitted:
>>>>>>>
>>>>>>> private void validateAllowedExecution() {
>>>>>>>     if (enforceSingleJobExecution && jobCounter > 0) {
>>>>>>>         throw new FlinkRuntimeException(
>>>>>>>                 "Cannot have more than one execute() or executeAsync() call in a single environment.");
>>>>>>>     }
>>>>>>>     jobCounter++;
>>>>>>> }
>>>>>>>
>>>>>>> When I tried to submit multiple statements joined with ';' in one
>>>>>>> string (a sketch of that attempt follows the stack trace below), I got
>>>>>>> the following exception:
>>>>>>>
>>>>>>> Caused by: java.lang.IllegalArgumentException: only single statement 
>>>>>>> supported
>>>>>>>     at 
>>>>>>> org.apache.flink.util.Preconditions.checkArgument(Preconditions.java:138)
>>>>>>>  ~[flink-dist-1.19.1.jar:1.19.1]
>>>>>>>     at 
>>>>>>> org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:104)
>>>>>>>  ~[?:?]
>>>>>>>     at 
>>>>>>> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:728)
>>>>>>>  ~[flink-table-api-java-uber-1.19.1.jar:1.19.1]
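>>>>>>>
>>>>>>> For clarity, that attempt looked roughly like this (a sketch, reusing
>>>>>>> the queries value from the snippet above):
>>>>>>>
>>>>>>> // All DELETE statements joined into one string and submitted at once;
>>>>>>> // ParserImpl rejects this with "only single statement supported".
>>>>>>> env.executeSql(queries.mkString(";\n")).await()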
>>>>>>>
>>>>>>> My job is submitted to Kubernetes via the
>>>>>>> flink-kubernetes-operator
>>>>>>> <https://nightlies.apache.org/flink/flink-kubernetes-operator-docs-release-1.10/>
>>>>>>>
>>>>>>> Flink version is 1.19.1
>>>>>>>
>>>>>>> Flink Kubernetes operator version is 1.10.0
>>>>>>>
>>>>>>> My queries look like:
>>>>>>>
>>>>>>> DELETE FROM hdfs_table
>>>>>>> WHERE id IN (
>>>>>>>   SELECT DISTINCT a.id
>>>>>>>   FROM jdbc_table_a a
>>>>>>>   JOIN jdbc_table_b b ON a.b_id = b.id
>>>>>>>   WHERE b.should_be_deleted
>>>>>>> )
>>>>>>>
>>>>>>> Two questions I would appreciate help with:
>>>>>>>
>>>>>>>    1. What am I missing in the configuration that is preventing me
>>>>>>>    from submitting multiple DELETE statements?
>>>>>>>    2. Is it possible to submit multiple DELETE statements in
>>>>>>>    parallel (see the sketch below), so that the jdbc tables can be
>>>>>>>    reused rather than loaded again on every query?
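>>>>>>>
>>>>>>> A rough sketch of what is meant by submitting in parallel (reusing env
>>>>>>> and queries from the snippet above; in this setup it hits the same
>>>>>>> single-execution check shown in the first stack trace):
>>>>>>>
>>>>>>> // Submit every DELETE without blocking on each one, then wait for all
>>>>>>> // of them; executeSql() returns a TableResult once the job is
>>>>>>> // submitted, and await() blocks until that job finishes.
>>>>>>> val results = queries.map(q => env.executeSql(q))
>>>>>>> results.foreach(_.await())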
>>>>>>>
>>>>>>>
