Interesting use case.

Could you please elaborate on this?
On what criteria do you want to batch? Time, count, or size?
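In the meantime, one option worth looking at: Flink 1.12 ships the new FLIP-27 Kafka source (KafkaSource), which, unlike FlinkKafkaConsumer, can declare itself BOUNDED via setBounded(), so it is accepted when 'execution.runtime-mode' is BATCH. A rough sketch below, assuming the 1.12 connector API; broker/topic/group names are placeholders, and the exact builder method names may differ slightly between 1.12.x patch releases, so please cross-check the 1.12 connector docs:

```java
import org.apache.flink.api.common.RuntimeExecutionMode;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class BoundedKafkaBatchJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        // Equivalent to execution.runtime-mode: BATCH
        env.setRuntimeMode(RuntimeExecutionMode.BATCH);

        // A Kafka source that reads from the earliest offset up to the
        // latest offset observed at job start, then finishes -- this is
        // what makes the source BOUNDED rather than UNBOUNDED.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")   // placeholder broker
                .setTopics("input-topic")                // placeholder topic
                .setGroupId("batch-job")                 // placeholder group id
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setBounded(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-bounded")
           .print();

        env.execute("bounded-kafka-batch");
    }
}
```

You can build one such source per topic for your four or five streams and union or join them as needed; each stops at its own end offsets, so the whole job terminates like a batch program.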
On Thu, 14 Jan 2021 at 12:15 PM, sagar <sagarban...@gmail.com> wrote:

> Hi Team,
>
> I am getting the following error while running the DataStream API in
> BATCH mode with a Kafka source.
> I am using FlinkKafkaConsumer to consume the data.
>
> Caused by: java.lang.IllegalStateException: Detected an UNBOUNDED source
> with the 'execution.runtime-mode' set to 'BATCH'. This combination is not
> allowed, please set the 'execution.runtime-mode' to STREAMING or AUTOMATIC
> at org.apache.flink.util.Preconditions.checkState(Preconditions.java:198)
> ~[flink-core-1.12.0.jar:1.12.0]
>
> In my batch program I want to work with four to five different streams in
> batch mode, as the data sources are bounded.
>
> I can't find any clear example of how to do this with a Kafka source in
> Flink 1.12.
>
> I don't want to use a JDBC source, as the underlying database table may
> change. Please give me an example of how to achieve the above use case.
>
> Also, for any large bounded source, are there any alternatives for
> achieving this?
>
>
>
> --
> ---Regards---
>
>   Sagar Bandal
>

