Hi,

Your code works fine on my machine. Which Flink version are you using?
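
For reference, below is your snippet in a self-contained form, exactly as I was able to compile it. The import paths, the Kafka consumer properties, and the env.execute() call are my own assumptions (I am assuming a Flink 1.15.x setup, where the Firehose connector was introduced, with the flink-connector-aws-kinesis-firehose and flink-connector-kafka dependencies); everything else is taken from your message:

// Minimal, self-contained sketch of the snippet from the quoted message.
// Assumptions (not in the original): the import paths, the Kafka consumer
// properties, and the env.execute() call. Placeholders ("xxx") are kept as-is.
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.aws.config.AWSConfigConstants;
import org.apache.flink.connector.firehose.sink.KinesisFirehoseSink;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class FirehoseSinkJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source properties are assumed; the original snippet only passed "properties".
        Properties kafkaProps = new Properties();
        kafkaProps.put("bootstrap.servers", "xxx");
        kafkaProps.put("group.id", "xxx");

        DataStream<String> inputStream = env.addSource(
                new FlinkKafkaConsumer<>("xxx", new SimpleStringSchema(), kafkaProps));

        // Firehose client configuration, as in the quoted snippet.
        Properties sinkProperties = new Properties();
        sinkProperties.put(AWSConfigConstants.AWS_REGION, "xxx");
        sinkProperties.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "xxx");
        sinkProperties.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "xxx");

        KinesisFirehoseSink<String> kdfSink = KinesisFirehoseSink.<String>builder()
                .setFirehoseClientProperties(sinkProperties)
                .setSerializationSchema(new SimpleStringSchema())
                .setDeliveryStreamName("xxx")
                .setMaxBatchSize(350)
                .build();

        // KinesisFirehoseSink is built on the newer sink API, so this overload of
        // sinkTo only resolves on a Flink version that ships it (1.15+).
        inputStream.sinkTo(kdfSink);

        env.execute("kafka-to-firehose");
    }
}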




> On 12 May 2022, at 3:39 AM, Zain Haider Nemati <zain.hai...@retailo.co> wrote:
> 
> Hi Folks,
> Getting this error when sinking data to a Firehose sink; would really 
> appreciate some help!
> 
> DataStream<String> inputStream =
>         env.addSource(new FlinkKafkaConsumer<>("xxx", new SimpleStringSchema(), properties));
> 
> Properties sinkProperties = new Properties();
> sinkProperties.put(AWSConfigConstants.AWS_REGION, "xxx");
> sinkProperties.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "xxx");
> sinkProperties.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "xxx");
> 
> KinesisFirehoseSink<String> kdfSink = KinesisFirehoseSink.<String>builder()
>         .setFirehoseClientProperties(sinkProperties)
>         .setSerializationSchema(new SimpleStringSchema())
>         .setDeliveryStreamName("xxx")
>         .setMaxBatchSize(350)
>         .build();
> 
> inputStream.sinkTo(kdfSink);
> 
> incompatible types:
> org.apache.flink.connector.firehose.sink.KinesisFirehoseSink<java.lang.String>
>     cannot be converted to
> org.apache.flink.api.connector.sink.Sink<java.lang.String,?,?,?>
