Hi Nick,

Can you explain why you need to package flink-core into your
application jar? Usually flink-core is a dependency with provided
scope [1].

Best,
Gary

[1] 
https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope
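For reference, a minimal sketch of what a provided-scope declaration looks like in a pom.xml (the version number here is illustrative, not from this thread):

```xml
<!-- flink-core is supplied by the cluster at runtime, so it is
     declared with provided scope and not bundled into the fat jar. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-core</artifactId>
    <version>1.9.0</version>
    <scope>provided</scope>
</dependency>
```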

On Tue, May 12, 2020 at 5:41 PM Nick Bendtner <buggi...@gmail.com> wrote:
>
> Hi Gary,
> Thanks for the info. I am aware this feature is available from 1.9.0 onwards.
> Our cluster is still very old and has CI/CD challenges, so I was hoping not
> to bloat the application jar by packaging flink-core with it. If it's not
> possible to do this with the older version without writing our own Kafka sink
> implementation similar to the Flink-provided one in 1.9.0, then I think we
> will package flink-core 1.9.0 with the application and follow the approach
> you suggested. Thanks again for getting back to me so quickly.
>
> Best,
> Nick
>
> On Tue, May 12, 2020 at 3:37 AM Gary Yao <g...@apache.org> wrote:
>>
>> Hi Nick,
>>
>> Are you able to upgrade to Flink 1.9? Beginning with Flink 1.9 you can use
>> KafkaSerializationSchema to produce a ProducerRecord [1][2].
>>
>> Best,
>> Gary
>>
>> [1] https://issues.apache.org/jira/browse/FLINK-11693
>> [2] 
>> https://ci.apache.org/projects/flink/flink-docs-release-1.9/api/java/org/apache/flink/streaming/connectors/kafka/KafkaSerializationSchema.html
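To make the suggestion concrete, here is a minimal sketch of a KafkaSerializationSchema (Flink 1.9 API) that builds a ProducerRecord carrying a custom Kafka header. The class name, topic handling, and header key/value are illustrative assumptions, not something from this thread:

```java
import java.nio.charset.StandardCharsets;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

// Sketch: serializes a String element into a ProducerRecord and
// attaches a Kafka header. Header key "source" is hypothetical.
public class HeaderedSerializationSchema implements KafkaSerializationSchema<String> {

    private final String topic;

    public HeaderedSerializationSchema(String topic) {
        this.topic = topic;
    }

    @Override
    public ProducerRecord<byte[], byte[]> serialize(String element, Long timestamp) {
        ProducerRecord<byte[], byte[]> record = new ProducerRecord<>(
                topic,
                null,                 // let Kafka pick the partition
                timestamp,
                null,                 // no key
                element.getBytes(StandardCharsets.UTF_8));
        // Kafka headers are attached directly on the ProducerRecord.
        record.headers().add("source", "flink-job".getBytes(StandardCharsets.UTF_8));
        return record;
    }
}
```

The schema would then be passed to the FlinkKafkaProducer constructor in place of a plain SerializationSchema.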
>>
>> On Mon, May 11, 2020 at 10:59 PM Nick Bendtner <buggi...@gmail.com> wrote:
>> >
>> > Hi guys,
>> > I use version 1.8.0 of flink-connector-kafka. Do you have any
>> > recommendations on how to produce a ProducerRecord from a Kafka sink?
>> > I am looking to add support for Kafka headers, hence my interest in
>> > ProducerRecord. Any thoughts are highly appreciated.
>> >
>> > Best,
>> > Nick.
