Hi Min,

The only requirement is that your state descriptors be configured
identically to those used in your DataStream API job. So if you registered
a custom TypeInformation / serializer in your streaming job, you will need
it here as well. I would also look at the ExecutionConfig on your
DataStream app, as that can dictate how your serializers are configured.
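
For example, a minimal sketch of a keyed-state reader where the state type
is an Avro-generated class (the class name MyAvroRecord, the state name
"my-state", and the key type String are placeholders; your actual names
will differ). The important part is building the descriptor with
AvroTypeInfo so the reader resolves the same AvroSerializer that wrote the
savepoint, instead of falling back to a Kryo-based serializer:

```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.formats.avro.typeutils.AvroTypeInfo;
import org.apache.flink.state.api.functions.KeyedStateReaderFunction;
import org.apache.flink.util.Collector;

public class AvroStateReader
    extends KeyedStateReaderFunction<String, MyAvroRecord> {

  private transient ValueState<MyAvroRecord> state;

  @Override
  public void open(Configuration parameters) {
    // Build the descriptor with AvroTypeInfo, mirroring how the
    // streaming job declared the state. A descriptor created from
    // TypeInformation.of(MyAvroRecord.class) may resolve a different
    // serializer and fail to read the savepoint data.
    ValueStateDescriptor<MyAvroRecord> descriptor =
        new ValueStateDescriptor<>(
            "my-state", new AvroTypeInfo<>(MyAvroRecord.class));
    state = getRuntimeContext().getState(descriptor);
  }

  @Override
  public void readKey(String key, Context ctx, Collector<MyAvroRecord> out)
      throws Exception {
    out.collect(state.value());
  }
}
```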

Seth

On Tue, Jun 1, 2021 at 10:24 AM Till Rohrmann <trohrm...@apache.org> wrote:

> Hi Min,
>
> Usually, you should be able to provide type information and thereby a
> serializer via the StateDescriptors which you create to access the state.
> If this is not working, then you need to give us a bit more context to
> understand what's not working.
>
> I am also pulling in Seth who is the original author of the state
> processor API.
>
> Cheers,
> Till
>
> On Mon, May 31, 2021 at 4:00 PM Tan, Min <min....@ubs.com> wrote:
>
>> Hi,
>>
>>
>>
>> I am using Flink 1.10.1 and try to read the flink states from a savepoint
>> using Flink state processor API.
>>
>> It works well when state types are the normal Java type or Java POJOs.
>>
>>
>>
>> When Avro-generated Java classes are used as the state type, it no
>> longer reads any state.
>>
>>
>>
>> Are any additional custom serializers required in this situation?
>>
>>
>>
>> Regards,
>>
>> Min
>>
>