Hi all,

Looking to use PyFlink to work with some Scala-defined objects being emitted from a custom source. When trying to manipulate the objects in a PyFlink-defined MapFunction <https://ci.apache.org/projects/flink/flink-docs-release-1.12/api/python/pyflink.datastream.html#pyflink.datastream.MapFunction>, I'm hitting an error like:
```
Caused by: java.lang.UnsupportedOperationException: The type information: Option[<...>$Record(id: Long, created_at: Option[Long], updated_at: Option[Long])] is not supported in PyFlink currently.
```

The Scala object is defined something like:

```
object <...> {
  case class Record(
    id: Long,
    created_at: Option[Long],
    updated_at: Option[Long],
    ...
  )
}
```

The PyFlink code is something like:

```
from pyflink.datastream import MapFunction

class Mutate(MapFunction):
    def map(self, value):
        print(value.id)
        value.id = 123
        ...

records = env.add_source(..)
records = records.map(Mutate())
```

Can you provide any advice on how to work with these kinds of objects in PyFlink? Thanks in advance!
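P.S. In case it helps clarify what I'm after: one workaround I can imagine is flattening the case class into a Flink Row on the Scala side, before the stream ever reaches Python, so that PyFlink only sees a Row type it supports. Below is a rough, untested sketch of what I mean; `MyObjects` is just a stand-in for the real (elided) enclosing object, the field list is trimmed to the three shown above, and `records` stands for the DataStream produced by the custom Scala source. I'd be glad to hear whether something like this is the intended approach, or whether the case class can be consumed directly from PyFlink.

```
import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.streaming.api.scala._
import org.apache.flink.types.Row

// Placeholder for the real (elided) object/case class from the question.
object MyObjects {
  case class Record(id: Long, created_at: Option[Long], updated_at: Option[Long])
}

object RowConversion {
  // Flatten Record into a Row so the type information handed to PyFlink is supported;
  // the Option[Long] fields become nullable java.lang.Longs.
  def toRows(records: DataStream[MyObjects.Record]): DataStream[Row] =
    records.map(r =>
      Row.of(
        Long.box(r.id),
        r.created_at.map(Long.box).orNull,
        r.updated_at.map(Long.box).orNull
      )
    )(Types.ROW(Types.LONG, Types.LONG, Types.LONG))
}
```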