Hi,
Row is a very special data type because Flink cannot extract the field
types automatically from the Java generics. By default it is serialized
with Kryo. You need to specify the field types using
Types.ROW(Types.STRING, ...) and pass this information to your
`.returns()` method instead of `Row.class`.
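A minimal sketch of what that looks like (the input strings, field names, and class name are made up for illustration; only the `.returns(Types.ROW(...))` hint is the point):

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.types.Row;

public class RowTypeHint {

    // Pure helper: parse "word,number" into a two-field Row.
    public static Row parseLine(String line) {
        String[] parts = line.split(",");
        Row row = new Row(2);
        row.setField(0, parts[0]);
        row.setField(1, Integer.parseInt(parts[1]));
        return row;
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<Row> rows = env.fromElements("a,1", "b,2")
            .map(RowTypeHint::parseLine)
            // Without this hint Flink falls back to Kryo, because the
            // field types of Row cannot be extracted from generics.
            .returns(Types.ROW(Types.STRING, Types.INT));

        rows.print();
        env.execute("Row type hint example");
    }
}
```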
Lambdas are sometimes a problem for Flink, see
https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/java8.html.
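When a lambda does hit this problem, the usual workaround is the same explicit type hint; a small sketch (the example data and class name are illustrative):

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class LambdaTypeHint {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();

        // The compiler erases the lambda's generic return type, so we
        // declare the result type explicitly instead of relying on
        // automatic extraction.
        DataStream<Tuple2<String, Integer>> lengths = env
            .fromElements("flink", "streaming")
            .map(s -> new Tuple2<>(s, s.length()))
            .returns(Types.TUPLE(Types.STRING, Types.INT));

        lengths.print();
        env.execute("Lambda type hint example");
    }
}
```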
In your example it might make sense to use the Table API for
pre-processing. The Table API has a built-in CSV input format and
handles all the type information for you. You can convert a table back
to a DataStream.
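A rough sketch of that approach (the file path, field names, and class name are assumptions for illustration):

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.sources.CsvTableSource;
import org.apache.flink.types.Row;

public class CsvViaTableApi {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv =
            TableEnvironment.getTableEnvironment(env);

        // The CSV source carries the field types, so no manual
        // type handling is needed downstream.
        CsvTableSource source = CsvTableSource.builder()
            .path("/path/to/input.csv")      // assumption: your CSV file
            .field("word", Types.STRING)
            .field("count", Types.INT)
            .fieldDelimiter(",")
            .build();

        tEnv.registerTableSource("csvInput", source);
        Table result = tEnv.scan("csvInput").select("word, count");

        // Convert the table back to a DataStream of Rows.
        DataStream<Row> stream = tEnv.toAppendStream(result, Row.class);
        stream.print();
        env.execute("CSV via Table API");
    }
}
```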
Regards,
Timo
On 05.08.17 at 13:49, Егор Литвиненко wrote:
> No errors, and no data in the table.
2017-08-05 14:49 GMT+03:00 Егор Литвиненко <e.v.litvinenk...@gmail.com>:
Hi
I am trying to test Flink and have a problem.
Why does this example not work?
https://github.com/egorlitvinenko/testparsing/blob/master/test-flink/src/main/java/org/egorlitvinenko/testflink/StreamingJob9C.java
Logs - https://pastebin.com/iuBZhfeG
No errors, and no data in the table.
One more question: the examples intentionally use anonymous classes
instead of lambdas, because with a lambda it also doesn't work. I did
a small investigation and found out that Flink processes lambdas
differently, and at one point, when Flink processes the types, they
are empty. Do you have full lambda support?
Best regards, Egor Litvinenko.