Hi all,
I tried to use the ZipWithIndex functionality, according to the Scala
examples posted here:
https://ci.apache.org/projects/flink/flink-docs-master/apis/zip_elements_guide.html
However, I am not able to call the mentioned function because it cannot be
resolved. I checked the Flink code for or
Hey Filip,
As you are using the Scala API, it is easier to use the Scala DataSet utils,
which are accessible after the following import:
import org.apache.flink.api.scala.utils._
Then you can do the following:
val indexed = data.zipWithIndex
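For reference, a complete minimal program would look roughly like this (the
toy String DataSet and the object name are just made-up examples, not anything
from your job):

import org.apache.flink.api.scala._
// brings the implicit zipWithIndex / zipWithUniqueId extensions into scope
import org.apache.flink.api.scala.utils._

object ZipWithIndexExample {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // toy input; any DataSet[T] works the same way
    val data: DataSet[String] = env.fromElements("a", "b", "c", "d")

    // pairs every element with a consecutive Long index
    val indexed: DataSet[(Long, String)] = data.zipWithIndex

    indexed.print()
  }
}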
Best,
Marton
Hi,
Can you check the log output in your IDE or the log files of the Flink
client (./bin/flink)? The TypeExtractor logs why a type is not
recognized as a POJO.
The log statements look like this:
20:42:43,465 INFO org.apache.flink.api.java.typeutils.TypeExtractor
- class com.dataarti
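As a rough illustration (class and field names made up, not from your code),
the TypeExtractor typically accepts something shaped like this as a POJO:

class SensorReading(var id: String, var value: Double) {
  // a public no-argument constructor is required for POJO types
  def this() = this(null, 0.0)
}

Classes missing a no-argument constructor, or whose fields are neither public
nor reachable via getters/setters, usually fall back to generic (Kryo)
serialization, and the TypeExtractor log output should tell you which check
failed for your class.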
Hi Marton,
Thank you for your answer. I wasn't able to use zipWithIndex in the way you
stated, as I got a "cannot resolve" error. However, it worked when I used
it like this:
val utils = new DataSetUtils[AlignmentRecord](data)
val index = utils.zipWithIndex
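In case it helps others, a stripped-down sketch of the explicit form with the
imports it needs looks roughly like this (String stands in for AlignmentRecord
so the fragment is self-contained):

import org.apache.flink.api.scala._
import org.apache.flink.api.scala.utils.DataSetUtils  // wrapper class from the utils package object

val env = ExecutionEnvironment.getExecutionEnvironment
val data: DataSet[String] = env.fromElements("read1", "read2", "read3")

// wrapping the DataSet explicitly instead of relying on the implicit conversion
val utils = new DataSetUtils[String](data)
val index: DataSet[(Long, String)] = utils.zipWithIndex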
Regards,
Filip Łęczycki
Hi Vladimir,
Flink is using Hadoop's S3 File System implementation. It seems that this
feature is not supported by their implementation:
https://issues.apache.org/jira/browse/HADOOP-9680
This issue contains some more information:
https://issues.apache.org/jira/browse/HADOOP-9384 It seems that the