> s issue? Could you try to create and write to the
> same file (same directory) in some other way (manually?), using the same
> user and the same machine as the Flink job would?
>
> Maybe there will be some hint in the HDFS logs?
>
> Piotrek
>
> On 12 Oct 2017, at 00:19, Isuru Suriarachchi wrote:
Hi all,
I'm just trying to use an HDFS file as the sink for my Flink stream job. I
use the following line to do so:

stream.writeAsText("hdfs://hadoop-master:9000/user/isuru/foo");

I have not set "fs.hdfs.hadoopconf" in my Flink configuration, as it should
work with the full HDFS file name according to the documentation.
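For reference, a minimal self-contained sketch of the kind of job described
above (the class name, the toy in-memory source, and the job name are made
up for illustration; the NameNode address and path are the ones from the
line above):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class HdfsSinkSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Toy in-memory source standing in for the real stream.
        DataStream<String> stream = env.fromElements("record-1", "record-2", "record-3");

        // Fully qualified HDFS URI, as in the line quoted above.
        stream.writeAsText("hdfs://hadoop-master:9000/user/isuru/foo");

        env.execute("hdfs-sink-sketch");
    }
}

With a fully qualified hdfs:// URI the path itself does not depend on
"fs.hdfs.hadoopconf", but the Hadoop filesystem classes still need to be
available to Flink at runtime.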
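Piotrek's suggestion above (reproducing the write outside of Flink, with the
same user and machine) could be tried with the plain Hadoop FileSystem API.
This is only a sketch; the class name, test path, and test string are made
up, and the NameNode address is the one from the sink URI:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteCheck {
    public static void main(String[] args) throws Exception {
        // Connect to the same NameNode the Flink job points at.
        FileSystem fs = FileSystem.get(new URI("hdfs://hadoop-master:9000"), new Configuration());

        // Hypothetical test file in the same directory as the sink.
        Path path = new Path("/user/isuru/manual-write-check");
        try (FSDataOutputStream out = fs.create(path)) {
            out.writeBytes("manual write check\n");
        }
        System.out.println("exists after write: " + fs.exists(path));
    }
}

If this standalone write fails in the same way, the problem is likely in the
HDFS setup or permissions rather than in the Flink job itself.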
> 's setup or create another topic to try this out.
>
> Hope this will help you.
>
> Best Regards,
> Tony Wei
>
> 2017-08-29 12:26 GMT+08:00 Isuru Suriarachchi:
>
>> Hi all,
>>
>> I'm trying to implement a Flink consumer which consumes a Kafka topic
>
Hi all,
I'm trying to implement a Flink consumer which consumes a Kafka topic with
3 partitions. I've set the parallelism of the execution environment to 3, as
I want to make sure that each Kafka partition is consumed by a separate
parallel task in Flink. My first question is whether this one-to-one mapping
between partitions and parallel tasks is always guaranteed.