To add to this question, do I need to set env.hadoop.conf.dir to point to
the Hadoop config directory, for instance env.hadoop.conf.dir=/etc/hadoop/,
for the JVM? Or is it possible to write to HDFS without any external Hadoop
config files like core-site.xml and hdfs-site.xml?
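
For reference, this is roughly the flink-conf.yaml entry I have in mind
(the /etc/hadoop/ path is just an example from our setup; exporting
HADOOP_CONF_DIR in the shell environment would be the alternative):

    # flink-conf.yaml: point Flink at the directory holding core-site.xml / hdfs-site.xml
    env.hadoop.conf.dir: /etc/hadoop/

    # or, instead of flink-conf.yaml, set it in the environment that starts the cluster:
    # export HADOOP_CONF_DIR=/etc/hadoop/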

Best,
Nick.



On Fri, Feb 28, 2020 at 12:56 PM Nick Bendtner <buggi...@gmail.com> wrote:

> Hi guys,
> I am trying to write to HDFS from the StreamingFileSink. Where should I
> provide the IP address of the name node? Can I provide it as part of the
> flink-conf.yaml file, or should I provide it like this:
>
> final StreamingFileSink<GenericRecord> sink = StreamingFileSink
>       .forBulkFormat(new Path("hdfs://namenode:8020/flink/test"),
>             ParquetAvroWriters.forGenericRecord(schema))
>       .build();
>
>
> Best,
> Nick
>
>
>
