To: Kostas Kloudas; Dawid Wysakowicz <dwysakow...@apache.org>; Taher Koitawala [via Apache Flink User Mailing List archive.]; user <user@flink.apache.org>
Subject: EXT: Re: StreamingFileSink cannot get AWS S3 credentials

I haven't configured this myself, but I would guess that you need to set the
parameters defined under "S3A Authentication methods" in the Hadoop S3A
documentation.
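
[Editor's illustration, not part of the original reply: the sketch below shows what setting the two basic S3A authentication properties, fs.s3a.access.key and fs.s3a.secret.key, could look like when checked outside of Flink with a plain Hadoop Configuration. The bucket and folder names are taken from the quoted mail, the credential values are placeholders, and the class name is made up; it assumes hadoop-aws and a matching AWS SDK are on the classpath. If this standalone check fails with the same credential error, the problem is likely the S3A configuration itself rather than the StreamingFileSink.]

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class S3ACredentialCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Basic S3A authentication properties; deployments may instead rely on
            // environment variables or instance profiles (see the S3A documentation).
            conf.set("fs.s3a.access.key", "YOUR_ACCESS_KEY"); // placeholder
            conf.set("fs.s3a.secret.key", "YOUR_SECRET_KEY"); // placeholder

            // Try to list the target bucket directly through S3AFileSystem.
            FileSystem fs = FileSystem.get(URI.create("s3a://mybucket/"), conf);
            for (FileStatus status : fs.listStatus(new Path("s3a://mybucket/myfolder/"))) {
                System.out.println(status.getPath());
            }
        }
    }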
On Fri, Jan 11, 2019 at 5:03 PM Taher Koitawala [via Apache Flink User
Mailing List archive.] wrote:

> Hi All,
> We have implemented an S3 sink in the following way:
>
> StreamingFileSink sink = StreamingFileSink.forBulkFormat(
>         new Path("s3a://mybucket/myfolder/output/"),
>         ParquetAvroWriters.forGenericRecord(schema))
>     .withBucketCheckInterval(50L)
>     .withBucketAssigner(new CustomBucketAssigner())
>     .build();
>
> The problem we are facing is that StreamingFileSink initializes the
> S3AFileSystem class to write to S3 and is not able to find the S3
> credentials to write the data. However, other Flink applications on the
> same cluster that use "s3://" paths are able to write data to the same S3
> bucket and folders; we are only facing this issue with StreamingFileSink.
>
> Regards,
> Taher Koitawala
> GS Lab Pune
> +91 8407979163
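
[Editor's sketch, not code from this thread: the sink construction quoted above uses a raw StreamingFileSink and a CustomBucketAssigner whose implementation was never posted. Purely to illustrate the Flink 1.7-era API involved, here is a minimal, fully typed version under stated assumptions: the class name ParquetS3Sink, the "event_date" field used for bucketing, and the body of CustomBucketAssigner are all invented for illustration; the path and builder calls mirror the quoted mail.]

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.flink.core.fs.Path;
    import org.apache.flink.core.io.SimpleVersionedSerializer;
    import org.apache.flink.formats.parquet.avro.ParquetAvroWriters;
    import org.apache.flink.streaming.api.functions.sink.filesystem.BucketAssigner;
    import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;
    import org.apache.flink.streaming.api.functions.sink.filesystem.bucketassigners.SimpleVersionedStringSerializer;

    public class ParquetS3Sink {

        /** Buckets each record by an assumed "event_date" field (illustration only). */
        public static class CustomBucketAssigner implements BucketAssigner<GenericRecord, String> {
            @Override
            public String getBucketId(GenericRecord element, Context context) {
                return String.valueOf(element.get("event_date"));
            }

            @Override
            public SimpleVersionedSerializer<String> getSerializer() {
                return SimpleVersionedStringSerializer.INSTANCE;
            }
        }

        /** Same construction as in the quoted mail, with an explicit element type. */
        public static StreamingFileSink<GenericRecord> build(Schema schema) {
            return StreamingFileSink
                .forBulkFormat(
                    new Path("s3a://mybucket/myfolder/output/"),
                    ParquetAvroWriters.forGenericRecord(schema))
                .withBucketCheckInterval(50L)
                .withBucketAssigner(new CustomBucketAssigner())
                .build();
        }
    }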