Hi Marco,
Ideally you would solve everything with IAM roles, but you can also use a
credentials provider such as EnvironmentVariableCredentialsProvider [1].
The key should be

  s3.aws.credentials.provider: com.amazonaws.auth.EnvironmentVariableCredentialsProvider
Remember to put the respective jar into the plugins folder.
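As a minimal sketch of that setup in flink-conf.yaml (assuming the S3
filesystem plugin is in place; the comments describe the usual AWS SDK
behaviour rather than anything specific to this thread):

  # Let the bundled AWS SDK pick up credentials from environment variables
  s3.aws.credentials.provider: com.amazonaws.auth.EnvironmentVariableCredentialsProvider
  # The provider then reads AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from
  # the environment of the JobManager and TaskManager processes.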
Is it possible to use an environment-variable credentials provider?
On Thu, Jan 28, 2021 at 8:35 AM Arvid Heise wrote:
> Hi Marco,
>
> afaik you don't need HADOOP_HOME or core-site.xml.
>
> I'm also not sure from where you got your config keys. (I guess from the
> Presto page, which probably all work if you remove the hive. prefix; maybe
> we should also support that.)
Hi Marco,
afaik you don't need HADOOP_HOME or core-site.xml.
I'm also not sure from where you got your config keys. (I guess from the
Presto page, which probably all work if you remove the hive. prefix; maybe we
should also support that.)
All keys with prefix s3 or s3p (and fs.s3, fs.s3p) are routed towards the
respective S3 filesystem configuration.
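For example (the endpoint value is only a placeholder, and whether every
Presto key works this way is, as said above, a guess), a key from the Presto
page would go into flink-conf.yaml with the hive. prefix dropped:

  # Presto docs: hive.s3.endpoint          -> flink-conf.yaml:
  s3.endpoint: http://localhost:4566
  # Presto docs: hive.s3.path-style-access -> flink-conf.yaml:
  s3.path-style-access: true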
Hi,
I got s3a working on localstack. The missing piece of information in the
Flink documentation seems to be that the system requires HADOOP_HOME to be
set and a core-site.xml.
The Flink documentation states that s3p (Presto) should be used for file
checkpointing into S3. I am using RocksDB, which I assume also checkpoints to
S3 as files.
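For reference, a minimal flink-conf.yaml sketch for checkpointing RocksDB
state to S3 through the Presto filesystem (the bucket name and the localstack
endpoint are placeholders, not values taken from this thread):

  state.backend: rocksdb
  # s3p:// explicitly selects the Presto filesystem recommended for checkpoints
  state.checkpoints.dir: s3p://my-bucket/checkpoints
  # Typical localstack-style settings; adjust host/port to your environment
  s3.endpoint: http://localhost:4566
  s3.path-style-access: true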