On Fri, 2020-10-09 at 19:10 +, Ross Vandegrift wrote:
> Starting today, running a beam pipeline triggers a large reinstallation of
> python modules. For some reason, it forces full rebuilds from source -
> since beam depends on numpy, this takes a long time.
I opened a support
Hello,
Starting today, running a beam pipeline triggers a large reinstallation of
python modules. For some reason, it forces full rebuilds from source - since
beam depends on numpy, this takes a long time.
There's nothing strange about my python setup. I'm using python3.7 on debian
buster with
f.write('[profile default]\n')
f.write('role_arn = your-role-arn\n')
f.write('web_identity_token_file = /tmp/id_token\n')
You need to substitute appropriate values for 'your-audience' and 'your-role-arn'.
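
(A rough sketch of how those lines might be wired into a setup.py custom command,
following the custom-command pattern from the Beam example setup.py. The
~/.aws/config path, the package name, and the WriteAwsConfig class are
placeholders, not the exact file from this thread, and the step that fetches the
token into /tmp/id_token using 'your-audience' is omitted.)

import os
from distutils.command.build import build as _build

import setuptools


class build(_build):
    # Run the custom command as part of the normal package build on the worker.
    sub_commands = [('WriteAwsConfig', None)] + _build.sub_commands


class WriteAwsConfig(setuptools.Command):
    """Writes a minimal ~/.aws/config pointing at a web identity token file."""
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        config_dir = os.path.expanduser('~/.aws')
        os.makedirs(config_dir, exist_ok=True)
        with open(os.path.join(config_dir, 'config'), 'w') as f:
            f.write('[profile default]\n')
            f.write('role_arn = your-role-arn\n')
            f.write('web_identity_token_file = /tmp/id_token\n')


setuptools.setup(
    name='my-pipeline',  # placeholder
    version='0.0.1',
    packages=setuptools.find_packages(),
    cmdclass={'build': build, 'WriteAwsConfig': WriteAwsConfig},
)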
Ross
On Thu, 2020-10-01 at 15:47 +, Ross Vandegrift wrote:
> worker at startup.
>
> On Wed, Sep 30, 2020, 10:16 AM Ross Vandegrift <
> ross.vandegr...@cleardata.com> wrote:
> > I see - it'd be great if the s3 io code would accept a boto session, so
> > the default process could be overridden.
> >
>
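
(For context on what accepting a boto session would allow: in plain boto3, an
explicit session overrides the default credential lookup chain. A minimal
illustration, with placeholder values:)

import boto3

# Placeholder credentials; an explicit session bypasses the default
# credential lookup chain (env vars, ~/.aws/*, instance metadata, ...).
session = boto3.session.Session(
    aws_access_key_id='AKIA...',
    aws_secret_access_key='...',
    region_name='us-east-1',
)
s3 = session.client('s3')
s3.list_buckets()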
> I wonder if you can use the setup.py file to add the default configuration
> yourself until we have appropriate support for pipeline option-based
> authentication. Could you try adding this default config in setup.py?
> Best
> -P.
>
> On Tue, Sep 29, 2020 at 11:16 AM Ross Vandegrift wrote:
Hello all,
I have a python pipeline that writes data to an s3 bucket. On my laptop it
picks up the SDK credentials from my boto3 config and works great.
Is it possible to provide credentials explicitly? I'd like to use remote
Dataflow runners, which won't have implicit AWS credentials available.
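
(For reference, a minimal sketch of the kind of pipeline in question, assuming
apache_beam is installed with the aws extra so that s3:// paths go through
Beam's S3 filesystem; the bucket name and output prefix are placeholders, and
credentials come from whatever boto3/botocore finds on the machine running the
workers:)

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# 'your-bucket' and the output prefix are placeholders. With apache_beam[aws]
# installed, s3:// paths are handled by Beam's S3 filesystem, which uses the
# ambient boto3/botocore credential chain on whichever machine runs the work.
def run():
    options = PipelineOptions()  # e.g. --runner=DataflowRunner plus GCP options
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['hello', 'world'])
         | beam.io.WriteToText('s3://your-bucket/output/part'))

if __name__ == '__main__':
    run()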