Hi Weston,

Sorry for the late reply. For using S3 in pyarrow, there are indeed two
options: using the implementation provided by Arrow
(`pyarrow.fs.S3FileSystem`) or using s3fs, which gets wrapped by
pyarrow.
Note that the wrapper is not actually DaskFileSystem: for the legacy
filesystems we use s3fs directly; for the new filesystems it gets
wrapped using `FSSpecHandler`.

Both options are supported going forward. It might be that at some point
the built-in one will be more tightly integrated, since s3fs is used
through a generic wrapper.

Best,
Joris

On Wed, 19 Aug 2020 at 23:18, Weston Pace <weston.p...@gmail.com> wrote:
>
> To use S3 it appears I can either use `pyarrow.fs.S3FileSystem` or I
> can use s3fs and it gets wrapped (I think) with
> `pyarrow.fs.DaskFileSystem`.  However, I don't see any documentation
> for `pyarrow.fs.DaskFileSystem`.  Is this option supported going
> forwards?  I'm currently configuring an s3fs instance for S3 access
> elsewhere and so I'd rather reuse this if possible.
>
> -Weston Pace
