Flink uses connectors to send data to external storage systems. Note that
the connector implementations are shared between the Java API and the
Python API, so if you can find a Java connector, you can usually also use
it from PyFlink.
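
As a minimal sketch of what that looks like (the jar path below is a
placeholder, and a reasonably recent PyFlink release is assumed), you just
point PyFlink at the Java connector jar:

    from pyflink.datastream import StreamExecutionEnvironment
    from pyflink.table import EnvironmentSettings, TableEnvironment

    # DataStream API: register the Java connector jar(s) with the environment.
    # The jar path is a placeholder, not a real artifact name.
    env = StreamExecutionEnvironment.get_execution_environment()
    env.add_jars("file:///path/to/the-java-connector.jar")

    # Table API: the same can be done through the 'pipeline.jars' option.
    t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
    t_env.get_config().set(
        "pipeline.jars", "file:///path/to/the-java-connector.jar")
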
For Firehose, Flink provides a Firehose sink connector.
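
As a rough sketch of how it could be wired up from the Table API (the
delivery stream name, region, and jar file name are placeholders, and the
exact connector options should be checked against the Firehose connector
docs for your Flink version):

    from pyflink.table import EnvironmentSettings, TableEnvironment

    t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

    # Make the Firehose connector jar available; the file name is a placeholder.
    t_env.get_config().set(
        "pipeline.jars",
        "file:///path/to/flink-sql-connector-aws-kinesis-firehose.jar")

    # Sink table backed by a Firehose delivery stream (stream name and
    # region are placeholders).
    t_env.execute_sql("""
        CREATE TABLE firehose_sink (
            user_id STRING,
            amount  DOUBLE
        ) WITH (
            'connector' = 'firehose',
            'delivery-stream' = 'my-delivery-stream',
            'aws.region' = 'us-east-1',
            'format' = 'json'
        )
    """)

    # A small in-memory table standing in for your transformed stream.
    transformed = t_env.from_elements(
        [("user-1", 10.5), ("user-2", 3.2)], ["user_id", "amount"])

    # Write the rows to Firehose; wait() blocks until the insert job finishes.
    transformed.execute_insert("firehose_sink").wait()

The same CREATE TABLE ... WITH ('connector' = ...) pattern applies to any
other sink for which a Java/SQL connector exists.
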
Hi,
I am working on writing a Flink processor which has to send transformed
data to Redshift/S3.
I cannot find any documentation for PyFlink on how to send data to
Firehose, S3, or Redshift. Would appreciate some help here.