Thanks for the reply Shammon. I looked at DataStreamCsvITCase - it gives a
very good example, and I can implement something similar. However, the
CsvBulkWriter it uses to create the factory has default (package-private)
access, so it can be reached from that test case but not from my application.
Should I just replicate it as a public class in my code? If this class is
intended to be used as a CSV bulk writer, should it be made public in flink-csv?
Thanks
On Wednesday, 8 March 2023 at 07:11:19 am IST, Shammon FY
<zjur...@gmail.com> wrote:
 Hi
You can create a `BulkWriter.Factory` which will create `CsvBulkWriter` and 
create `FileSink` by `FileSink.forBulkFormat`. You can see the detail in 
`DataStreamCsvITCase.testCustomBulkWriter`
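For anyone following along, here is a minimal, self-contained sketch of that pattern: a factory that creates one writer per output stream. The `BulkWriter` and `Factory` interfaces below are simplified stand-ins for Flink's `org.apache.flink.api.common.serialization.BulkWriter` (the real one receives an `FSDataOutputStream` and the factory would be passed to `FileSink.forBulkFormat`), so the snippet compiles without Flink on the classpath:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class CsvBulkWriterSketch {

    // Simplified stand-in for Flink's BulkWriter<T>: same method shape,
    // but writing to a plain OutputStream instead of FSDataOutputStream.
    interface BulkWriter<T> {
        void addElement(T element) throws IOException;
        void flush() throws IOException;
        void finish() throws IOException;
    }

    // Stand-in for BulkWriter.Factory<T>; the sink calls this once per part file.
    interface Factory<T> {
        BulkWriter<T> create(OutputStream out) throws IOException;
    }

    // A trivial CSV writer: one element per line, no quoting or escaping.
    static class CsvWriter implements BulkWriter<String[]> {
        private final OutputStream out;

        CsvWriter(OutputStream out) {
            this.out = out;
        }

        @Override
        public void addElement(String[] row) throws IOException {
            out.write((String.join(",", row) + "\n").getBytes(StandardCharsets.UTF_8));
        }

        @Override
        public void flush() throws IOException {
            out.flush();
        }

        @Override
        public void finish() throws IOException {
            flush(); // nothing buffered beyond the stream itself
        }
    }

    public static void main(String[] args) throws IOException {
        Factory<String[]> factory = CsvWriter::new;
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BulkWriter<String[]> writer = factory.create(out);
        writer.addElement(new String[] {"1", "alice"});
        writer.addElement(new String[] {"2", "bob"});
        writer.finish();
        System.out.print(out.toString(StandardCharsets.UTF_8));
    }
}
```

With Flink on the classpath, the equivalent would implement the real `BulkWriter<T>` and `BulkWriter.Factory<T>` and wire them up via `FileSink.forBulkFormat(outputPath, factory)`.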
Best,
Shammon

On Tue, Mar 7, 2023 at 7:41 PM Chirag Dewan via user <user@flink.apache.org> 
wrote:

Hi,
I am working on a Java DataStream application and need to implement a File sink 
with CSV format.
I see that I have two options here - Row and Bulk 
(https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/connectors/datastream/filesystem/#format-types-1)
So for CSV file distribution which one should I use? Row or Bulk?
I find the documentation for the File connectors confusing, because I can see
a PyFlink example that uses a BulkWriter for CSV, but the same class is not
public in flink-csv. So does Flink not support a CSV bulk writer for Java?
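One possible workaround while the CSV bulk writer stays package-private: for plain-text CSV, the row format may be sufficient. Flink's `org.apache.flink.api.common.serialization.Encoder<T>` has a single `encode(element, stream)` method, and an encoder that writes one CSV line per element can be passed to `FileSink.forRowFormat`. The sketch below is self-contained: the `Encoder` interface is a stand-in mirroring Flink's, and `Event` is a hypothetical record type used only for illustration:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class CsvRowEncoderSketch {

    // Stand-in mirroring Flink's Encoder<T> (same single-method shape).
    interface Encoder<T> {
        void encode(T element, OutputStream stream) throws IOException;
    }

    // Hypothetical event type, for illustration only.
    static class Event {
        final long id;
        final String name;

        Event(long id, String name) {
            this.id = id;
            this.name = name;
        }
    }

    // One CSV line per element; quoting/escaping omitted for brevity.
    static final Encoder<Event> CSV_ENCODER = (event, stream) ->
            stream.write((event.id + "," + event.name + "\n")
                    .getBytes(StandardCharsets.UTF_8));

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        CSV_ENCODER.encode(new Event(1, "alice"), out);
        CSV_ENCODER.encode(new Event(2, "bob"), out);
        System.out.print(out.toString(StandardCharsets.UTF_8));
    }
}
```

With Flink available, the same lambda would implement the real `Encoder<Event>` and be passed to `FileSink.forRowFormat(outputPath, encoder)`.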
Also, the documentation explicitly mentions CSV support for the Row format in
the Table API File sink, but says nothing about CSV in the DataStream File sink.
This all is just really confusing. Any leads on this are much appreciated.
Thanks
  
