Re: Apache Flink Reading CSV Files, Transform and Writing Back to CSV using Parallelism

2017-08-25 Thread Lokesh Gowda
Hi Robert, my question was: if I need to read and write a CSV file whose size will be in the GBs, how can I distribute the data sink so that it writes files of exactly 1 GB each? Since I am new to Flink, I am not sure about this. Regards, Lokesh.r On Sat, Aug 26, 2017 at 2:56 AM Robert Metzger wrote: > Hi
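The 1 GB question boils down to size-based file rolling, which, to my knowledge, the DataSet API's plain CSV sink does not offer out of the box; each writer would need rolling logic of its own. A framework-agnostic sketch of that logic follows, with a tiny byte cap standing in for 1 GB (the class name, `part-N` naming, and demo records are made up for illustration):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class RollingCsvWriter {
    // Append records to part-0, part-1, ..., starting a new part file
    // whenever the current one would grow past maxBytes.
    public static List<Path> write(Path dir, List<String> records, long maxBytes) throws IOException {
        List<Path> parts = new ArrayList<>();
        BufferedWriter out = null;
        long written = 0;
        try {
            for (String rec : records) {
                int recBytes = (rec + "\n").getBytes().length;
                if (out == null || written + recBytes > maxBytes) {
                    if (out != null) out.close();
                    Path part = dir.resolve("part-" + parts.size());
                    parts.add(part);
                    out = Files.newBufferedWriter(part);
                    written = 0;
                }
                out.write(rec);
                out.write("\n");
                written += recBytes;
            }
        } finally {
            if (out != null) out.close();
        }
        return parts;
    }

    // Demo with a 12-byte cap instead of 1 GB: three 6-byte records
    // fit two per file, so two part files come out.
    public static int demoPartCount() {
        try {
            Path dir = Files.createTempDirectory("roll");
            return write(dir, List.of("a;b;c", "d;e;f", "g;h;i"), 12).size();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(demoPartCount()); // 2
    }
}
```

Note that the sizes are only approximate guarantees: a record is never split across files, so each part can undershoot the cap by up to one record.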

Re: Apache Flink Reading CSV Files, Transform and Writing Back to CSV using Parallelism

2017-08-25 Thread Robert Metzger
Hi Lokesh, I'm not sure I fully understood your question, but you cannot write the result into a single file from multiple writers. If you want to process the data fully distributed, you'll also have to write it distributed. On Wed, Aug 23, 2017 at 8:07 PM, Lokesh R wrote: > Hi Team, > > I am
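Robert's point can be illustrated outside Flink: with sink parallelism n, each of the n writer tasks owns its own part file in the output directory, so one merged file requires either a sink with parallelism 1 or a separate sequential merge step afterwards. A small sketch of such a merge step (the `1`, `2`, ... file names and demo records are assumptions for illustration):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MergeParts {
    // Concatenate the part files that parallel writers produced
    // into one file, in deterministic (sorted-by-name) order.
    public static void merge(Path partsDir, Path target) throws IOException {
        List<Path> parts;
        try (Stream<Path> files = Files.list(partsDir)) {
            parts = files.sorted().collect(Collectors.toList());
        }
        try (BufferedWriter out = Files.newBufferedWriter(target)) {
            for (Path p : parts) {
                for (String line : Files.readAllLines(p)) {
                    out.write(line);
                    out.write("\n");
                }
            }
        }
    }

    // Self-contained demo: two "writers" each produce one part file.
    public static List<String> demo() {
        try {
            Path dir = Files.createTempDirectory("parts");
            Files.write(dir.resolve("1"), List.of("a;1"));
            Files.write(dir.resolve("2"), List.of("b;2"));
            Path merged = Files.createTempFile("merged", ".csv");
            merge(dir, merged);
            return Files.readAllLines(merged);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(demo()); // [a;1, b;2]
    }
}
```

The merge itself is single-threaded, which is exactly the trade-off Robert describes: a single output file implies a non-distributed step somewhere.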

Apache Flink Reading CSV Files, Transform and Writing Back to CSV using Parallelism

2017-08-23 Thread Lokesh R
Hi Team, I am using Apache Flink with Java for the problem statement below:
1. read a CSV file with the field delimiter character ';'
2. transform the fields
3. write the data back to CSV
My doubts are as below:
1. if I need to read a CSV file of size above 50 GB, what would be the app
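In the DataSet API, the three steps map onto env.readCsvFile(path) with fieldDelimiter(";"), a map() transformation, and writeAsCsv(); the per-record logic inside that map() is plain Java. A minimal framework-agnostic sketch of such a record transform (the trim/upper-case transform and the sample fields are made-up placeholders, not part of the thread):

```java
import java.util.Arrays;
import java.util.stream.Collectors;

public class CsvTransform {
    // Split one ';'-delimited record, transform each field, and re-join.
    // The transform here (trim + upper-case) is only a placeholder.
    public static String transformRecord(String line) {
        return Arrays.stream(line.split(";", -1)) // -1 keeps trailing empty fields
                     .map(f -> f.trim().toUpperCase())
                     .collect(Collectors.joining(";"));
    }

    public static void main(String[] args) {
        System.out.println(transformRecord("alice; bob ;carol")); // ALICE;BOB;CAROL
    }
}
```

Because the transform is applied record by record, Flink can stream a 50 GB file through it split by split without ever holding the whole file in memory.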