Re: Flink Dataset to ParquetOutputFormat

2020-01-17 Thread Arvid Heise
Hi Anuj, as far as I know, there is nothing like that on the DataSet side. Could you implement your query on DataStream with bounded inputs? In the long term, the DataSet API should be completely replaced by the DataStream API. Best, Arvid On Thu, Jan 16, 2020 at 12:35 PM aj wrote: > Hi Arvid, > >

Re: Flink Dataset to ParquetOutputFormat

2020-01-16 Thread aj
Hi Arvid, Thanks for the detailed reply. I am using the DataSet API and it is a batch job, so I am wondering whether the option you provided works for that. Thanks, Anuj On Wed, Jan 8, 2020 at 7:01 PM Arvid Heise wrote: > Hi Anji, > > StreamingFileSink has a BucketAssigner that you can use for that purpose.

Re: Flink Dataset to ParquetOutputFormat

2020-01-08 Thread Arvid Heise
Hi Anji, StreamingFileSink has a BucketAssigner that you can use for that purpose. From the javadoc: The sink uses a BucketAssigner to determine in which bucket directory each element should be written to inside the base directory. The BucketAssigner can, for example, use time or a property of t
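A minimal sketch of what Arvid describes, under stated assumptions: it wires a StreamingFileSink with a bulk Parquet writer (from the flink-parquet module) and a DateTimeBucketAssigner so records land in per-date bucket directories. The base path and method name are hypothetical; the stream is assumed to carry Avro GenericRecord elements.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.parquet.avro.ParquetAvroWriters;
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;
import org.apache.flink.streaming.api.functions.sink.filesystem.bucketassigners.DateTimeBucketAssigner;

public class ParquetSinkSketch {

    // Sketch only: assumes flink-parquet (for ParquetAvroWriters) is on the
    // classpath and the DataStream holds Avro GenericRecords.
    public static StreamingFileSink<GenericRecord> buildSink(Schema schema) {
        return StreamingFileSink
                .forBulkFormat(
                        new Path("s3://my-bucket/events"),          // hypothetical base path
                        ParquetAvroWriters.forGenericRecord(schema))
                // Buckets each record into a date directory, e.g. .../2020-01-16/
                .withBucketAssigner(new DateTimeBucketAssigner<>("yyyy-MM-dd"))
                .build();
    }
}
```

The sink would then be attached with `events.addSink(ParquetSinkSketch.buildSink(schema))`. Note that with bulk formats the sink rolls files on every checkpoint, so checkpointing must be enabled.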

Re: Flink Dataset to ParquetOutputFormat

2019-12-26 Thread vino yang
Hi Anji, Actually, I am not familiar with how to partition via timestamp. Flink's streaming BucketingSink provides this feature.[1] You may refer to this link and customize your sink. I can ping a professional committer who knows more details of the FS connector than me; @kklou...@gmail.com may give

Re: Flink Dataset to ParquetOutputFormat

2019-12-26 Thread aj
Thanks Vino. I am able to write data in Parquet now. But now the issue is how to write a dataset to multiple output paths, partitioned by timestamp. I want to partition the data date-wise. I am currently writing like this, which writes to a single output path. DataSet> df = allEvents.flatMap(new
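The date-wise partitioning Anuj asks about boils down to mapping each event's timestamp to a per-date output path and grouping records by that path; each group can then be written by its own output format or per-partition filter. A small, self-contained sketch of that key derivation (class name, base path, and Hive-style `dt=` layout are illustrative assumptions, not from the thread):

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class DatePartitioner {
    private static final DateTimeFormatter DAY =
            DateTimeFormatter.ofPattern("yyyy-MM-dd").withZone(ZoneOffset.UTC);

    // Maps an event timestamp (epoch millis, UTC) to a date-wise output path
    // under the given base directory, e.g. s3://bucket/events/dt=2019-12-26
    static String outputPathFor(String basePath, long epochMillis) {
        return basePath + "/dt=" + DAY.format(Instant.ofEpochMilli(epochMillis));
    }

    // Groups event timestamps by their target path; in a batch job, each group
    // would be written with its own sink or output format instance.
    static Map<String, List<Long>> groupByDay(String basePath, List<Long> timestamps) {
        Map<String, List<Long>> buckets = new TreeMap<>();
        for (long ts : timestamps) {
            buckets.computeIfAbsent(outputPathFor(basePath, ts),
                    k -> new ArrayList<>()).add(ts);
        }
        return buckets;
    }
}
```

On the DataSet side this logic can back a custom OutputFormat, or one can simply filter the dataset once per date and call `output(...)` per partition; on the DataStream side the same function is what a custom BucketAssigner would return from `getBucketId`.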

Re: Flink Dataset to ParquetOutputFormat

2019-12-22 Thread vino yang
Hi Anuj, After searching on GitHub, I found a demo repository about how to use Parquet in Flink.[1] You can have a look; I cannot say whether it is helpful or not. [1]: https://github.com/FelixNeutatz/parquet-flinktacular Best, Vino aj wrote on Sat, Dec 21, 2019 at 7:03 PM: > Hello All, > > I am ge

Flink Dataset to ParquetOutputFormat

2019-12-21 Thread aj
Hello All, I am getting a set of events in JSON that I am dumping into hourly buckets in S3. I am reading these hourly buckets and have created a DataSet. I want to write this dataset as Parquet, but I am not able to figure out how. Can somebody help me with this? Thanks, Anuj
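One common way to do what this original question asks, a sketch under stated assumptions rather than a confirmed solution from the thread: wrap Parquet's Hadoop output format with Flink's Hadoop compatibility layer. It assumes the DataSet holds `Tuple2<Void, GenericRecord>` (the key side is unused by Parquet) and that flink-hadoop-compatibility and parquet-avro are on the classpath; the class and method names are hypothetical.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.hadoop.mapreduce.HadoopOutputFormat;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.parquet.avro.AvroParquetOutputFormat;

public class DataSetParquetSketch {

    // Writes a DataSet of Avro records as Parquet files by delegating to
    // Hadoop's AvroParquetOutputFormat via Flink's HadoopOutputFormat wrapper.
    public static void writeParquet(DataSet<Tuple2<Void, GenericRecord>> records,
                                    Schema schema, String outputPath) throws Exception {
        Job job = Job.getInstance();
        AvroParquetOutputFormat.setSchema(job, schema);            // Avro schema for the files
        FileOutputFormat.setOutputPath(job, new Path(outputPath)); // single output directory
        HadoopOutputFormat<Void, GenericRecord> format =
                new HadoopOutputFormat<>(new AvroParquetOutputFormat<GenericRecord>(), job);
        records.output(format);
    }
}
```

The job still needs a terminal `env.execute(...)` call; records would first be mapped into `Tuple2.of(null, record)` form before being handed to `writeParquet`.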