Hi,
I wanted to know whether we can write streaming data to S3 in Parquet format
with partitioning.
Here's what I want to achieve:
I have a Kafka table that gets updated with data from a Kafka topic, and
I'm using a SELECT statement to get the data into a Table and convert it
into a stream:

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);
Table table = tableEnv.sqlQuery("SELECT * FROM test");
DataStream<Row> stream = tableEnv.toDataStream(table);

Now I want to write this stream to S3 as Parquet files with hourly
partitions.
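
From the docs it looks like the filesystem SQL connector might cover this
without leaving the Table API at all, but I'm not sure I have it right. Here
is a rough sketch of what I was imagining; the sink table name, the columns
id/payload, the event-time column ts, and the bucket path are just
placeholders for my actual schema (and if I understand correctly,
checkpointing has to be enabled so the Parquet files actually get committed):

// Hypothetical sink table partitioned by day and hour; the column list
// would have to mirror the real schema of "test".
tableEnv.executeSql(
    "CREATE TABLE s3_sink (" +
    "  id BIGINT," +
    "  payload STRING," +
    "  dt STRING," +
    "  hr STRING" +
    ") PARTITIONED BY (dt, hr) WITH (" +
    "  'connector' = 'filesystem'," +
    "  'path' = 's3://my-bucket/output'," +      // placeholder bucket and prefix
    "  'format' = 'parquet'," +
    "  'sink.partition-commit.policy.kind' = 'success-file'" +
    ")");

// Derive the hourly partition columns from the event timestamp.
tableEnv.executeSql(
    "INSERT INTO s3_sink " +
    "SELECT id, payload, DATE_FORMAT(ts, 'yyyy-MM-dd'), DATE_FORMAT(ts, 'HH') " +
    "FROM test");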

Here are my questions:
1. Is this possible?
2. If yes, how can it be achieved? A pointer to the appropriate
documentation would also be great.
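
Alternatively, if the DataStream route is the better fit, would a FileSink
with the Parquet bulk writer be the right direction? Below is a rough sketch
of what I had in mind; TestRecord is a hypothetical POJO mirroring my table
columns, and the path is again a placeholder:

import org.apache.flink.connector.file.sink.FileSink;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.parquet.avro.ParquetAvroWriters;
import org.apache.flink.streaming.api.functions.sink.filesystem.bucketassigners.DateTimeBucketAssigner;

// Map the table to a POJO instead of Row so the reflection-based
// Parquet writer can derive an Avro schema from it.
DataStream<TestRecord> records = tableEnv.toDataStream(table, TestRecord.class);

FileSink<TestRecord> sink = FileSink
        .forBulkFormat(
                new Path("s3://my-bucket/output"),       // placeholder path
                ParquetAvroWriters.forReflectRecord(TestRecord.class))
        // "yyyy-MM-dd--HH" puts each hour of data into its own bucket
        .withBucketAssigner(new DateTimeBucketAssigner<>("yyyy-MM-dd--HH"))
        .build();

records.sinkTo(sink);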

Thanks and Regards,
Harshvardhan
