Hi, *Is there a way to insert data into an existing Parquet file using Spark?*
I am using Spark Streaming and Spark SQL to store real-time data in Parquet files and then query it with Impala. Spark creates multiple subdirectories of Parquet files (one per batch), which makes loading them into Impala a challenge. I want to insert data into an existing Parquet file instead of creating a new one each time. I have tried the INSERT statement, but it makes performance too slow. Please suggest whether there is any way to insert or append data to an existing Parquet file.

Regards,
Rafeeq S
*("What you do is what matters, not what you think or say or plan.")*
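
P.S. For concreteness, this is roughly what I am trying to achieve: append each micro-batch to one existing Parquet location instead of writing a new directory per batch. This is only a minimal sketch, assuming the DataFrame writer API (Spark 1.4+); the socket source, the Event schema, and the HDFS path are made up for illustration:

// Minimal sketch: append every micro-batch to a single Parquet directory,
// so Impala only has to know about one table location.
// Source, schema, and path below are hypothetical.
import org.apache.spark.SparkConf
import org.apache.spark.sql.{SQLContext, SaveMode}
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ParquetAppendSketch {
  case class Event(id: Long, value: String)  // hypothetical schema

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("parquet-append-sketch")
    val ssc  = new StreamingContext(conf, Seconds(30))
    val sqlContext = new SQLContext(ssc.sparkContext)
    import sqlContext.implicits._

    // Hypothetical text source; in my case the data comes from a stream.
    val lines = ssc.socketTextStream("localhost", 9999)

    lines.foreachRDD { rdd =>
      if (!rdd.isEmpty()) {
        val df = rdd.map { line =>
          val Array(id, value) = line.split(",", 2)
          Event(id.toLong, value)
        }.toDF()

        // SaveMode.Append adds new part files under the same directory
        // rather than creating a fresh directory for every batch.
        df.write.mode(SaveMode.Append).parquet("hdfs:///warehouse/events_parquet")
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

Even with this, each batch still produces new part files rather than truly appending to an existing file, which is essentially my question: is there a better way than this, given that INSERT is too slow?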
