Does SparkSQL support partitioned parquet tables? How do I save to a 
partitioned parquet file from within Python?

    table.saveAsParquetFile("table.parquet")

This call doesn’t seem to accept a partition argument. Or does my SchemaRDD
have to be set up in a specific way?
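
For context, here is a minimal sketch of the kind of code I'm running (the
setup, column names, values, and output path are just illustrative
placeholders, not my real data):

    from pyspark import SparkContext
    from pyspark.sql import SQLContext, Row

    # Placeholder setup -- column names and values are only examples.
    sc = SparkContext(appName="parquet-partition-question")
    sqlContext = SQLContext(sc)

    rows = sc.parallelize([
        Row(event_date="2014-11-01", user_id=1, value=10.0),
        Row(event_date="2014-11-02", user_id=2, value=20.0),
    ])
    table = sqlContext.inferSchema(rows)  # SchemaRDD

    # Writes a single flat Parquet directory; I don't see a way to tell it
    # to partition the output by a column such as event_date.
    table.saveAsParquetFile("table.parquet")

What I'm hoping for is Hive-style partition directories in the output,
e.g. table.parquet/event_date=2014-11-01/..., so that queries can prune
on the partition column.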
