Hi guys

Is it possible to add a new partition to a persistent table using Spark SQL?
The following call works and data gets written to the correct directories,
but no partition metadata is added to the Hive metastore. In addition,
nothing seems to prevent an arbitrary schema from being appended to the
existing table.

    eventsDataFrame.write
      .mode(SaveMode.Append)
      .partitionBy("windows_event_time_bin")
      .saveAsTable("windows_event")

    sqlContext.sql("show partitions windows_event")

Does Spark SQL not need partition metadata when reading the data back?
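
For reference, here is a minimal sketch of the manual workaround I had in
mind, in case saveAsTable really does skip the metastore update. I am
assuming a HiveContext that passes ALTER TABLE ... ADD PARTITION DDL through
to Hive; the partition value and location below are just placeholders.

    import org.apache.spark.sql.SaveMode

    // Write as before; files land under .../windows_event/windows_event_time_bin=<value>/
    eventsDataFrame.write
      .mode(SaveMode.Append)
      .partitionBy("windows_event_time_bin")
      .saveAsTable("windows_event")

    // Hypothetical manual step: register the new partition with the metastore.
    // The partition value and location are placeholders.
    sqlContext.sql(
      "ALTER TABLE windows_event ADD IF NOT EXISTS " +
      "PARTITION (windows_event_time_bin='2015-09-21') " +
      "LOCATION '/user/hive/warehouse/windows_event/windows_event_time_bin=2015-09-21'")

    // The partition should now be visible
    sqlContext.sql("show partitions windows_event").show()

Having to do this by hand for every partition is what I was hoping to avoid.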

Regards
Deenar
