Hi

How does one check for the presence of a partition in a Spark SQL partitioned table (saved using dataframe.write.partitionBy("partCol"), i.e. not a Hive-compatible table), other than physically checking the directory on HDFS or doing a count(*) with the partition columns in the where clause?
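For context, `partitionBy` lays the table out as Hive-style `col=value` subdirectories under the table path, so the "physically checking the directory" option mentioned above amounts to a path-existence test. A minimal local-filesystem sketch (the table path, values, and `partition_exists` helper are illustrative stand-ins, not a Spark API):

```python
import os
import tempfile

# partitionBy("partCol") writes one subdirectory per distinct value,
# named partCol=<value>. Simulate that layout in a temp dir as a
# stand-in for the real HDFS table path.
base = tempfile.mkdtemp()
for value in ["2015-01-01", "2015-01-02"]:
    os.makedirs(os.path.join(base, f"partCol={value}"))

def partition_exists(table_path, col, value):
    # The directory check the question hopes to avoid: does col=value exist?
    return os.path.isdir(os.path.join(table_path, f"{col}={value}"))

print(partition_exists(base, "partCol", "2015-01-01"))  # True
print(partition_exists(base, "partCol", "2015-12-31"))  # False
```

On HDFS the same test would go through the Hadoop FileSystem API rather than `os.path`, but the layout being probed is identical.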
Regards,
Deenar