Hi All,

I have a simple partitioned write like the one below:

df = spark.read.parquet('read-location')
df.write.partitionBy('col1').mode('overwrite').parquet('write-location')

This fails after about an hour with a "file already exists (in .staging directory)"
error. Not sure what I am doing wrong here.
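In case it matters, the session setup is nothing special; roughly just the following (the app name is only a placeholder):

from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName('partition-write') \
    .getOrCreate()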

-- 
Regards,

Rishi Shah
