Hi everyone,

I am trying to save a data frame with special characters in the column
names as a table in Hive. However, I am getting the error below. Is the
only solution to rename all the columns, or is there some argument that
can be passed to saveAsTable() or write.parquet() to ignore special
characters?
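
For context, this is roughly what I am doing, simplified down (I am
assuming the Spark 2.x SparkSession API here; the data, table name, and
paths are placeholders, not my real job):

from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Toy stand-in for my real data frame; one column name carries the
# problematic characters ("apple-mail_duration" in the real data).
df = spark.createDataFrame(
    [(1, 12.5), (2, 3.0)],
    ["id", "apple-mail_duration"],
)

# These are the calls that fail for me with the error quoted below.
df.write.saveAsTable("mail_events")      # placeholder table name
# df.write.parquet("/tmp/mail_events")   # placeholder path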

Py4JJavaError: An error occurred while calling o2956.saveAsTable.
: org.apache.spark.sql.AnalysisException: Attribute name
"apple-mail_duration" contains invalid character(s) among " ,;{}()\n\t=".
Please use alias to rename it.
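
The error says to use an alias, so in case it helps, this is roughly the
bulk rename I would end up doing for every column, which I was hoping to
avoid (the sanitize helper below is just my own sketch, not a Spark API):

import re

# df is the data frame from the sketch above.
# Replace the characters listed in the error (plus "-") with underscores;
# "sanitize" is my own helper name, not something from Spark.
def sanitize(name):
    return re.sub(r"[ ,;{}()\n\t=-]", "_", name)

clean_df = df.toDF(*[sanitize(c) for c in df.columns])
clean_df.write.saveAsTable("mail_events_clean")   # placeholder table name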


If not, how can I simply write the data frame as a CSV file?
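
Something along these lines is what I had in mind, assuming Spark 2.x
where DataFrameWriter.csv is built in (on 1.x I believe the separate
spark-csv package would be needed); the output path is a placeholder:

# df is the data frame from the sketch above; write it out as CSV
# with a header row.
df.write.csv("/tmp/mail_events_csv", header=True, mode="overwrite")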

Thanks,
Tobi
