Thanks Michael!

But what about when I am not trying to save as Parquet? Is there no way
around the error when using saveAsTable()? I am using Spark 1.4.
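
If renaming really is the only option, I guess it would look roughly like
this in PySpark (just a sketch; "df" and the table name below are
placeholders for my actual data frame and table):

    import re

    # replace the characters parquet rejects (" ,;{}()\n\t=") with underscores
    safe_names = [re.sub(r"[ ,;{}()\n\t=]", "_", c) for c in df.columns]

    # re-alias every column to its cleaned name, then save as before
    df_clean = df.select([df[c].alias(new) for c, new in zip(df.columns, safe_names)])
    df_clean.write.saveAsTable("my_table")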

Tobi
On Jul 11, 2016 2:10 PM, "Michael Armbrust" <mich...@databricks.com> wrote:

> This is protecting you from a limitation in parquet.  The library will let
> you write out invalid files that can't be read back, so we added this check.
>
> You can call .format("csv") (in Spark 2.0) to switch it to CSV.
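> For example, something along these lines (the output path here is just a
> placeholder):
>
>     df.write.format("csv").option("header", "true").save("/path/to/output")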
>
> On Mon, Jul 11, 2016 at 11:16 AM, Tobi Bosede <ani.to...@gmail.com> wrote:
>
>> Hi everyone,
>>
>> I am trying to save a data frame with special characters in the column
>> names as a table in Hive. However, I am getting the following error. Is the
>> only solution to rename all the columns? Or is there some argument that can
>> be passed into the saveAsTable() or write.parquet() functions to ignore
>> special characters?
>>
>> Py4JJavaError: An error occurred while calling o2956.saveAsTable.
>> : org.apache.spark.sql.AnalysisException: Attribute name "apple-mail_duration"
>> contains invalid character(s) among " ,;{}()\n\t=". Please use alias to rename it.
>>
>>
>> If not, how can I simply write the data frame as a CSV file?
>>
>> Thanks,
>> Tobi
>>
