I'm having the same problem, did you solve this?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/CREATE-TABLE-ignores-database-when-using-PARQUET-option-tp22824p24679.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

> ...nsetSerDe,
> parameters:{serialization.format=1, path=/user/foo/data.parquet}),
> bucketCols:[], sortCols:[], parameters:{},
> skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[],
> skewedColValueLocationMaps:{})), partitionKeys:[],
> parameters:{EXTERNAL=TRUE, spark.sql.sources.provider=parquet},
> viewOriginalText:null, viewExpandedText:null, tableType:EXTERNAL_TABLE)
>
> 15/05/08 20:42:21 ERROR hive.log: Got exception:
> org.apache.hadoop.security.AccessControlException Permission denied
>
> The permission error above happens because my Linux user does not have
> write access on the default metastore path. I can work around this issue
> if I use CREATE TEMPORARY TABLE and have no metadata written on disk.
>
> I would like to know if I am doing anything wrong here and if there is
> any additional property I can use to force the database/metastore_dir I
> need to write on.
>
> Thanks,
> Carlos
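
For anyone who finds this thread later, here is a minimal sketch of the two
statements being discussed, written against the Spark 1.3-era HiveContext in
the spark-shell. The database, table, and path names are made up for
illustration; the exact statements from the original post are not in this
excerpt.

// spark-shell with Hive support; `sc` is the SparkContext the shell provides
import org.apache.spark.sql.hive.HiveContext

val sqlContext = new HiveContext(sc)

// Persistent data source table: Spark records it in the Hive metastore with
// spark.sql.sources.provider=parquet (matching the table parameters in the
// log above), and that metastore write is where the permission error shows up.
// Whether the current database is actually honored is the issue reported here.
sqlContext.sql("USE mydb")
sqlContext.sql(
  "CREATE TABLE foo USING parquet OPTIONS (path '/user/foo/data.parquet')")

// Workaround described above: a temporary table is registered only in the
// session, so nothing is written to the metastore and the permission problem
// does not arise.
sqlContext.sql(
  "CREATE TEMPORARY TABLE foo_tmp USING parquet OPTIONS (path '/user/foo/data.parquet')")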
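
On the "additional property" question: the directory Hive uses for table
locations under the metastore is normally taken from
hive.metastore.warehouse.dir (via hive-site.xml on the classpath, or set on
the HiveContext). Whether the data source CREATE TABLE path honors it for a
non-default database in this Spark version is exactly what this thread is
reporting, so treat the line below as a pointer rather than a confirmed fix;
the path is a made-up example and has to be writable by the user running
Spark (same sqlContext as in the sketch above).

// Point the warehouse at a directory the Spark user can write to
// (sketch only; /user/carlos/warehouse is a hypothetical path).
sqlContext.setConf("hive.metastore.warehouse.dir", "/user/carlos/warehouse")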