Hi,
I am trying to connect to the Spark Thrift Server to create an external table.
In my table DDL I have a table property 'spark.sql.sources.provider' =
'parquet', but I am getting the error "Cannot persistent
into hive metastore as table property keys may not start with 'spark.sql.':
[spark.sql.sources.provider]".
I tried the following code in both Spark 1.5.1 and Spark 1.6.0:
import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType}
import org.apache.spark.sql.Row

// Schema with a nullable string column "k" and a non-nullable int column "v"
val schema = StructType(
  StructField("k", StringType, true) ::
  StructField("v", IntegerType, false) :: Nil)
Hi,
I am using hiveContext.sql("create database if not exists ") to
create a hive db. Is this statement atomic?
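If the concern is several jobs issuing the statement at the same time, here is a minimal sketch of that scenario; the database name and the two-thread setup are assumptions for illustration, and hiveContext is the same HiveContext as above:

// Two callers racing to create the same database; IF NOT EXISTS is meant
// to make the second call a no-op rather than an error.
val createDb = "CREATE DATABASE IF NOT EXISTS my_db"

val threads = (1 to 2).map { _ =>
  new Thread(new Runnable {
    override def run(): Unit = hiveContext.sql(createDb)
  })
}
threads.foreach(_.start())
threads.foreach(_.join())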
Thanks.
Antonio.
Hi,
I am trying to connect to a Hive metastore deployed in an Oracle DB. I have
the Hive configuration specified in hive-site.xml, which I put under
$SPARK_HOME/conf. If I run spark-shell, everything works fine: I can create
Hive databases and tables and query the tables.
However, when
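For reference, a minimal sketch of the kind of spark-shell check described above; the database and table names are assumptions, and the metastore location comes from the javax.jdo.option.ConnectionURL and related JDBC settings in hive-site.xml:

import org.apache.spark.sql.hive.HiveContext

// hive-site.xml under $SPARK_HOME/conf is picked up automatically, so this
// HiveContext talks to the Oracle-backed metastore configured there.
val hiveContext = new HiveContext(sc)

hiveContext.sql("CREATE DATABASE IF NOT EXISTS test_db")
hiveContext.sql("CREATE TABLE IF NOT EXISTS test_db.kv (k STRING, v INT)")
hiveContext.sql("SHOW TABLES IN test_db").show()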
Hi,
I am using RDD.saveAsObjectFile() to save an RDD dataset to Tachyon. In
version 0.8, Tachyon will support a TTL for saved files. Is that supported
from Spark as well? Is there a way I could specify a TTL for a saved object
file?
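For reference, a minimal sketch of the write path in question; the Tachyon master host, port, and output path are assumptions. saveAsObjectFile() takes only a destination path, so there is no TTL argument on the Spark side and any TTL would have to be applied through Tachyon itself:

val rdd = sc.parallelize(1 to 1000)

// saveAsObjectFile writes the partitions as a SequenceFile of serialized
// objects; the only parameter is the destination URI.
rdd.saveAsObjectFile("tachyon://tachyon-master:19998/data/objects")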
Thanks.
Antonio.