The error you're seeing typically means that you cannot connect to the Hive
metastore itself. Some quick thoughts:
- If you were to run "show tables" (instead of the CREATE TABLE statement),
are you still getting the same error?
- Can you confirm that the Hive metastore (MySQL database) is up and running?
Hi Denny,
Still facing the same issue. Please find the errors below.
scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@4e4f880c
scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS
No, I am just running the ./spark-shell command in the terminal. I will try
with the above command.
On Wed, Mar 25, 2015 at 11:09 AM, Denny Lee wrote:
> Did you include the connection to a MySQL connector jar so that way
> spark-shell / hive can connect to the metastore?
>
> For example, when I run my spark-shell instance in standalone mode, I use: ...
Did you include the MySQL connector jar on the classpath so that spark-shell
/ Hive can connect to the metastore?
For example, when I run my spark-shell instance in standalone mode, I use:
./spark-shell --master spark://servername:7077 --driver-class-path
/lib/mysql-connector-java-5.1.27.jar
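Alongside the connector jar, a hive-site.xml for a MySQL-backed metastore typically carries the JDBC connection properties. A minimal sketch follows; the hostname, database name, and credentials are placeholders for illustration, not values from this thread:

```xml
<configuration>
  <!-- JDBC connection to the MySQL database backing the metastore -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/hive_metastore</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
  </property>
</configuration>
```

If any of these are wrong or the MySQL server is unreachable, HiveContext fails at metastore connection time, which is consistent with the error pattern described above.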
Hi Sparkers,
Can anyone please check the error below and suggest a solution? I am using
Hive 0.13 and Spark 1.2.1.
Step 1: I have installed Hive 0.13 with a local metastore (MySQL database).
Step 2: Hive is running without any errors and I am able to create tables and
load data into Hive tables.
I was actually just able to reproduce the issue. I do wonder if this is a
bug -- the docs say "When not configured by the hive-site.xml, the context
automatically creates metastore_db and warehouse in the current directory."
But as you can see from the message, the warehouse is not in the current
directory.
Hi yana,
I have removed hive-site.xml from the spark/conf directory but am still
getting the same errors. Is there any other way to work around this?
Regards,
Sandeep
On Fri, Feb 27, 2015 at 9:38 PM, Yana Kadiyska
wrote:
> I think you're mixing two things: the docs say "When *not* configured by
> the hive-site.xml,
I think you're mixing two things: the docs say "When *not* configured by
the hive-site.xml, the context automatically creates metastore_db and
warehouse in the current directory." AFAIK if you want a local metastore,
you don't put hive-site.xml anywhere. You only need the file if you're
going to p
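To illustrate the distinction Yana is drawing: hive-site.xml only matters when you want Spark to use an existing metastore rather than the auto-created local one. A minimal sketch pointing at an already-running metastore service might look like the fragment below; the hostname is a placeholder, and 9083 is the conventional metastore Thrift port:

```xml
<configuration>
  <!-- Direct Spark/Hive to a running metastore service instead of
       creating a local metastore_db in the current directory -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```

With no hive-site.xml on the classpath at all, HiveContext should fall back to the embedded Derby metastore_db in the working directory, as the docs quoted above describe.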
Hi Sparkers,
I am using Hive 0.13, have copied hive-site.xml into spark/conf, and am using
the default Derby local metastore.
While creating a table in the Spark shell I get the following error. Can
anyone please take a look and suggest a solution?
sqlContext.sql("CREATE TABLE IF NOT EXISTS sandee
Hi All,
I’m trying to write streaming processed data to HDFS (Hadoop 2). The buffer is
flushed and the file closed after each write. The following errors occurred
when reopening the same file for append. I know for sure the error is caused
by closing the file. Any ideas?
Here is the code that writes to HDFS:
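The original code is truncated in the archive, so the sketch below is only a local-filesystem analogy of the per-batch write/flush/close/reopen-for-append cycle being described (file names and batch contents are made up). On HDFS the equivalent calls are FileSystem.create()/FileSystem.append() with FSDataOutputStream.hflush(), and, unlike the local case, an append() issued immediately after close() can fail while the NameNode is still recovering the file's lease, which matches the symptom; keeping the stream open across batches, or retrying the append after a short delay, are the usual workarounds.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class AppendCycle {
    // Writes two "batches" to the same file, closing and reopening for
    // append in between -- the cycle the streaming job performs per batch.
    static String runCycle(Path file) throws IOException {
        // First batch: write, flush, then close (close() after each write,
        // as described in the message above).
        try (FileWriter out = new FileWriter(file.toFile())) {
            out.write("batch-1\n");
            out.flush();
        }

        // Next batch: reopen the same file in append mode. Locally this
        // always succeeds; on HDFS, FileSystem.append() right after close()
        // can throw while lease recovery is still in progress.
        try (FileWriter out = new FileWriter(file.toFile(), true)) {
            out.write("batch-2\n");
        }
        return String.join("\n", Files.readAllLines(file));
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("stream-batch", ".txt");
        System.out.println(runCycle(file));  // both batches, in order
        Files.delete(file);
    }
}
```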