HiveContext in spark

2016-04-12 Thread Selvam Raman
I could not use the INSERT, UPDATE and DELETE commands in HiveContext. I am using Spark 1.6.1 and Hive 1.1.0. Please find the error below. scala> hc.sql("delete from trans_detail where counter=1"); 16/04/12 14:58:45 INFO ParseDriver: Parsing command: delete from trans_detail wher…
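
[Editor's note] HiveContext in Spark 1.6.x did not support Hive's UPDATE/DELETE DML, which additionally require transactional (ACID) Hive tables on the Hive side. A common workaround was to rewrite the table without the unwanted rows. A minimal sketch, assuming `hc` is an existing HiveContext in spark-shell and using the table/column names from this thread:

```scala
// Sketch: emulate DELETE by rewriting the table without the matching rows.
// Assumes `hc` is an existing HiveContext (Spark 1.6.x, spark-shell).
val df = hc.table("trans_detail")

// Keep every row except those we want "deleted".
val remaining = df.filter(df("counter") !== 1)

// Write the filtered data to a new table; overwriting the table currently
// being read can fail, so writing to a staging table and swapping is safer.
remaining.write.mode("overwrite").saveAsTable("trans_detail_cleaned")
```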

Re: Creating HiveContext in Spark-Shell fails

2016-02-15 Thread Gavin Yue
This sqlContext is an instance of HiveContext; do not be confused by the name. > On Feb 16, 2016, at 12:51, Prabhu Joseph wrote: > > Hi All, > > On creating HiveContext in spark-shell, fails with > > Caused by: ERROR XSDB6: Another instance of Derby may ha…

Re: Creating HiveContext in Spark-Shell fails

2016-02-15 Thread Prabhu Joseph
ext] > > res0: Boolean = true > > > > On Mon, Feb 15, 2016 at 8:51 PM, Prabhu Joseph > wrote: > >> Hi All, >> >> On creating HiveContext in spark-shell, fails with >> >> Caused by: ERROR XSDB6: Another instance of Derby may have already

Re: Creating HiveContext in Spark-Shell fails

2016-02-15 Thread Mark Hamstra
Type :help for more information. scala> sqlContext.isInstanceOf[org.apache.spark.sql.hive.HiveContext] res0: Boolean = true On Mon, Feb 15, 2016 at 8:51 PM, Prabhu Joseph wrote: > Hi All, > > On creating HiveContext in spark-shell, fails with > > Caused by: ERROR XSDB6: An

Creating HiveContext in Spark-Shell fails

2016-02-15 Thread Prabhu Joseph
Hi All, Creating a HiveContext in spark-shell fails with: Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /SPARK/metastore_db. spark-shell has already created metastore_db for its SQLContext. Spark context available as sc. SQL context available as sqlContext…
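
[Editor's note] The XSDB6 error arises because spark-shell has already opened the embedded Derby metastore (metastore_db) for its built-in sqlContext, and constructing a second HiveContext in the same JVM tries to boot Derby again. As the replies in this thread point out, when Spark is built with Hive support the shell's sqlContext is already a HiveContext, so the workaround is to reuse it rather than create a new one. A sketch:

```scala
// In spark-shell (Spark built with -Phive), sqlContext is already a
// HiveContext, so there is no need to construct a second one (which
// would attempt to re-open the Derby-backed metastore and fail).
sqlContext.isInstanceOf[org.apache.spark.sql.hive.HiveContext]  // true

// Reuse it directly; cast if HiveContext-specific methods are needed:
val hc = sqlContext.asInstanceOf[org.apache.spark.sql.hive.HiveContext]
hc.sql("SHOW TABLES").show()
```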

Re: Sharing HiveContext in Spark JobServer / getOrCreate

2016-01-25 Thread Deenar Toraskar
On 25 January 2016 at 21:09, Deenar Toraskar <deenar.toras...@thinkreactive.co.uk> wrote: > No I hadn't. This is useful, but in some cases we do want to share the > same temporary tables between jobs, so really wanted a getOrCreate > equivalent on HiveContext. > > Deenar > > > > On 25 January 2016…

Re: Sharing HiveContext in Spark JobServer / getOrCreate

2016-01-25 Thread Ted Yu
Have you noticed the following method of HiveContext? * Returns a new HiveContext as new session, which will have separated SQLConf, UDF/UDAF, * temporary tables and SessionState, but sharing the same CacheManager, IsolatedClientLoader * and Hive client (both of execution and metadata) w…
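
[Editor's note] The method quoted above is HiveContext.newSession() (available in Spark 1.6), which gives each job an isolated SQLConf, UDF registry, and set of temporary tables while sharing the Hive client and cache manager. A sketch, assuming a shared `hiveContext` created once per JVM:

```scala
// Assumes `hiveContext` is a shared HiveContext created once per JVM.
// Each newSession() gets its own SQLConf, UDFs and temporary tables.
val session1 = hiveContext.newSession()
val session2 = hiveContext.newSession()

// Temporary tables are per-session: this table is visible only in session1.
session1.range(10).registerTempTable("t")

// The CacheManager, metastore client and underlying SparkContext, however,
// are shared across all sessions created from the same HiveContext.
```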

Sharing HiveContext in Spark JobServer / getOrCreate

2016-01-25 Thread Deenar Toraskar
Hi, I am using a shared SparkContext for all of my Spark jobs. Some of the jobs use HiveContext, but there isn't a getOrCreate method on HiveContext that would allow reuse of an existing HiveContext; such a method exists only on SQLContext (def getOrCreate(sparkContext: SparkContext): SQLContext).
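
[Editor's note] One workaround for the missing method is a small getOrCreate-style wrapper around HiveContext, mirroring what SQLContext.getOrCreate does. A hypothetical sketch (this singleton object is not part of Spark's API):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// Hypothetical helper: lazily create one HiveContext per JVM and reuse it,
// mirroring SQLContext.getOrCreate, which Spark does not provide for
// HiveContext.
object HiveContextSingleton {
  @transient private var instance: HiveContext = _

  def getOrCreate(sc: SparkContext): HiveContext = synchronized {
    if (instance == null) instance = new HiveContext(sc)
    instance
  }
}
```

Note that jobs sharing one HiveContext this way also share temporary tables (which is what is wanted in this thread); newSession() is the option when per-job isolation is needed instead.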

Re: Unable to use HiveContext in spark-shell

2014-11-06 Thread tridib
I built spark-1.1.0 on a fresh machine and this issue is gone! Thank you all for your help. Thanks & Regards, Tridib -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-use-HiveContext-in-spark-shell-tp18261p18324.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Re: Unable to use HiveContext in spark-shell

2014-11-06 Thread tridib
Yes, I have the org.apache.hadoop.hive package in the Spark assembly. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-use-HiveContext-in-spark-shell-tp18261p18322.html

Re: Unable to use HiveContext in spark-shell

2014-11-06 Thread Terry Siu
Subject: RE: Unable to use HiveContext in spark-shell. I am using Spark 1.1.0. I built it using: ./make-distribution.sh -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests My ultimate goal is to execute a query on a parquet file with a nested structure and cast a date string to Date. This i…

RE: Unable to use HiveContext in spark-shell

2014-11-06 Thread Tridib Samanta
…available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling HiveContext.class. That entry seems to have slain the compiler. Shall I replay your session? I can re-run each line except the last…

Re: Unable to use HiveContext in spark-shell

2014-11-06 Thread Terry Siu
What version of Spark are you using? Did you compile your Spark version and if so, what compile options did you use? On 11/6/14, 9:22 AM, "tridib" wrote: >Help please! > > > >-- >View this message in context: >http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-use-HiveContext-in-spa…

Re: Unable to use HiveContext in spark-shell

2014-11-06 Thread Jimmy McErlain
On Thu, Nov 6, 2014 at 9:22 AM, tridib wrote: > Help please! > > > > -- > View this message in context: > http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-use-HiveContext-in-spark-shell-tp18261p18280.html > Se…

Re: Unable to use HiveContext in spark-shell

2014-11-06 Thread tridib
Help please! -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-use-HiveContext-in-spark-shell-tp18261p18280.html

Unable to use HiveContext in spark-shell

2014-11-05 Thread tridib
…slain the compiler. Shall I replay your session? I can re-run each line except the last one. [y/n] Thanks, Tridib -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-use-HiveContext-in-spark-shell-tp18261.html

Re: Got error "java.lang.IllegalAccessError" when using HiveContext in Spark shell on AWS

2014-08-07 Thread Zhun Shen
Hi Cheng, I replaced Guava 15.0 with Guava 14.0.1 on my Spark classpath and the problem was solved, so your method is correct. This proved that the issue was caused by the AWS EMR (ami-version 3.1.0) libraries, which include Guava 15.0. Many thanks, and see you at the first Spark User Beijing Meetup tomorrow.

Re: Got error "java.lang.IllegalAccessError" when using HiveContext in Spark shell on AWS

2014-08-07 Thread Cheng Lian
Hey Zhun, Thanks for the detailed problem description. Please see my comments inlined below. On Thu, Aug 7, 2014 at 6:18 PM, Zhun Shen wrote: Caused by: java.lang.IllegalAccessError: tried to access method > com.google.common.collect.MapMaker.makeComputingMap(Lcom/google/common/base/Function;)L

Got error "java.lang.IllegalAccessError" when using HiveContext in Spark shell on AWS

2014-08-07 Thread Zhun Shen
Hi, when I try to use HiveContext in the Spark shell on AWS, I get the error "java.lang.IllegalAccessError: tried to access method com.google.common.collect.MapMaker.makeComputingMap(Lcom/google/common/base/Function;)Ljava/util/concurrent/ConcurrentMap". I followed the steps below to c…