Which release of Spark are you using?
Can you show the full error trace?
Thanks
On Tue, Jun 14, 2016 at 6:33 PM, Tejaswini Buche <
tejaswini.buche0...@gmail.com> wrote:
> I am trying to use HiveContext in Spark. The following statements run
> fine:
>
> from pyspark.sql import HiveContext
I am trying to use HiveContext in Spark. The following statements run
fine:
from pyspark.sql import HiveContext
sqlContext = HiveContext(sc)
But when I run the statement below,
sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
I get the following error:
Java P
Hi Mich,
thanks a ton for your kind response, but this error was happening because
the Derby classes were being loaded more than once.
In my second email I mentioned the steps that I took in order to resolve
the issue.
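For anyone hitting the same thing, here is a minimal sketch of the pattern I
mean (assuming Spark 1.5+, where SparkContext.getOrCreate is available): reuse
a single context per JVM so the Derby-backed metastore classes are loaded only
once. The app name below is just a placeholder.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// getOrCreate returns the already-running SparkContext if one exists,
// instead of constructing a second one (which can pull Derby in twice).
val conf = new SparkConf().setAppName("single-context-demo") // hypothetical name
val sc = SparkContext.getOrCreate(conf)
val hiveCtx = new HiveContext(sc)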
Thanks and Regards,
Gourav
On Tue, Mar 1, 2016 at 8:54 PM, Mich Talebzadeh
wrote:
> Hi
Hi,
FIRST ATTEMPT:
Used build.sbt in IntelliJ, and it gave me nightmares with several
incompatibility and library issues, even though the sbt version was compliant
with the Scala version (see the build.sbt sketch after these notes).
SECOND ATTEMPT:
Created a new project with no entries in the build.sbt file and imported all
the files in $SPARK_HOME
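For reference, a hypothetical minimal build.sbt for this kind of setup; the
versions below are assumptions and must match the Scala version your Spark
distribution was built against.

// Hypothetical versions; align scalaVersion with your Spark build (2.10 vs 2.11).
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  // "provided" because the Spark jars come from the cluster at runtime.
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-hive" % "1.6.1" % "provided"
)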
Hi,
I am getting the error "java.lang.SecurityException: sealing violation:
can't seal package org.apache.derby.impl.services.locks: already loaded"
after running the following code in Scala.
I do not have any other instances of sparkContext running from my system.
I will be grateful if a
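Since this sealed-package violation usually means the Derby classes are
visible more than once, a quick check (a hypothetical snippet, runnable from
the Scala shell) is to scan the JVM classpath for duplicate Derby entries:

// Print every classpath entry that mentions Derby; more than one hit
// (or the same jar seen by two loaders) is the usual cause of the violation.
System.getProperty("java.class.path")
  .split(java.io.File.pathSeparator)
  .filter(_.toLowerCase.contains("derby"))
  .foreach(println)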
Hello,
I am trying to define an external Hive table from the Spark HiveContext, like
the following:
import org.apache.spark.sql.hive.HiveContext
val hiveCtx = new HiveContext(sc)
hiveCtx.sql(s"""CREATE EXTERNAL TABLE IF NOT EXISTS Rentrak_Ratings (Version
string, Gen_Date string, Market_Number s
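The statement above is cut off in the archive; as a minimal sketch of the
general shape, here is one way such a definition can look (the third column's
type, the delimiter, and the HDFS location below are hypothetical, not from
the original message):

import org.apache.spark.sql.hive.HiveContext

val hiveCtx = new HiveContext(sc)
// Hypothetical completion: only Version and Gen_Date come from the original.
hiveCtx.sql(
  """CREATE EXTERNAL TABLE IF NOT EXISTS Rentrak_Ratings (
    |  Version string,
    |  Gen_Date string,
    |  Market_Number string
    |)
    |ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    |LOCATION '/user/hive/external/rentrak_ratings'""".stripMargin)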