From: Xin Wu [mailto:xwu0...@gmail.com]
Sent: 13 January 2017 12:43
To: Nicolas Tallineau
Cc: user@spark.apache.org
Subject: Re: [Spark SQL - Scala] TestHive not working in Spark 2
I used the following:

val testHive = new org.apache.spark.sql.hive.test.TestHiveContext(sc, false)
val hiveClient = testHive.sessionState.metadataHive
hiveClient.runSqlHive("…")
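For context, here is one way the snippet above might be wired into a small standalone program. This is a sketch, not a tested recipe: it assumes the spark-hive test-jar is on the classpath, and the local-mode SparkConf and the "SHOW TABLES" statement are illustrative choices, not part of the original answer.

```scala
// Sketch only: assumes a Spark 2 build with the spark-hive test-jar available.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.test.TestHiveContext

object TestHiveExample {
  def main(args: Array[String]): Unit = {
    // Local-mode config for illustration.
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("TestHiveExample")
    val sc = new SparkContext(conf)

    // Passing loadTestTables = false skips loading the built-in
    // "test tables", which is what triggers the NPE described below.
    val testHive = new TestHiveContext(sc, false)
    val hiveClient = testHive.sessionState.metadataHive

    // Illustrative statement; substitute your own HiveQL.
    hiveClient.runSqlHive("SHOW TABLES")

    sc.stop()
  }
}
```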
On Fri, Jan 13, 2017 at 6:40 AM, Nicolas Tallineau <
nicolas.tallin...@ubisoft.com> wrote:
> I get a NullPointerException as soon as I try to execute a TestHive.sql(...)
> statement since migrating to Spark 2, because it's trying to load non-existing
> "test tables". I couldn't find a way to switch the loadTestTables variable
> to false.
>
> Caused by: sbt.ForkMain$ForkError: java.lang.NullPointe