Hi,

Correct me if I am wrong. Looking at [1], I believe the error would be
logged when the interpreter restarts. In my case I did not see that error
in the log, so I would assume that the try block did not catch any
exception and the HiveContext was registered successfully.
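
The try/catch in [1] follows roughly this pattern (a minimal sketch in
Scala; the actual Zeppelin source is Java and loads the HiveContext via
reflection, so the details differ):

  import org.apache.spark.SparkContext
  import org.apache.spark.sql.SQLContext

  // Rough sketch of the create-or-fallback behavior referenced in [1].
  def createSQLContext(sc: SparkContext, useHiveContext: Boolean): SQLContext =
    if (useHiveContext) {
      try {
        new org.apache.spark.sql.hive.HiveContext(sc)
      } catch {
        case e: Throwable =>
          // This branch produces the "Fallback to SQLContext" log line.
          println("Can't create HiveContext. Fallback to SQLContext")
          new SQLContext(sc)
      }
    } else {
      new SQLContext(sc)
    }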

Regards
Beh

[1] https://github.com/apache/zeppelin/blob/v0.6.0/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L197


On Tue, Aug 9, 2016 at 11:46 AM, moon soo Lee <m...@apache.org> wrote:

> Hi,
>
> Setting zeppelin.spark.useHiveContext to 'true' is supposed to create a
> HiveContext. If Zeppelin fails to create the HiveContext for some reason,
> it logs the error and falls back to SQLContext. [1]
>
> If you take a look at the 'logs/zeppelin-interpreter-spark-*' logs and find
> the "Can't create HiveContext. Fallback to SQLContext" message, you might
> get some clue as to why the HiveContext is not being created.
>
> Thanks,
> moon
>
> [1] https://github.com/apache/zeppelin/blob/v0.6.0/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L197
>
>
> On Mon, Aug 8, 2016 at 3:37 PM Teik Hooi Beh <th...@thbeh.com> wrote:
>
>> Putting that gave me this -
>>
>> <console>:35: error: object hive is not a member of package org.apache.spark.sql
>>   val hc = new org.apache.spark.sql.hive.HiveContext(sc)
>>
>> I would have thought the dependencies would have been registered/included
>> with Zeppelin.
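>>
>> If it is just a missing artifact, one thing I could try is loading it
>> through the %dep interpreter before the Spark interpreter starts (a
>> sketch; the Scala suffix and version are guesses and would have to match
>> my actual Spark build):
>>
>>   %dep
>>   z.reset()
>>   z.load("org.apache.spark:spark-hive_2.10:1.6.1")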
>>
>> Regards
>> Beh
>>
>> On Tue, Aug 9, 2016 at 9:05 AM, Mohit Jaggi <mohitja...@gmail.com> wrote:
>>
>>> You can create a hive context explicitly as follows:
>>>
>>>  val hc = new org.apache.spark.sql.hive.HiveContext(sc)
>>>
>>> But this will most likely not work for %sql, as that uses the internal
>>> context, which in your case appears to be the non-Hive version.
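>>>
>>> For instance (a sketch; the input path and table name are placeholders,
>>> and the DataFrame has to come from the HiveContext, since saveAsTable
>>> goes through whichever context created the DataFrame):
>>>
>>>  val hc = new org.apache.spark.sql.hive.HiveContext(sc)
>>>  val df = hc.read.json("/tmp/events.json") // placeholder input path
>>>  df.write.saveAsTable("events")            // permanent (non-TEMPORARY) table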
>>>
>>> If the config param you are using is not taking effect, try restarting
>>> the spark interpreter.
>>>
>>>
>>> Mohit
>>> www.dataorchardllc.com
>>>
>>> On Aug 8, 2016, at 1:49 PM, Teik Hooi Beh <th...@thbeh.com> wrote:
>>>
>>> Hi,
>>>
>>> I have set zeppelin.spark.useHiveContext to 'true' in the Spark
>>> interpreter, but when I try to use saveAsTable I get the following -
>>>
>>> *java.lang.RuntimeException: Tables created with SQLContext must be
>>> TEMPORARY. Use a HiveContext instead.*
>>>
>>>
>>> In my notebook I am using sqlContext; is there another way to use the
>>> HiveContext?
>>>
>>> Thanks
>>>
>>> Regards
>>> Beh
>>>
>>>
>>>
>>
