Hi,
Correct me if I am wrong: looking at [1], I believe the error would be logged
when the interpreter restarts. In my case I did not see the error in the
log, so I assume no exception was caught in the try block and the
HiveContext was registered successfully.
Regards
Beh
[1] https://githu
Hi,
Setting zeppelin.spark.useHiveContext to 'true' is supposed to create a HiveContext.
If Zeppelin fails to create the HiveContext for some reason, it logs the error and
falls back to SQLContext. [1]
Could you take a look at the 'logs/zeppelin-interpreter-spark-*' logs and check for
"Can't create HiveContext. Fallback to S
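(For readers following along: the try/fallback behavior described above can be sketched roughly as below. This is an illustrative Scala sketch, not Zeppelin's actual code; the reflective class lookup stands in for whatever the interpreter does internally to construct the context.)

```scala
// Illustrative sketch of the "try HiveContext, fall back to SQLContext"
// pattern described above. Not Zeppelin's actual implementation.
def chooseContext(hiveContextClass: String): String =
  try {
    // Throws ClassNotFoundException if Spark was built without Hive support.
    Class.forName(hiveContextClass)
    "HiveContext"
  } catch {
    case e: Throwable =>
      println(s"Can't create HiveContext. Fallback to SQLContext (${e.getClass.getSimpleName})")
      "SQLContext"
  }

// On a classpath without the Spark Hive module this takes the fallback branch:
println(chooseContext("org.apache.spark.sql.hive.HiveContext"))
```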
Putting that gave me this -
:35: error: object hive is not a member of package org.apache.spark.sql
val hc = new org.apache.spark.sql.hive.HiveContext(sc)
I would have thought the dependencies would have been registered/included
with Zeppelin
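(Side note: that "object hive is not a member of package org.apache.spark.sql" error usually means the Spark build Zeppelin is pointing at was compiled without Hive support. In Spark 1.x the Hive classes are only included when Spark is built with the Hive profiles, e.g. the Maven invocation below from the Spark build docs; the exact profiles depend on your Spark version and cluster.)

```shell
# Build Spark 1.x with Hive support (profiles vary by version/cluster):
mvn -Pyarn -Phive -Phive-thriftserver -DskipTests clean package
```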
Regards
Beh
On Tue, Aug 9, 2016 at 9:05 AM, Mohit
You can create a hive context explicitly as follows:
val hc = new org.apache.spark.sql.hive.HiveContext(sc)
But this will most likely not work for %sql, as that uses the internal
context, which in your case appears to be the non-Hive version.
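As a concrete sketch of that suggestion (assuming Spark 1.x inside a Zeppelin %spark paragraph, where `sc` is the SparkContext the interpreter provides; the input path and table name below are hypothetical, and this needs a Spark build with Hive support, so it won't run outside a Spark shell or notebook):

```scala
// Inside a Zeppelin %spark paragraph; `sc` is provided by the interpreter.
// Requires a Spark build with Hive support, otherwise you get the
// "object hive is not a member of package org.apache.spark.sql" error.
val hc = new org.apache.spark.sql.hive.HiveContext(sc)

// DataFrames created through this explicit context can use saveAsTable:
val df = hc.read.json("/tmp/people.json")  // hypothetical input path
df.write.saveAsTable("people")             // persisted as a Hive table

// %sql paragraphs still go through Zeppelin's internal context, so to
// query the table from Scala, use the same hc:
hc.sql("SELECT * FROM people").show()
```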
If the config param you are using is not taking effect,
Hi,
I have set zeppelin.spark.useHiveContext to 'true' in the Spark interpreter, but
when I try to use saveAsTable, I get the following -
*java.lang.RuntimeException: Tables created with SQLContext must be
TEMPORARY. Use a HiveContext instead.*
In my notebook I am using sqlContext or is there ano