Putting that in gave me this:

    <console>:35: error: object hive is not a member of package org.apache.spark.sql
           val hc = new org.apache.spark.sql.hive.HiveContext(sc)

I would have thought the dependencies would have been registered/included
with Zeppelin.

Regards
Beh

On Tue, Aug 9, 2016 at 9:05 AM, Mohit Jaggi <mohitja...@gmail.com> wrote:

> You can create a Hive context explicitly as follows:
>
>     val hc = new org.apache.spark.sql.hive.HiveContext(sc)
>
> But this will most likely not work for %sql, as that will use the internal
> context, which for you appears to be the non-Hive version.
>
> If the config param you are using is not taking effect, try restarting the
> Spark interpreter.
>
> Mohit
> www.dataorchardllc.com
>
> On Aug 8, 2016, at 1:49 PM, Teik Hooi Beh <th...@thbeh.com> wrote:
>
>> Hi,
>>
>> I have set zeppelin.spark.useHiveContext to 'true' in the Spark
>> interpreter, but when I try to use saveAsTable I get the following:
>>
>> java.lang.RuntimeException: Tables created with SQLContext must be
>> TEMPORARY. Use a HiveContext instead.
>>
>> In my notebook I am using sqlContext; is there another way to use
>> HiveContext?
>>
>> Thanks
>>
>> Regards
>> Beh
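For anyone hitting the same error: a minimal sketch of what the explicit HiveContext route looks like in a Spark 1.x notebook, assuming `sc` is the SparkContext Zeppelin provides and the spark-hive module is on the interpreter's classpath (the `object hive is not a member` error above means it is not). The table name and data here are made up for illustration.

```scala
// Requires Spark built/deployed with Hive support (spark-hive on the
// classpath); otherwise this line fails to compile in the notebook.
val hc = new org.apache.spark.sql.hive.HiveContext(sc)

// Hypothetical example data.
val df = hc.createDataFrame(Seq((1, "a"), (2, "b"))).toDF("id", "value")

// saveAsTable on a non-temporary table needs a HiveContext; with a plain
// SQLContext it raises "Tables created with SQLContext must be TEMPORARY".
df.write.saveAsTable("example_table")
```

Note that, as Mohit says, this only helps code you write against `hc` yourself; %sql paragraphs still go through Zeppelin's internal context, which is controlled by the `zeppelin.spark.useHiveContext` interpreter setting.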