OK, thanks a lot for the info. I will try installing Hive and getting it to work.
I will reach out to you if I hit any roadblocks.
On Tue, Apr 11, 2023 at 1:43 PM Jeff Zhang wrote:
For local mode, if you don't have Hive installed, then you cannot use
hiveContext.
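If installing Hive locally is not an option, the alternative is to turn off the Hive context in the Zeppelin Spark interpreter settings so the interpreter falls back to SQLContext. This uses the "zeppelin.spark.useHiveContext" property mentioned later in this thread; treat it as a pointer, not a verified fix:

```
# In the Zeppelin UI: Interpreter -> spark -> edit this property
zeppelin.spark.useHiveContext = false
```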
On Tue, Apr 11, 2023 at 4:00 PM VIVEK NARAYANASETTY wrote:
Hi Jeff,
This is on my local system. I am using the Zeppelin Docker image and passing
the SPARK_HOME directory as below.
-v C:\spark-3.1.3-bin-hadoop3.2:/opt/spark -e SPARK_HOME=/opt/spark
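For context, a full docker run invocation along these lines might look like the sketch below. Only the -v mount and -e SPARK_HOME flags come from the message above; the image tag and port mapping are assumptions:

```shell
# Hypothetical full command; image tag and -p mapping are assumptions,
# only the -v and -e flags are taken from this thread.
docker run -p 8080:8080 -v C:\spark-3.1.3-bin-hadoop3.2:/opt/spark -e SPARK_HOME=/opt/spark apache/zeppelin:0.10.1
```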
On Tue, Apr 11, 2023 at 12:22 PM Jeff Zhang wrote:
Ask your hadoop cluster admin for the hive-site.xml you should use.
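For reference, a minimal hive-site.xml that points Spark at a remote metastore typically looks like the sketch below. The host and port are placeholders; the file from your admin will contain the real values and likely additional properties:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Address of the Hive metastore service; host/port are placeholders -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```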
On Tue, Apr 11, 2023 at 2:19 PM VIVEK NARAYANASETTY wrote:
Hi Jeff,
Thanks a lot for responding. I have placed the hive-site.xml file in the
spark/conf folder, but now I am encountering a new error.
org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClie
Have you put hive-site.xml under SPARK_CONF_DIR?
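Concretely, that usually means copying the file into Spark's conf directory, for example as below. The paths are illustrative; adjust them to your setup:

```shell
# $SPARK_HOME is assumed to point at your Spark distribution;
# change the source path to wherever your hive-site.xml lives.
cp /path/to/hive-site.xml "$SPARK_HOME/conf/hive-site.xml"
```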
On Tue, Apr 11, 2023 at 1:01 PM VIVEK NARAYANASETTY wrote:
Hi Users,
I would appreciate any leads on the issue below.
On Sat, Apr 8, 2023 at 1:33 PM VIVEK NARAYANASETTY wrote:
Hello Everyone,
I am trying to create some tables using Spark SQL and am encountering the
below error in Zeppelin. When I debugged, I could see that
"zeppelin.spark.useHiveContext" is set to true, which means the Hive context
needs to be used in place of SQLContext, but it doesn't look like it is able
to use/initialize it.
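In Spark 3.x, the Hive-context path corresponds to building the SparkSession with Hive support enabled, which is roughly what Zeppelin does when zeppelin.spark.useHiveContext is true. A minimal standalone sketch of that path is below; it assumes a pyspark installation and a usable metastore, and is not the exact code Zeppelin itself runs:

```python
# Sketch only: requires pyspark and, for Hive support, a reachable
# metastore (or a local Derby one). Not the code Zeppelin runs internally.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-context-check")  # hypothetical app name
    .enableHiveSupport()            # fails without a usable metastore
    .getOrCreate()
)

# A table-creation statement like the one failing in Zeppelin
spark.sql("CREATE TABLE IF NOT EXISTS demo (id INT) USING parquet")
```

If enableHiveSupport() fails the same way outside Zeppelin, the problem is in the Spark/Hive configuration rather than in Zeppelin itself.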