This is the same link that I followed. According to it, for Spark 2.3.0 we
need to use Hive master instead of Hive 3.0.0. We also need to build a
custom Spark distribution without the Hive dependencies, but even after
trying all of this I still run into compatibility issues.
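
For reference, this is roughly what I did (a sketch of my setup; the
profile list, paths, and master URL below are specific to my environment
and may need adjusting for other Spark/Hadoop versions):

    # Build a Spark 2.3.0 distribution without the Hive dependencies
    # (profiles based on the "Hive on Spark: Getting Started" wiki;
    # adjust the Hadoop version/profiles to match your cluster):
    ./dev/make-distribution.sh --name "hadoop2-without-hive" --tgz \
        "-Pyarn,hadoop-provided,hadoop-2.7,parquet-provided,orc-provided"

    # Point Hive at that Spark build and switch the execution engine
    # (the spark.home path and spark.master value are from my setup):
    hive> set spark.home=/path/to/spark-2.3.0-bin-hadoop2-without-hive;
    hive> set hive.execution.engine=spark;
    hive> set spark.master=yarn;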


Regards,
Sachin Janani

On Tue, Jun 19, 2018 at 5:02 PM, Sahil Takiar <takiar.sa...@gmail.com> wrote:
> Yes, Hive 3.0.0 works with Spark 2.3.0 - this section of the wiki has
> details on which Hive releases support which Spark versions.
>
> On Tue, Jun 19, 2018 at 5:59 AM, Sachin janani <sachin.janani...@gmail.com>
> wrote:
>>
>> Hi,
>> I am trying to run hive on spark by following the steps mentioned
>> here-
>> https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
>> , but I am getting many compatibility issues like NoSuchMethodError,
>> NoSuchFieldException, etc. So I just want to know whether it works and
>> whether someone has tried it out.
>>
>>
>> Thanks and Regards,
>> --
>> Sachin Janani
>
>
>
>
> --
> Sahil Takiar
> Software Engineer
> takiar.sa...@gmail.com | (510) 673-0309



-- 
Sachin Janani
