Yes, I built it the same way you suggested, but no luck.

Regards,
Sachin Janani

On Tue, Jun 19, 2018 at 7:13 PM, Sahil Takiar <takiar.sa...@gmail.com>
wrote:

> You should be building Spark without Hive. For Spark 2.3.0, the command is:
>
> ./dev/make-distribution.sh --name "hadoop2-without-hive" --tgz
> "-Pyarn,hadoop-provided,hadoop-2.7,parquet-provided,orc-provided"
>
> If you check the distribution after running the command, it shouldn't
> contain any Hive jars.
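
A quick way to confirm this is to list the jars in the unpacked distribution. This is a sketch; it assumes the tarball produced by make-distribution.sh was extracted to ./dist, so adjust the path for your layout:

```shell
# List any Hive jars bundled in the freshly built Spark distribution.
# No output from grep (and the fallback message) means the
# hadoop-provided build correctly excluded Hive.
ls dist/jars 2>/dev/null | grep -i hive || echo "no Hive jars found"
```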
>
> On Tue, Jun 19, 2018 at 7:18 AM, Sachin janani <sachin.janani...@gmail.com
> > wrote:
>
>> It shows the following exception:
>>
>> java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
>>     at org.apache.spark.sql.hive.HiveUtils$.formatTimeVarsForHiveClient(HiveUtils.scala:205)
>>     at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
>>     at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
>>     at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
>>
>>
>> After looking at jira SPARK-13446
>> <https://issues.apache.org/jira/browse/SPARK-13446>, it seems this was
>> fixed, but as per the source code it is not. I worked around it by
>> changing the Spark code and rebuilding the Spark binaries, but now it
>> shows a new error, NoSuchMethodError. My preliminary investigation
>> suggests that Spark is built with Hive 1.2.1, which is causing these
>> issues. Can you please let me know if I am missing anything?
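>>
>> One way to pinpoint this is to check whether the HiveConf class on the
>> classpath still defines HIVE_STATS_JDBC_TIMEOUT, the field Spark 2.3.0's
>> bundled Hive 1.2.1 client code expects (I believe it was removed from
>> HiveConf.ConfVars in Hive 2.0). A sketch; the jar name below is
>> illustrative, so point it at the hive-common jar you actually have:

```shell
# Look for the HIVE_STATS_JDBC_TIMEOUT constant in HiveConf$ConfVars.
# If it is absent from the Hive jar Spark loads, the NoSuchFieldError
# above is expected. Jar path is a placeholder, not a known location.
JAR=hive-common-3.0.0.jar
if [ -f "$JAR" ]; then
  javap -classpath "$JAR" 'org.apache.hadoop.hive.conf.HiveConf$ConfVars' \
    | grep HIVE_STATS_JDBC_TIMEOUT || echo "field not present"
else
  echo "jar not found: $JAR"
fi
```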
>>
>>
>> Regards,
>> Sachin Janani
>>
>> On Tue, Jun 19, 2018 at 5:38 PM, Sahil Takiar <takiar.sa...@gmail.com>
>> wrote:
>>
>>> I updated the doc to reflect that Hive 3.0.0 works with Spark 2.3.0.
>>> What issues are you seeing?
>>>
>>> On Tue, Jun 19, 2018 at 7:03 AM, Sachin janani <
>>> sachin.janani...@gmail.com> wrote:
>>>
>>>> This is the same link that I followed. As per that link, for
>>>> Spark 2.3.0 we need to use the Hive master branch instead of Hive
>>>> 3.0.0. We also need to custom-build Spark without Hive dependencies,
>>>> but after trying all this it shows some compatibility issues.
>>>>
>>>>
>>>> Regards,
>>>> Sachin Janani
>>>>
>>>> On Tue, Jun 19, 2018 at 5:02 PM, Sahil Takiar <takiar.sa...@gmail.com>
>>>> wrote:
>>>> > Yes, Hive 3.0.0 works with Spark 2.3.0 - this section of the wiki has
>>>> > details on which Hive releases support which Spark versions.
>>>> >
>>>> > On Tue, Jun 19, 2018 at 5:59 AM, Sachin janani <
>>>> sachin.janani...@gmail.com>
>>>> > wrote:
>>>> >>
>>>> >> Hi,
>>>> >> Hi,
>>>> >> I am trying to run Hive on Spark by following the steps mentioned
>>>> >> here:
>>>> >> https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
>>>> >> but I am getting many compatibility issues like NoSuchMethodError,
>>>> >> NoSuchFieldError, etc. I just need to know whether it works and
>>>> >> whether someone has tried it out.
>>>> >>
>>>> >>
>>>> >> Thanks and Regards,
>>>> >> --
>>>> >> Sachin Janani
>>>> >
>>>> >
>>>> >
>>>> >
>>>> > --
>>>> > Sahil Takiar
>>>> > Software Engineer
>>>> > takiar.sa...@gmail.com | (510) 673-0309
>>>>
>>>>
>>>>
>>>> --
>>>> Sachin Janani
>>>>
>>>
>>>
>>>
>>> --
>>> Sahil Takiar
>>> Software Engineer
>>> takiar.sa...@gmail.com | (510) 673-0309
>>>
>>
>>
>>
>> --
>> *Sachin Janani*
>>
>>
>
>
>
> --
> Sahil Takiar
> Software Engineer
> takiar.sa...@gmail.com | (510) 673-0309
>



-- 
*Sachin Janani*
