Is it possible to just downgrade Spark?
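
Before downgrading, one quick sanity check (a minimal sketch; it assumes you can run a Scala paragraph in Zeppelin's Spark interpreter) is to print the Scala version the interpreter is actually running on, and compare it with the Scala version the Cassandra interpreter jars were built against:

```scala
// Minimal sketch: print the Scala version of the running interpreter.
// Compare this against the Scala version the Cassandra interpreter
// (and its scala-reflect dependency) was built with.
println(scala.util.Properties.versionString)        // e.g. "version 2.11.12"
println(scala.util.Properties.versionNumberString)  // e.g. "2.11.12"
```

If the two sides disagree (for example 2.11 vs 2.12), that would explain the interpreter error.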

shyla deshpande <deshpandesh...@gmail.com> wrote on Mon, Dec 24, 2018 at 2:23 PM:

> Hi Jeff,
> I am using emr-5.20.0, which comes with the applications Spark 2.4.0 and
> Zeppelin 0.8.0.
> Are you suggesting I downgrade the emr version?
> Thanks
> -Shyla
>
>
> On Sun, Dec 23, 2018 at 3:40 PM Jeff Zhang <zjf...@gmail.com> wrote:
>
>>
>> I see that the scope of scala-reflect in pom.xml is runtime. Could you
>> also try the Apache Zeppelin release instead of the AWS version? I suspect
>> it may be a bug in Apache Zeppelin:
>>
>> https://github.com/apache/zeppelin/blob/master/cassandra/pom.xml#L109
>>
>> shyla deshpande <deshpandesh...@gmail.com> wrote on Mon, Dec 24, 2018 at 3:29 AM:
>>
>>> Hi Jeff,
>>> Thank you for your response. My cluster is on AWS EMR. Since it is all
>>> preconfigured, I assumed there wouldn't be any version mismatch.
>>> Could you please give me more information on where I should be looking?
>>>
>>> Thanks
>>> -Shyla
>>>
>>> On Sat, Dec 22, 2018 at 10:49 PM Jeff Zhang <zjf...@gmail.com> wrote:
>>>
>>>> Are you using the binary version or building it from source? It seems
>>>> to be due to a Scala version issue.
>>>>
>>>> shyla deshpande <deshpandesh...@gmail.com> wrote on Sun, Dec 23, 2018 at 2:08 PM:
>>>>
>>>>> The Cassandra interpreter on emr-5.20.0 gives an error:
>>>>> [image: CassandraInterpreterError.png]
>>>>> Thanks
>>>>> -Shyla
>>>>>
>>>>>
>>>>
>>>> --
>>>> Best Regards
>>>>
>>>> Jeff Zhang
>>>>
>>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>

-- 
Best Regards

Jeff Zhang
