Hi,

With Spark 1.3, you need to deploy Spark's Python libraries to every node
yourself, or add the path of an existing Spark installation on every node.
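As a minimal sketch of the second option, the PYTHONPATH entries the pyspark interpreter needs can be derived from SPARK_HOME; this assumes the standard Spark 1.3 layout (the py4j zip name varies by Spark release, and the paths here are illustrative):

```python
import os

def spark_pythonpath(spark_home):
    """Build the PYTHONPATH entries pyspark needs, assuming the
    standard Spark 1.3 layout (python/ plus the bundled py4j zip).
    The py4j version shown is the one shipped with Spark 1.3."""
    return os.pathsep.join([
        os.path.join(spark_home, "python"),
        os.path.join(spark_home, "python", "lib", "py4j-0.8.2.1-src.zip"),
    ])

# e.g. export PYTHONPATH="$(python -c ...)" in conf/zeppelin-env.sh
print(spark_pythonpath("/opt/spark"))
```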

Regards,
JL

On Sat, Jul 25, 2015 at 2:54 AM, Vadla, Karthik <[email protected]>
wrote:

>  Sounds convincing, Let me give a try.
>
>
>
> Thanks
>
> Karthik
>
>
>
> *From:* IT CTO [mailto:[email protected]]
> *Sent:* Friday, July 24, 2015 10:53 AM
> *To:* [email protected]
> *Subject:* Re: Pyspark is not responding & Hive connection error
>
>
>
> I don't know if this is the cause of your problem, but if you want to move
> Zeppelin to another machine you should build Zeppelin with the -Pbuild-distr
> option; you will then find a zip/tar in the distribution directory which is
> ready to be deployed on another machine.
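The suggested flow could look like this; the profiles mirror Karthik's original build command, and the archive name and target directory are illustrative (they depend on the Zeppelin version):

```shell
# Rebuild with the distribution profile added (same profiles as before)
mvn clean package -Pspark-1.3 -Ppyspark -Dhadoop.version=2.6.0-cdh5.4.2 \
    -Phadoop-2.6 -Pbuild-distr -DskipTests

# The deployable archive lands under the distribution module's target dir
ls zeppelin-distribution/target/

# Copy it to the other machine and unpack it there (host/path illustrative)
scp zeppelin-distribution/target/zeppelin-*.tar.gz otherhost:/opt/
```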
>
> Eran
>
>
>
> On Fri, Jul 24, 2015 at 8:46 PM Vadla, Karthik <[email protected]>
> wrote:
>
>  Hello All,
>
>
>
> I have built my binaries on one of the clusters, with the pyspark profile:
>
> *mvn clean package -Pspark-1.3 -Ppyspark -Dhadoop.version=2.6.0-cdh5.4.2
> -Phadoop-2.6 -DskipTests*
>
> Everything was working fine when I tested on same machine.
>
>
>
> Now I have copied the same binaries to another machine and started running
> Zeppelin.
>
>
>
> It gives me the errors below.
>
>
>
> For pyspark:
>
>
>
> And for %hive:
>
>
>
> Can anyone help me with this? What is causing the issue?
>
>
>
> Thanks
>
> Karthik
>
>


-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net
