Yes, there is a duplicate. That functionality is slated for removal, so the 
duplicate will go away. If the SPARK_HOME environment variable is set, the 
Spark copy in the Zeppelin home folder is ignored.
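To make that concrete, a minimal sketch of pointing Zeppelin at an external 
Spark via conf/zeppelin-env.sh (the install path below is a hypothetical 
example, not a required location):

```shell
# conf/zeppelin-env.sh
# With SPARK_HOME set, Zeppelin uses this Spark install and ignores
# any Spark copy bundled under the Zeppelin home folder.
export SPARK_HOME=/opt/spark
```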

In other words, don’t worry about -Ppyspark; it doesn’t do anything relevant to 
you.



From: Hoc Phan <quang...@yahoo.com>
Reply: users@zeppelin.incubator.apache.org 
<users@zeppelin.incubator.apache.org>, Hoc Phan <quang...@yahoo.com>
Date: December 26, 2015 at 11:25:37 PM
To: users@zeppelin.incubator.apache.org <users@zeppelin.incubator.apache.org>
Subject:  Re: What is -Ppyspark for?  

What do you mean by "external" Spark? Does it mean a Spark folder already 
installed on the same machine? 
Can you confirm whether Zeppelin actually downloads and puts its own Spark in 
the Zeppelin home folder? If so, is there a potential duplicate? 


On Saturday, December 26, 2015 10:21 PM, Jongyoul Lee <jongy...@gmail.com> 
wrote:


According to bin/interpreter.sh, it's not necessary if you use an external 
Spark. But without an external Spark, you should build Zeppelin with -Ppyspark 
if you want to use PySpark.
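For reference, a sketch of such a build command; the Spark version profile here 
is an assumption for illustration, so adjust it to your environment:

```shell
# Build Zeppelin with PySpark support enabled at build time.
# -Pspark-1.5 is an example profile; pick the one matching your Spark.
mvn clean package -Pspark-1.5 -Ppyspark -DskipTests
```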

On Sun, Dec 27, 2015 at 10:15 AM, Amos B. Elberg <amos.elb...@me.com> wrote:
I just looked at this, and I’m confused about it too. 

It looks like this shouldn’t be necessary when we get rid of launching spark 
from a subdirectory of ZEPPELIN_HOME. 

It also looks like if -Ppyspark is specified, the installer downloads a second 
copy of the Spark binaries, but that’s it.    

Is any of this actually necessary?  It looks to me that -Ppyspark could come 
out of the build completely.  



From: Hoc Phan <quang...@yahoo.com>
Reply: users@zeppelin.incubator.apache.org 
<users@zeppelin.incubator.apache.org>, Hoc Phan <quang...@yahoo.com>
Date: December 26, 2015 at 7:21:21 PM
To: users@zeppelin.incubator.apache.org <users@zeppelin.incubator.apache.org>
Subject:  Re: What is -Ppyspark for?

Hi

Thanks for the response, but I am still unclear about this option. So when do 
we NOT use -Ppyspark? I guess I am confused by the statement "Instead of 
configuring pyspark manually.." in the old thread. Does it refer to setting the 
SPARK_HOME and PYSPARK env variables? 


On Thursday, December 24, 2015 3:18 AM, Renjith Kamath 
<renjith.kam...@gmail.com> wrote:


Hi Hoc,

Please have a look at the following related mail thread.
"Instead of configuring pyspark manually. To enable pyspark configuration 
during build, just provide profile name “Ppyspark” while building your zeppelin 
binaries."
http://apache-zeppelin-users-incubating-mailing-list.75479.x6.nabble.com/Zeppelin-on-CDH-td284.html#a285


On Thu, Dec 24, 2015 at 2:03 PM, Hoc Phan <quang...@yahoo.com> wrote:
When I build Zeppelin, when do I use -Ppyspark?

Does it mean Zeppelin will use pyspark instead of spark-submit? If so, what's 
the difference?






--
이종열, Jongyoul Lee, 李宗烈
http://madeng.net

