Hi All,
Thanks for the help! I will try it out.
Just one comment: from your mail it sounds like Zeppelin does not support
CDH, but the Zeppelin build page on GitHub has:

CDH 5.X
mvn clean package -Pspark-1.2 -Dhadoop.version=2.5.0-cdh5.3.0 -Phadoop-2.4 -DskipTests

So am I missing something?
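For anyone following along, the jar-swap workaround Kevin describes below
can be sketched roughly as follows (the CDH parcel path is only an example
assumption, not something from this thread; substitute wherever your
spark-assembly jar actually lives):

```shell
# Remove the Spark jars Zeppelin bundled at build time
rm zeppelin/interpreter/spark/*spark*.jar

# Copy in the assembly jar from your Spark install
# (example path below; adjust to your actual cluster layout)
cp /opt/cloudera/parcels/CDH/lib/spark/lib/spark-assembly-*.jar \
   zeppelin/interpreter/spark/
```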

Eran

On Wed, Mar 25, 2015 at 3:40 AM, Kevin (Sangwoo) Kim <[email protected]>
wrote:

> Support for CDH or a custom-built Spark can be achieved by providing a
> Spark assembly jar or a Spark home to Zeppelin (in the future..).
>
> But I guess we still need to build Spark inside of Zeppelin for local
> mode; we're seeking a nice way to integrate the two (supplying a Spark
> jar and building Spark inside of Zeppelin).
>
> For now, if you run into trouble with the Spark build, you can delete all
> the Spark-related dependencies in zeppelin/interpreter/spark,
> e.g. rm zeppelin/interpreter/spark/*spark*.jar,
>
> and copy spark-assembly-XXX.jar into zeppelin/interpreter/spark/
>
> Regards,
> Kevin
>
> On Wed, Mar 25, 2015 at 10:33 AM Jongyoul Lee <[email protected]> wrote:
>
>> Hi,
>>
>> CDH's Spark differs in version from the original Spark. I don't know
>> whether Zeppelin supports CDH Spark or not, because I haven't tested
>> Zeppelin with the CDH-distributed Spark, but you can try changing
>> spark.version to 1.2.0-cdh5.3.0: "-Dspark.version=1.2.0-cdh5.3.0".
>>
>> @Moon,
>>
>> Do you think we should dig into supporting CDH?
>>
>> Regards,
>> Jongyoul Lee
>>
>> On Wed, Mar 25, 2015 at 1:32 AM, IT CTO <[email protected]> wrote:
>>
>>> Yes, I am using the CDH 5.3 Spark service; we did not install Spark
>>> separately.
>>>
>>> I packaged zeppelin using this command:
>>>
>>> mvn clean package -Pspark-1.2 -Dhadoop.version=2.5.0-cdh5.3.0 -Phadoop-2.4 -DskipTests
>>>
>>> and running on the internal server we have.
>>> Whether I leave the master unchanged in the config file or point it to
>>> spark://yourserver:7077, in both cases I get an error that
>>> "org.apache.spark.sql.catalyst.ScalaReflection" is not found - is that
>>> a Spark 1.2 issue? A classpath issue?
>>>
>>>
>>> Eran
>>>
>>> On Tue, Mar 24, 2015 at 5:10 PM, Jongyoul Lee <[email protected]>
>>> wrote:
>>>
>>>> Are you using the CDH version of Spark? Vanilla Spark and CDH's don't
>>>> match in some packages' versions. spark://yourserver:7077 is correct.
>>>>
>>>> On Tue, Mar 24, 2015 at 11:11 PM, IT CTO <[email protected]> wrote:
>>>>
>>>>> OK, if I edit the JSON file I get the right values.
>>>>> What should be the master value for it to use the external service?
>>>>>
>>>>> I tried spark://myserver:7077, but I don't see any activity on my
>>>>> Spark server monitor, and the code also fails with some Scala
>>>>> reflection error.
>>>>> Eran
>>>>>
>>>>> On Tue, Mar 24, 2015 at 3:58 PM, Jongyoul Lee <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> Could you check your conf/interpreter.json? This is the
>>>>>> configuration file behind the interpreter tab. When I edit and save
>>>>>> it, the change persists in my case.
>>>>>>
>>>>>> JL
>>>>>>
>>>>>> On Tue, Mar 24, 2015 at 10:52 PM, IT CTO <[email protected]> wrote:
>>>>>>
>>>>>>> I am having the same problem... using CDH 5.3 with the Spark
>>>>>>> service running. I have no problem running spark-shell and
>>>>>>> executing jobs, but I can't get Zeppelin to use Spark on my CDH.
>>>>>>> BTW, I am using the packaged Zeppelin (i.e. built with mvn clean
>>>>>>> package with the CDH 5.3 parameters).
>>>>>>> The interpreter tab keeps showing master = local[*].
>>>>>>> Eran
>>>>>>>
>>>>>>> On Tue, Mar 24, 2015 at 3:00 PM, RJ Nowling <[email protected]>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi Babeena,
>>>>>>>>
>>>>>>>> Have you been able to use the Spark shell on the command line?
>>>>>>>> This looks more like an issue with your Spark setup than with
>>>>>>>> Zeppelin.
>>>>>>>>
>>>>>>>> RJ
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mar 24, 2015, at 12:30 AM, <[email protected]> <
>>>>>>>> [email protected]> wrote:
>>>>>>>>
>>>>>>>>  Hi all,
>>>>>>>>
>>>>>>>>
>>>>>>>>  I wanted to connect Zeppelin to an external Spark. Is there any
>>>>>>>> documentation regarding this?
>>>>>>>>
>>>>>>>>  I have changed the interpreter master to the master URL of the
>>>>>>>> external Spark.
>>>>>>>>
>>>>>>>>  Local mode works fine; with the external Spark I am getting an
>>>>>>>> error: 'Job aborted due to stage failure: Task 0 in stage 1.0
>>>>>>>> failed 4 times, most recent failure: Lost task 0.3 in stage 1.0
>>>>>>>> (TID 7, harmonix): ExecutorLostFailure (executor lost) Driver
>>>>>>>> stacktrace:'
>>>>>>>>
>>>>>>>>
>>>>>>>>  Thanks,
>>>>>>>>
>>>>>>>> Babeena
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Eran | CTO
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> 이종열, Jongyoul Lee, 李宗烈
>>>>>> http://madeng.net
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Eran | CTO
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> 이종열, Jongyoul Lee, 李宗烈
>>>> http://madeng.net
>>>>
>>>
>>>
>>>
>>> --
>>> Eran | CTO
>>>
>>
>>
>>
>> --
>> 이종열, Jongyoul Lee, 李宗烈
>> http://madeng.net
>>
>


-- 
Eran | CTO
