Hello,

I tried to use z.loadAndDist() but it says:

<console>:17: error: value loadAndDist is not a member of
org.apache.zeppelin.spark.dep.DependencyContext

Any idea what this method is for?
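
For context, the %dep paragraph I have been using (from the thread below) is:

%dep
z.reset()
z.load("file:///home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar")

I was hoping loadAndDist would also distribute the jar to the executors,
but the error above suggests no such method exists.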


regards
Bala

On 25 January 2016 at 15:34, Balachandar R.A. <balachandar...@gmail.com>
wrote:

> Hello,
>
> I have run the code in spark-shell successfully, but the jar files were all
> specified in the config file (spark-defaults.conf). However, I will not be
> able to use z.load() in spark-shell, will I? I am sorry, but I did not quite
> follow the idea of testing with spark-shell. Wail's suggestion is to create
> a fat jar? I will give it a try, but how do I make sure this fat jar is
> accessible to the Spark executors? Anyway, I will keep you posted on this.
>
> regards
> Bala
>
> On 25 January 2016 at 13:39, Hyung Sung Shim <hss...@nflabs.com> wrote:
>
>> Hello.
>> I think Wail Alkowaileet's suggestion is plausible.
>> Balachandar, could you try running your application with spark-shell?
>>
>>
>> 2016-01-25 15:45 GMT+09:00 Wail Alkowaileet <wael....@gmail.com>:
>>
>>> I used z.load in my case and it seems to work just fine.
>>> Can you try spark-shell with your jar file and see what the error is?
>>>
>>> I assume the problem is that your application requires third-party jars.
>>> Therefore, you need to build your app with 'assembly' (a fat jar).
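>>>
>>> As a sketch (assuming an sbt build; the plugin version shown is just
>>> illustrative), adding sbt-assembly looks like:
>>>
>>> // project/plugins.sbt -- enables the sbt-assembly plugin
>>> addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.0")
>>>
>>> Running `sbt assembly` then produces one fat jar under target/scala-*/
>>> that bundles your third-party dependencies along with your own classes.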
>>>
>>>
>>> On Mon, Jan 25, 2016 at 9:39 AM, Balachandar R.A. <
>>> balachandar...@gmail.com> wrote:
>>>
>>>> Hello Hyung,
>>>>
>>>> There is nothing more I can make out from the error log; it shows a
>>>> plain ClassNotFoundException.
>>>>
>>>> On 25 January 2016 at 11:34, Hyung Sung Shim <hss...@nflabs.com> wrote:
>>>>
>>>>> That is weird, so could you send the error log for details?
>>>>>
>>>>> 2016-01-25 15:00 GMT+09:00 Balachandar R.A. <balachandar...@gmail.com>:
>>>>>
>>>>>> Hi Hyung,
>>>>>>
>>>>>> Thanks for the response. I have tried this, but it did not work.
>>>>>>
>>>>>> regards
>>>>>> Bala
>>>>>>
>>>>>> On 25 January 2016 at 11:27, Hyung Sung Shim <hss...@nflabs.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hello. Balachandar.
>>>>>>> Regarding the third option you tried: the %dep paragraph must be
>>>>>>> executed first in the notebook, before the Spark interpreter starts.
>>>>>>> Could you try restarting Zeppelin and running the "%dep z.load()"
>>>>>>> paragraph first?
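>>>>>>>
>>>>>>> For example, right after a restart, the very first paragraph you run
>>>>>>> should be the dependency loader (path taken from your original mail):
>>>>>>>
>>>>>>> %dep
>>>>>>> z.reset()
>>>>>>> z.load("file:///home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar")
>>>>>>>
>>>>>>> and only after that the paragraphs that import and use the class.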
>>>>>>>
>>>>>>>
>>>>>>> 2016-01-25 14:39 GMT+09:00 Balachandar R.A. <
>>>>>>> balachandar...@gmail.com>:
>>>>>>>
>>>>>>>> Hi
>>>>>>>>
>>>>>>>> Any help would be greatly appreciated :-)
>>>>>>>>
>>>>>>>>
>>>>>>>> ---------- Forwarded message ----------
>>>>>>>> From: Balachandar R.A. <balachandar...@gmail.com>
>>>>>>>> Date: 21 January 2016 at 14:11
>>>>>>>> Subject: Providing third party jar files to spark
>>>>>>>> To: users@zeppelin.incubator.apache.org
>>>>>>>>
>>>>>>>>
>>>>>>>> Hello
>>>>>>>>
>>>>>>>> My Spark-based map tasks need to access third-party jar files. I
>>>>>>>> found the options below for submitting third-party jar files to the
>>>>>>>> Spark interpreter:
>>>>>>>>
>>>>>>>> 1. export SPARK_SUBMIT_OPTIONS=<all the jar files, comma separated>
>>>>>>>> in conf/zeppelin-env.sh (concrete form sketched after this list)
>>>>>>>>
>>>>>>>> 2. add the line spark.jars <all the jar files, comma separated> to
>>>>>>>> <spark>/conf/spark-defaults.conf
>>>>>>>>
>>>>>>>> 3. use z.load("<location of the jar file on the local filesystem>")
>>>>>>>> in the Zeppelin notebook
>>>>>>>>
>>>>>>>> I could test the first two and they both work fine. The third one
>>>>>>>> does not work. Here is the snippet I use:
>>>>>>>>
>>>>>>>> %dep
>>>>>>>> z.reset()
>>>>>>>>
>>>>>>>> z.load("file:///home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar")
>>>>>>>>
>>>>>>>>
>>>>>>>> Further, importing a class that belongs to the above jar file works
>>>>>>>> when I use the statement import com.....  in the Zeppelin notebook.
>>>>>>>> However, I get a ClassNotFoundException in the executor for the same
>>>>>>>> class.
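>>>>>>>>
>>>>>>>> I wonder whether I should also ship the jar to the executors
>>>>>>>> explicitly with SparkContext.addJar (just an idea, reusing the same
>>>>>>>> path as above):
>>>>>>>>
>>>>>>>> // make the jar available on every executor's classpath
>>>>>>>> sc.addJar("file:///home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar")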
>>>>>>>>
>>>>>>>> Any clue here would help greatly
>>>>>>>>
>>>>>>>>
>>>>>>>> regards
>>>>>>>> Bala
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>>
>>> *Regards,*
>>> Wail Alkowaileet
>>>
>>
>>
>
