(WT01 - BAS)
Cc: user
Subject: Re: Running in cluster mode causes native library linking to fail
Hello guys,
After lots of time trying to make things work, I finally found what was causing
the issue:
I was calling the function from the library inside a map function, which caused
the code inside it
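The snippet is cut off here, but the pitfall it describes (calling a natively linked function inside a map) commonly fails because the library gets loaded only in the driver JVM, while the map closure executes in executor JVMs that never loaded it, producing an UnsatisfiedLinkError. A minimal sketch of one common workaround, assuming a hypothetical libmylib.so (the library and method names are illustrative, not from this thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical JNI wrapper; "mylib" and nativeCompute are illustrative names.
object NativeLib {
  // Putting the load in a singleton object means every JVM that first touches
  // the object (the driver *and* each executor) loads the .so, instead of the
  // library being loaded only in the driver process.
  System.loadLibrary("mylib") // resolves libmylib.so via java.library.path
  @native def nativeCompute(x: Int): Int
}

object Main {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("native-demo"))
    // The closure runs on executors; referencing NativeLib there triggers the
    // library load in each executor JVM before the native call.
    val result = sc.parallelize(1 to 100).map(NativeLib.nativeCompute).collect()
    sc.stop()
  }
}
```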
did not reply to the group in my original reply.
(Apologies for the noise; copying the list now.)
> *From:* Prajod S Vettiyattil (WT01 - BAS)
> *Sent:* 15 October 2015 11:45
> *To:* 'Bernardo Vecchia Stein'
> *Subject:* RE: Running in cluster mode causes native library linking to
> fail
Prajod
From: Bernardo Vecchia Stein [mailto:bernardovst...@gmail.com]
Sent: 15 October 2015 00:36
To: Prajod S Vettiyattil (WT01 - BAS)
Subject: Re: Running in cluster mode causes native library linking to fail
Hello Prajod,
Thanks for your reply! I am also using the standalone cluster manager. I do not
Hi Renato,
I am using a single master and a single worker node, both in the same
machine, to simplify everything. I have tested with System.loadLibrary() as
well (setting all the necessary paths) and get the same error. Just double
checked everything and the parameters are fine.
Bernardo
On 14 O
Hi Bernardo,
So is this in distributed mode? or single node? Maybe fix the issue with a
single node first ;)
You are right that Spark finds the library but not the *.so file. I also
use System.load() with LD_LIBRARY_PATH set, and I am able to
execute without issues. Maybe you'd like to double chec
Sorry Bernardo, I just double checked. I use: System.loadLibrary();
Could you also try that?
Renato M.
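For reference, the two calls being compared here differ in how they locate the file; a short sketch, with illustrative names:

```scala
// 1) By name: searches java.library.path (seeded from LD_LIBRARY_PATH on
//    Linux) for a file named libmylib.so.
System.loadLibrary("mylib")

// 2) By absolute path: no search is performed, but this exact path must
//    exist on every node that runs the code.
System.load("/opt/native/libmylib.so")
```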
2015-10-14 21:51 GMT+02:00 Renato Marroquín Mogrovejo <
renatoj.marroq...@gmail.com>:
Hi Renato,
I have done that as well, but so far no luck. I believe Spark is finding
the library correctly, otherwise the error message would be "no libraryname
found" or something like that. The problem seems to be something else, and
I'm not sure how to find it.
Thanks,
Bernardo
On 14 October 2
You can also try setting the environment variable LD_LIBRARY_PATH to point
to where your compiled libraries are.
Renato M.
2015-10-14 21:07 GMT+02:00 Bernardo Vecchia Stein:
Hi Deenar,
Yes, the native library is installed on all machines of the cluster. I
tried a simpler approach by just using System.load() and passing the exact
path of the library, and things still won't work (I get exactly the same
error and message).
Any ideas of what might be failing?
Thank you,
Hi Bernardo,
Is the native library installed on all machines of your cluster and are you
setting both the spark.driver.extraLibraryPath and
spark.executor.extraLibraryPath ?
Deenar
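A sketch of how those two properties might be set programmatically (the path is illustrative; the same settings can equally be passed as --conf flags to spark-submit):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Both properties prepend to the native library search path of the
// respective JVMs; the directory must exist on every node of the cluster.
val conf = new SparkConf()
  .setAppName("native-lib-demo")
  .set("spark.driver.extraLibraryPath", "/opt/native/lib")
  .set("spark.executor.extraLibraryPath", "/opt/native/lib")
val sc = new SparkContext(conf)
```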
On 14 October 2015 at 05:44, Bernardo Vecchia Stein <
bernardovst...@gmail.com> wrote:
Hello,
I am trying to run some Scala code in cluster mode using spark-submit. This
code uses addLibrary to link with a .so that exists on the machine, and
this library has a function to be called natively (there's a native
definition as needed in the code).
The problem I'm facing is: whenever I t
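For context, a "native definition" of the kind mentioned typically looks like the following in Scala (class and method names are illustrative, not from the thread):

```scala
class Compute {
  // Declared but not implemented on the JVM side; the body lives in the .so.
  @native def process(input: Array[Double]): Array[Double]
}
// The .so must export the matching JNI symbol, here Java_Compute_process
// (for a class in no package), typically generated with javah.
```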