I did not build Spark myself; I downloaded the binary distribution. If the
native libs can be loaded from the IDE, they should also be loadable when
running with "--master local".
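Pulling the suggestions in this thread together, a minimal sketch of a
spark-submit invocation that passes the native lib directory to both the
driver and the executors (the path and jar name below are hypothetical;
adjust to your setup):

```shell
# Sketch only: NATIVE_LIBS and myapp.jar are placeholders, not from this thread.
# netlib-java needs the directory containing the OpenBLAS DLL on
# java.library.path for BOTH the driver and the executors.
NATIVE_LIBS='c:\openblas'

# --driver-library-path sets the driver's native search path;
# spark.executor.extraLibraryPath does the same for executors.
CMD="spark-submit --master local \
  --driver-library-path $NATIVE_LIBS \
  --conf spark.executor.extraLibraryPath=$NATIVE_LIBS \
  myapp.jar"

echo "$CMD"
```

Note that spark.driver.extraLibraryPath (not "extraLibrary") is the conf-file
equivalent of --driver-library-path, which may explain the setting shown in
the Spark UI below.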

On Mon, 23 Mar 2015 07:15 Burak Yavuz <brk...@gmail.com> wrote:

> Did you build Spark with: -Pnetlib-lgpl?
>
> Ref: https://spark.apache.org/docs/latest/mllib-guide.html
>
> Burak
>
> On Sun, Mar 22, 2015 at 7:37 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> How about pointing LD_LIBRARY_PATH to native lib folder ?
>>
>> You need Spark 1.2.0 or higher for the above to work. See SPARK-1719
>>
>> Cheers
>>
>> On Sun, Mar 22, 2015 at 4:02 AM, Xi Shen <davidshe...@gmail.com> wrote:
>>
>>> Hi Ted,
>>>
>>> I have tried to invoke the command from both the Cygwin and PowerShell
>>> environments. I still get these messages:
>>>
>>> 15/03/22 21:56:00 WARN netlib.BLAS: Failed to load implementation from:
>>> com.github.fommil.netlib.NativeSystemBLAS
>>> 15/03/22 21:56:00 WARN netlib.BLAS: Failed to load implementation from:
>>> com.github.fommil.netlib.NativeRefBLAS
>>>
>>> From the Spark UI, I can see:
>>>
>>>   spark.driver.extraLibrary c:\openblas
>>>
>>>
>>> Thanks,
>>> David
>>>
>>>
>>> On Sun, Mar 22, 2015 at 11:45 AM Ted Yu <yuzhih...@gmail.com> wrote:
>>>
>>>> Can you try the --driver-library-path option?
>>>>
>>>> spark-submit --driver-library-path /opt/hadoop/lib/native ...
>>>>
>>>> Cheers
>>>>
>>>> On Sat, Mar 21, 2015 at 4:58 PM, Xi Shen <davidshe...@gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I use the *OpenBLAS* DLL, and have configured my application to work
>>>>> in the IDE. When I start my Spark application from the IntelliJ IDE, I
>>>>> can see in the log that the native lib is loaded successfully.
>>>>>
>>>>> But if I use *spark-submit* to start my application, the native lib
>>>>> still cannot be loaded. I saw WARN messages saying it failed to load
>>>>> both the native and native-ref libraries. I checked the *Environment*
>>>>> tab in the Spark UI, and *java.library.path* is set correctly.
>>>>>
>>>>>
>>>>> Thanks,
>>>>>
>>>>> David
>>>>>
>>>>>
>>>>>
>>>>
>>
>
