Do you have a snippet showing how to do this? I'm relatively new to Spark
and Scala, and for now my code is just a single file inspired by the Spark
examples:

import org.apache.spark.{SparkConf, SparkContext}
import org.opencv.core.Core

object SparkOpencv {
  def main(args: Array[String]) {
    val conf = new SparkConf()
             .setMaster("local[8]")
             .setAppName("SparkPi")
             .set("spark.executor.memory", "1g")
             .set("SPARK_LIBRARY_PATH", "/home/opencv/build/lib")
             .set("SPARK_PRINT_LAUNCH_COMMAND", "1")
             .set("SPARK_CLASSPATH", "/home/opencv/build/bin/opencv-300.jar")

    val spark = new SparkContext(conf)
    spark.addJar("/home/jrabarisoa/github/opencv/build/bin/opencv-300.jar")

    // This runs in the driver JVM only, not on the executors
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME)
  }
}
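Matei's suggestion below can be sketched as a small singleton: the body of a Scala `object` runs at most once per JVM, the first time the object is referenced, so wrapping `System.loadLibrary` there makes each executor JVM load the native library exactly once. A minimal self-contained sketch follows, with the OpenCV call stubbed out (it needs the native .so to run); `NativeLoader` and `ensureLoaded` are hypothetical names, not part of any library:

```scala
// Sketch of the per-JVM singleton pattern: the object's body executes
// once, on first reference, in each JVM that touches it.
object NativeLoader {
  // In the real application this line would be:
  //   System.loadLibrary(Core.NATIVE_LIBRARY_NAME)
  // Here we just count initializations to show the body runs only once.
  var initCount = 0
  initCount += 1

  // Referencing the object is what triggers initialization;
  // the method body itself does nothing.
  def ensureLoaded(): Unit = ()
}

object Demo {
  def main(args: Array[String]): Unit = {
    NativeLoader.ensureLoaded()
    NativeLoader.ensureLoaded() // second call does not re-run the body
    println(NativeLoader.initCount)
  }
}
```

In the real job you would call `NativeLoader.ensureLoaded()` inside the closure you ship to the cluster (e.g. at the top of a `mapPartitions` body), so the load happens on every executor JVM rather than only on the driver.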




On Tue, Mar 11, 2014 at 8:05 PM, Matei Zaharia <matei.zaha...@gmail.com> wrote:

> In short, you should add it to a static initializer or singleton object
> that you call before accessing your library.
>
> Also add your library to SPARK_LIBRARY_PATH so it can find the .so / .dll.
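[For example, the library path Matei mentions might be set in `conf/spark-env.sh` before launching. This is a sketch using the path from the thread; the exact variable honored can vary by Spark version:]

```shell
# conf/spark-env.sh -- make the OpenCV native library (.so) findable
# by the JVMs Spark launches. Path taken from the code above.
export SPARK_LIBRARY_PATH=/home/opencv/build/lib
export LD_LIBRARY_PATH=/home/opencv/build/lib:$LD_LIBRARY_PATH
```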
>
> Matei
>
> On Mar 11, 2014, at 7:05 AM, Debasish Das <debasish.da...@gmail.com>
> wrote:
>
> Look at the jblas operations inside MLlib... jblas calls JNILoader
> internally, which loads the native code when it is available...
>  On Mar 11, 2014 4:07 AM, "Jaonary Rabarisoa" <jaon...@gmail.com> wrote:
>
>> Hi all,
>>
>> I'm trying to build a stand-alone Scala Spark application that uses
>> OpenCV for image processing.
>> To get OpenCV to work with Scala, one needs to call
>>
>>   System.loadLibrary(Core.NATIVE_LIBRARY_NAME)
>>
>> once per JVM process. How can I call it inside a Spark application
>> distributed across several nodes?
>>
>> Best regards,
>>
>> Jaonary
>>
>
>
