Hi,

There is a weird problem with Spark when handling code with native
dependencies: I want to use a library (JAI) with Spark to parse some
spatial raster files. Unfortunately, there are some strange issues:
when executed in Spark, JAI only works when the job is started via the
build tool, i.e. `sbt run`.
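
To make the setup concrete, here is a sketch of the kind of call involved;
the object name, helper, and paths are illustrative stand-ins, not the actual
code (that is in the minimal example linked at the end):

    import org.apache.spark.sql.SparkSession
    import org.geotools.process.raster.PolygonExtractionProcess

    object RasterJobSketch {

      // Stand-in for the parsing step; the real code would build a
      // GridCoverage2D from `path` and call process.execute(...) here
      // (argument list elided), which is where the stack trace below
      // originates.
      def extractPolygons(path: String): Int = {
        val process = new PolygonExtractionProcess()
        0
      }

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("raster-parsing").getOrCreate()
        val results = spark.sparkContext
          .parallelize(Seq("/data/raster1.tif", "/data/raster2.tif"))
          .map(extractPolygons)
          .collect()
        println(results.toSeq)
        spark.stop()
      }
    }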

When executed via spark-submit the error is:

    java.lang.IllegalArgumentException: The input argument(s) may not be null.
        at javax.media.jai.ParameterBlockJAI.getDefaultMode(ParameterBlockJAI.java:136)
        at javax.media.jai.ParameterBlockJAI.<init>(ParameterBlockJAI.java:157)
        at javax.media.jai.ParameterBlockJAI.<init>(ParameterBlockJAI.java:178)
        at org.geotools.process.raster.PolygonExtractionProcess.execute(PolygonExtractionProcess.java:171)

This looks like some native dependency (I think GEOS is running in the
background) is not set up correctly.

Assuming something is wrong with the classpath, I tried to run a plain
Java/Scala function, but this one works just fine.

Is Spark messing with the classpath?
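
One thing that should narrow it down is to resolve the JAI registry resource
and the raw classpath on both the driver and an executor, once under `sbt run`
and once under spark-submit. A minimal sketch of that check (assuming JAI
populates its operation registry from META-INF/registryFile.jai resources on
the classpath; the object name is illustrative):

    import org.apache.spark.sql.SparkSession

    object ClasspathCheck {

      // Report whether the JAI registry resource is visible to the current
      // classloader, plus the raw java.class.path for comparison.
      def describeClasspath(): String = {
        val registry = Thread.currentThread().getContextClassLoader
          .getResource("META-INF/registryFile.jai")
        val jcp = System.getProperty("java.class.path")
        s"registryFile.jai=$registry, java.class.path=$jcp"
      }

      def main(args: Array[String]): Unit = {
        // master/deploy settings come from spark-submit or the sbt run config
        val spark = SparkSession.builder().appName("jai-classpath-check").getOrCreate()

        // What the driver sees ...
        println("driver:   " + describeClasspath())

        // ... and what an executor sees.
        val onExecutor = spark.sparkContext
          .parallelize(Seq(1))
          .map(_ => describeClasspath())
          .collect()
          .head
        println("executor: " + onExecutor)

        spark.stop()
      }
    }

If the resource resolves in one launch mode but not in the other, that would
at least pin the problem down to a packaging/classpath difference rather than
Spark itself.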

I created a minimal example here:
https://github.com/geoHeil/jai-packaging-problem


Hope someone can shed some light on this problem,
Regards,
Georg
