MyqueWooMiddo opened a new issue, #2313:
URL: https://github.com/apache/sedona/issues/2313

   PySpark 3, Python 3.7, Jupyter, Sedona 1.7.2
   
   I placed sedona-shaded.jar and geotools.jar in /spark/sedona; Spark's 
default jars path is /spark/jars.
   
   In Python:
   
   from sedona.spark import *

   config = (
       SedonaContext.builder()
       .config("spark.jars", "/spark/sedona/sedona-shaded.jar,/spark/sedona/geotools.jar")
       .config("spark.kryo.registrator", "org.apache.sedona.viz.core.Serde.SedonaVizKryoRegistrator")
       .config("spark.sql.extensions", "org.apache.sedona.viz.sql.SedonaVizExtensions,org.apache.sedona.sql.SedonaSqlExtensions")
       .getOrCreate()
   )
   sedona = SedonaContext.create(config)
   
   It throws:
   python3.7/site-packages/sedona/spark/SedonaContext.py in create(cls, spark)
        46         if not is_remote():
        47             PackageImporter.import_jvm_lib(spark._jvm)
   ---> 48             spark._jvm.SedonaContext.create(spark._jsparkSession, 
"python")
        49         return spark
        50 
   
   TypeError: 'JavaPackage' object is not callable
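   For context, here is a minimal pure-Python sketch of how this TypeError 
arises (the JavaPackage class below is an illustrative stand-in, not py4j's 
actual implementation): when the Sedona classes are not on the JVM classpath, 
py4j resolves spark._jvm.SedonaContext to a generic package placeholder 
instead of a class, and attempting to call it fails.

   ```python
   # Illustrative sketch only: mimics py4j's placeholder behavior when a
   # JVM name cannot be resolved to a real class on the classpath.
   class JavaPackage:
       """Stand-in for py4j's placeholder for an unresolved JVM name."""
       pass

   missing_class = JavaPackage()  # what spark._jvm.SedonaContext resolves to
   try:
       missing_class("python")    # attempting to call it like a constructor
   except TypeError as e:
       print(e)  # 'JavaPackage' object is not callable
   ```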
   
   Is it possible to load Sedona by specifying spark.jars, rather than 
placing the jars in /spark/jars/?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
