[ https://issues.apache.org/jira/browse/SPARK-50959?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated SPARK-50959:
-----------------------------------
    Labels: pull-request-available  (was: )

> ImportError: sys.meta_path is None, Python is likely shutting down Exception 
> ignored in: <function JavaWrapper.__del__ at 0x77bbd652bb00>
> -----------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-50959
>                 URL: https://issues.apache.org/jira/browse/SPARK-50959
>             Project: Spark
>          Issue Type: Bug
>          Components: ML, PySpark
>    Affects Versions: 4.0.0
>            Reporter: Bobby Wang
>            Priority: Major
>              Labels: pull-request-available
>
> When I run a simple LogisticRegression demo, it throws an exception at the end:
> from pyspark.ml.classification import LogisticRegression
> from pyspark.ml.linalg import Vectors
> from pyspark.sql import SparkSession
>
> spark = SparkSession.builder.master("local[*]").getOrCreate()
> dataset = spark.createDataFrame(
>     [(Vectors.dense([0.0]), 0.0),
>      (Vectors.dense([0.4]), 1.0),
>      (Vectors.dense([0.5]), 0.0),
>      (Vectors.dense([0.6]), 1.0),
>      (Vectors.dense([1.0]), 1.0)] * 10,
>     ["features", "label"])
> lr = LogisticRegression()
> model = lr.fit(dataset)
>  
> Exception ignored in: <function JavaWrapper.__del__ at 0x70b8caf2f920>
> Traceback (most recent call last):
>   File "/home/xxx/work.d/spark/spark-master/python/pyspark/ml/util.py", line 254, in wrapped
>   File "/home/xxx/work.d/spark/spark-master/python/pyspark/ml/wrapper.py", line 60, in __del__
> ImportError: sys.meta_path is None, Python is likely shutting down
> Exception ignored in: <function JavaWrapper.__del__ at 0x70b8caf2f920>
> Traceback (most recent call last):
>   File "/home/xxx/work.d/spark/spark-master/python/pyspark/ml/util.py", line 254, in wrapped
>   File "/home/xxx/work.d/spark/spark-master/python/pyspark/ml/wrapper.py", line 60, in __del__
> ImportError: sys.meta_path is None, Python is likely shutting down
>  
>  
> It looks like the Python shutdown begins before JavaWrapper's __del__ function runs.
> Even after removing the `@try_remote_del` decorator from
> https://github.com/apache/spark/blob/master/python/pyspark/ml/wrapper.py#L58
> the issue is still there.
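
For reference, this ImportError is the generic CPython behaviour when a __del__ that triggers an import runs during interpreter shutdown, after sys.meta_path has been cleared. The sketch below is illustrative only: the Wrapper class and the uuid import are placeholders rather than Spark code, and the sys.is_finalizing() guard is shown as a common mitigation, not necessarily the fix adopted for this issue.

    import sys

    class Wrapper:
        """Toy stand-in (not Spark's JavaWrapper) for an object whose
        __del__ runs while the interpreter is shutting down."""

        def __del__(self):
            # Once CPython is finalizing, fresh imports are unreliable
            # (sys.meta_path may already be None), which produces exactly
            # the reported "ImportError: sys.meta_path is None" failure.
            # A common guard is to skip cleanup in that case.
            if sys.is_finalizing():
                return
            import uuid  # stands in for an import that cleanup might trigger lazily
            _ = uuid.uuid4()

    # Module-level reference: destroyed during module teardown at shutdown,
    # typically after sys.meta_path has already been cleared.
    w = Wrapper()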



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
