[
https://issues.apache.org/jira/browse/SPARK-17538?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15492937#comment-15492937
]
Srinivas Rishindra Pothireddi commented on SPARK-17538:
-------------------------------------------------------
Hi [~srowen], I will fix this as soon as possible, as you suggested.
> sqlContext.registerDataFrameAsTable is not working sometimes in pyspark 2.0.0
> -----------------------------------------------------------------------------
>
> Key: SPARK-17538
> URL: https://issues.apache.org/jira/browse/SPARK-17538
> Project: Spark
> Issue Type: Bug
> Affects Versions: 2.0.0
> Environment: os - linux
> cluster -> yarn and local
> Reporter: Srinivas Rishindra Pothireddi
>
> I have a production job in Spark 1.6.2 that registers four dataframes as
> tables. After migrating the job to Spark 2.0.0, one of the dataframes is no
> longer registered as a table.
> The output of sqlContext.tableNames() just after registering the fourth
> dataframe in Spark 1.6.2 is:
> temp1, temp2, temp3, temp4
> The output of sqlContext.tableNames() just after registering the fourth
> dataframe in Spark 2.0.0 is:
> temp1, temp2, temp3
> So when the table 'temp4' is used at a later stage of the job, an
> AnalysisException is raised in Spark 2.0.0.
> There are no changes to the code whatsoever.
>
>
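The report can be condensed into a minimal PySpark reproduction sketch along these lines (the table names temp1..temp4 match the report, but the dataframe contents and app name are placeholder assumptions, not the reporter's actual job; running it requires a Spark 2.0.0 installation):

```python
# Minimal reproduction sketch for SPARK-17538 (assumes Spark 2.0.0 is installed;
# the dataframe contents below are placeholders, not the reporter's data).
from pyspark.sql import SparkSession, SQLContext

spark = SparkSession.builder.appName("SPARK-17538-repro").getOrCreate()
sqlContext = SQLContext(spark.sparkContext)  # 1.6-style API used by the job

# Register four dataframes as temp1..temp4, as the production job does.
for i in range(1, 5):
    df = spark.createDataFrame([(i,)], ["value"])
    sqlContext.registerDataFrameAsTable(df, "temp%d" % i)

# In Spark 1.6.2 this listed temp1..temp4; per the report, Spark 2.0.0
# sometimes lists only temp1..temp3.
print(sorted(sqlContext.tableNames()))

# If temp4 is missing, this later query raises
# pyspark.sql.utils.AnalysisException: Table or view not found.
spark.sql("SELECT * FROM temp4").show()
```

In Spark 2.x, registerDataFrameAsTable registers the dataframe as a session-scoped temporary view, so an equivalent check is whether df.createOrReplaceTempView("temp4") also reproduces the missing-table behavior.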
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)