[
https://issues.apache.org/jira/browse/SPARK-17709?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15576141#comment-15576141
]
Ashish Shrowty edited comment on SPARK-17709 at 10/14/16 6:41 PM:
------------------------------------------------------------------
There is a slight difference: in my case the IDs generated are the same,
e.g. companyid#121 in both aggregates, whereas in your plan the IDs are
different, companyid#5 and companyid#46. This is probably what causes the
resolution error. I will try the 2.0.1 branch later today.
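For reference, a minimal sketch of how the attribute IDs could be compared on both sides (assuming df1 and df2 are built as in the snippet quoted below; queryExecution/analyzed are developer-facing APIs):

// Print each output attribute of the analyzed plan with its expression ID.
// If both aggregates report the same ID for key1/key2 (e.g. companyid#121),
// the join condition cannot tell the two sides apart.
df1.queryExecution.analyzed.output.foreach(a => println(s"df1: ${a.name}#${a.exprId.id}"))
df2.queryExecution.analyzed.output.foreach(a => println(s"df2: ${a.name}#${a.exprId.id}"))
// df1.explain(true) also prints the analyzed plan with the same IDs.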
> spark 2.0 join - column resolution error
> ----------------------------------------
>
> Key: SPARK-17709
> URL: https://issues.apache.org/jira/browse/SPARK-17709
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Ashish Shrowty
> Priority: Critical
>
> If I try to inner-join two DataFrames that originated from the same initial
> DataFrame, loaded via a spark.sql() call, it results in an error -
> // reading from Hive .. the data is stored in Parquet format in Amazon S3
> val d1 = spark.sql("select * from <hivetable>")
> val df1 = d1.groupBy("key1", "key2").agg(avg("totalprice").as("avgtotalprice"))
> val df2 = d1.groupBy("key1", "key2").agg(avg("itemcount").as("avgqty"))
> df1.join(df2, Seq("key1", "key2")) then fails with -
> org.apache.spark.sql.AnalysisException: using columns ['key1,'key2] can
> not be resolved given input columns: [key1, key2, avgtotalprice, avgqty];
> If the initial DataFrame is instead loaded via spark.read.parquet(), the code
> above works. The same code also worked with Spark 1.6.2.
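As a possible workaround (a sketch only, not verified against this report; df2r, k1, k2, joined and result are illustrative names): renaming the grouping columns in one branch before the join gives the two sides distinct attributes, so the join condition can be written explicitly instead of through the ambiguous usingColumns form:

// Rename the keys on one side so both branches expose distinct attributes.
val df2r = df2.withColumnRenamed("key1", "k1").withColumnRenamed("key2", "k2")
// Join on an explicit condition that references each side's own columns.
val joined = df1.join(df2r, df1("key1") === df2r("k1") && df1("key2") === df2r("k2"))
// Drop the renamed duplicate keys after the join.
val result = joined.drop("k1").drop("k2")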