Sorry, I didn't pay attention that you are using pyspark, so please ignore my
reply, as I only use the Scala version.
Yong
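[For the archive: the reason the Scala-style condition can't be pasted into pyspark is that PySpark instead reuses Python's `==` by overloading it on `Column` objects, so a comparison builds a join expression rather than evaluating to a boolean. A minimal stand-in class, not the real `pyspark.sql.Column`, sketches the idea:]

```python
# Simplified stand-in for pyspark.sql.Column (illustration only):
# __eq__ is overloaded to return a join-condition expression, not a bool.
class Column:
    def __init__(self, name):
        self.name = name

    def __eq__(self, other):
        # Build a condition string instead of comparing for equality.
        return f"({self.name} = {other.name})"

cond = Column("dispute_df.COMMENTID") == Column("comments_df.COMMENTID")
print(cond)  # (dispute_df.COMMENTID = comments_df.COMMENTID)
```

This is why `dispute_df.join(comments_df, dispute_df.COMMENTID == comments_df.COMMENTID)` works in pyspark: the `==` never returns `True`/`False`, it returns a `Column` expression the optimizer can use as the join condition.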
From: java8...@hotmail.com
To: webe...@aim.com; user@spark.apache.org
Subject: RE: Java exception when showing join
Date: Mon, 25 Apr 2016 09:41:18 -0400
dispute_df.join(comments_df, dispute_df.COMMENTID ===
comments_df.COMMENTID) is the Scala syntax; the DataFrameSuite tests below show how
to do it for lots of cases.
https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala
Yong
> Subject: Re: Java exception when showing join
> From: webe...@aim.com
> To: java8...@hotmail.com; user@spark.apache.org
> Date: Mon, 25 Apr 2016
I get an invalid syntax error when I do that.
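[For the archive: the invalid-syntax error is expected here. `===` is a Scala method on `Column`; Python has no `===` operator at all, so the parser rejects the expression before Spark ever sees it. A quick demonstration, using `compile()` so no Spark session or real DataFrames are needed:]

```python
# "===" is not a Python operator, so the expression fails to parse.
# compile() lets us show the SyntaxError without pyspark installed.
try:
    compile("dispute_df.COMMENTID === comments_df.COMMENTID", "<join>", "eval")
    raised = False
except SyntaxError:
    raised = True
print(raised)  # True

# The pyspark form uses "==" instead (sketch, assuming both frames exist):
#   dispute_df.join(comments_df,
#                   dispute_df.COMMENTID == comments_df.COMMENTID).first()
```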
On Fri, 2016-04-22 at 20:06 -0400, Yong Zhang wrote:
> use "dispute_df.join(comments_df, dispute_df.COMMENTID ===
> comments_df.COMMENTID).first()" instead.
>
> Yong
>
> Date: Fri, 22 Apr 2016 17:42:26 -0400
> From: webe...@aim.com
> To: user@spark.apache.org
use "dispute_df.join(comments_df, dispute_df.COMMENTID ===
comments_df.COMMENTID).first()" instead.
Yong
Date: Fri, 22 Apr 2016 17:42:26 -0400
From: webe...@aim.com
To: user@spark.apache.org
Subject: Java exception when showing join
I am using pyspark with Netezza. I am getting a Java exception when showing the join.