Sorry, I didn't pay attention that you are using pyspark, so ignore my reply, as I
only use the Scala version.
Yong
From: java8...@hotmail.com
To: webe...@aim.com; user@spark.apache.org
Subject: RE: Java exception when showing join
Date: Mon, 25 Apr 2016 09:41:18 -0400
dispute_df.join(comments_df, ...) is the right form; the DataFrameSuite tests below show how
to do it for lots of cases:
https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala
Yong
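
On the PySpark side, the same kind of join condition can be written in a couple of ways (a minimal sketch reusing the dataframe and column names from this thread; PySpark's DataFrame.join accepts either a column expression or a column name):

    # Join on an explicit column expression (the result keeps both COMMENTID columns)
    dispute_df.join(comments_df, dispute_df["COMMENTID"] == comments_df["COMMENTID"]).first()

    # Join on a shared column name (the result keeps a single COMMENTID column)
    dispute_df.join(comments_df, on="COMMENTID", how="inner").first()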
> Subject: Re: Java exception when showing join
> From: webe...@aim.com
> To: java8...@hotmail.com; user@spark.apache.org
> Date: Mon, 25 Apr 2016
> From: webe...@aim.com
> To: user@spark.apache.org
> Subject: Java exception when showing join
>
> I am using pyspark with Netezza. I am getting a Java exception when
> trying to show the first row of a join. I can show the first row of
> each of the two dataframes separately but not the result of a join.
use "dispute_df.join(comments_df, dispute_df.COMMENTID ===
comments_df.COMMENTID).first()" instead.
Yong
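
A rough PySpark equivalent of that Scala expression, assuming both dataframes really do have a COMMENTID column, would be:

    # PySpark column expressions compare with ==, not the Scala === operator
    dispute_df.join(comments_df, dispute_df.COMMENTID == comments_df.COMMENTID).first()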
Date: Fri, 22 Apr 2016 17:42:26 -0400
From: webe...@aim.com
To: user@spark.apache.org
Subject: Java exception when showing join
I am using pyspark with Netezza. I am getting a Java exception when trying to
show the first row of a join. I can show the first row of each of the two
dataframes separately, but not the result of a join. I get the same error for
any action I take (first, collect, show). Am I doing something wrong?
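
A minimal sketch of the setup described above, assuming the two dataframes are read from Netezza over JDBC (the URL, driver class, table names, and the SparkSession API are placeholders and assumptions, not details taken from the thread):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("netezza-join").getOrCreate()

    def read_table(name):
        # Placeholder JDBC options; the real connection details are not in the thread
        return (spark.read.format("jdbc")
                .option("url", "jdbc:netezza://host:5480/db")
                .option("driver", "org.netezza.Driver")
                .option("dbtable", name)
                .load())

    dispute_df = read_table("DISPUTES")
    comments_df = read_table("COMMENTS")

    # Each dataframe can be inspected on its own
    dispute_df.first()
    comments_df.first()

    # ...but any action on the join (first, collect, show) raises the Java exception
    dispute_df.join(comments_df, dispute_df.COMMENTID == comments_df.COMMENTID).first()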