Try putting the join condition as a String.
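For example, a minimal sketch (the schemas here are assumptions, since the original snippet below is truncated; only the name sales_demand comes from the original mail):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("join-example").getOrCreate()
import spark.implicits._

// Hypothetical data; the real schemas were not posted.
val sales_demand = Seq((1, 10), (2, 20)).toDF("item_id", "demand_qty")
val prod_master  = Seq((1, "widget"), (2, "gadget")).toDF("item_id", "prod_name")

// Passing the join column as a String compiles cleanly and keeps
// a single item_id column in the result:
val df = sales_demand.join(prod_master, "item_id")
df.show()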
On Mon, Aug 22, 2016 at 5:00 PM, Subhajit Purkayastha wrote:
> All,
>
> I have the following DataFrames and the temp table.
>
> I am trying to create a new DF; the following statement is not compiling:
>
> val df = sales_demand.j
It seems Spark is not able to serialize your function code to the worker nodes.
I have tried to put together a solution as a simple set of commands; see the
sketch below the data. Maybe you can combine the last four lines into a function.
val arr = Array((1,"A","<20","0"), (1,"A",">20 & <40","1"), (1,"B",">20 & <40","0"), (1,"C",">20 & <40","0"),
1. foreach doesn't expect any value from the function being passed (your
func_foreach), so nothing happens: the return values are just lost. It's
like calling a function without saving its return value to another var.
foreach also doesn't return anything, so you don't get a modified RDD back
(unlike map and friends). See the sketch after this list.
2. RDD
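To illustrate point 1 (a minimal sketch; addOne is a stand-in for your func_foreach, which wasn't shown):

import org.apache.spark.SparkContext

val sc = SparkContext.getOrCreate()
val rdd = sc.parallelize(Seq(1, 2, 3))

def addOne(x: Int): Int = x + 1  // stand-in for func_foreach

// foreach is an action run only for its side effects; addOne's return
// values are computed on the executors and then discarded:
rdd.foreach(x => addOne(x))

// map is a transformation; it returns a new RDD holding the results:
val mapped = rdd.map(addOne)
mapped.collect().foreach(println)  // prints 2, 3, 4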