Is this a duplicate of your Stack Overflow question?
http://stackoverflow.com/questions/34146800/convert-reducebykey-from-spark-to-flink
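
If it is, for the archives: the error most likely comes from the fact that
Flink's groupBy(0).reduce passes the whole (key, value) tuple to the reduce
function, whereas Spark's reduceByKey hands you only the values, so in the
Flink version x._1 is the String key and x._2 is the entire
(Int, Map[String, Int]) value tuple. A minimal sketch of a possible port
(the object name and sample rows below are made up for illustration; it
assumes the element type produced by the map call in the quoted code):

import org.apache.flink.api.scala._

object ReduceByKeyPort {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Hypothetical sample rows laid out as (key, word, count),
    // mirroring line(0), line(1), line(2) in the quoted code.
    val words = env.fromElements(
      Array("k1", "foo", "2"),
      Array("k1", "bar", "3"),
      Array("k2", "foo", "1"))

    val target = words
      .map(line => (line(0), (line(2).toInt, Map(line(1) -> line(2).toInt))))
      .groupBy(0)
      // Unlike Spark's reduceByKey, Flink's reduce sees the whole
      // (key, value) tuple, so keep the key and combine only the
      // value fields: sum the counts and merge the maps.
      .reduce { (x, y) => (x._1, (x._2._1 + y._2._1, x._2._2 ++ y._2._2)) }

    target.print()
  }
}

That also matches the compiler message: x._2 is the whole
(Int, Map[String,Int]) tuple, not the Map, so there is no ++ defined on it,
and x._1 + y._1 was concatenating the string keys rather than summing the
counts.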

On Tue, Dec 8, 2015 at 2:25 PM, Humberto Moura <humbe...@humbertomoura.com.br> wrote:

> Hello, guys
>
>
> I'm migrating some Spark code to Flink, and I realized that Spark's
> reduceByKey doesn't exist in Flink.
>
> The snippet I'm struggling with:
>
>     reduceByKey((x, y) => (x._1 + y._1, (x._2) ++ y._2))
>
> So, with the help of a friend, I tried to convert it to:
>
>     groupBy(0).reduce { (v1, v2) => (v1._1 + v2._1, (v1._2) ++ v2._2) }
>
> but there's an error on (x._2) ++: "value ++ is not a member of
> (Int, scala.collection.immutable.Map[String,Int])". Really, I don't know
> why.
>
> The complete line of code is:
>
>     val target = words
>       .map(line => (line(0), (line(2).toInt, Map(line(1) -> line(2).toInt))))
>       .groupBy(0)
>       .reduce { (x, y) => (x._1 + y._1, (x._2) ++ y._2) }
>
> Thanks for any help.
> --
> Humberto Moura, MSc
> Professor / IT Consultant / Innovation
> http://www.humbertomoura.com.br
> http://lattes.cnpq.br/3755174224507905
> Phone: (51) 9252-7855
>
