I could not reproduce your error. How did you set up your project: SBT or Maven? Maybe its dependency management is pulling in an old Flink version, or maybe different Scala versions are mixed?
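To rule that out, a minimal consistent build definition would look roughly like this in SBT (just a sketch assuming Flink 1.2.0 built against Scala 2.11 - adjust the versions to whatever you actually use):

// build.sbt - the %% operator appends the Scala binary version (_2.11)
// to the artifact name, so your code and Flink use the same Scala version
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"   % "1.2.0",
  "org.apache.flink" %% "flink-clients" % "1.2.0"
)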
In that case, you may try setting up a new project:
https://ci.apache.org/projects/flink/flink-docs-release-1.2/quickstart/scala_api_quickstart.html

When do you get the error? During compilation in Eclipse? After submitting the job to Flink?


Nico

On Wednesday, 12 April 2017 01:15:37 CEST Kürşat Kurt wrote:
> I have downloaded the latest binary
> (http://www.apache.org/dyn/closer.lua/flink/flink-1.2.0/flink-1.2.0-bin-hadoop27-scala_2.11.tgz).
> I am getting this error in Eclipse Neon (3).
>
> Regards,
> Kursat
>
> -----Original Message-----
> From: Nico Kruber [mailto:n...@data-artisans.com]
> Sent: Tuesday, April 11, 2017 3:34 PM
> To: user@flink.apache.org
> Cc: Kürşat Kurt <kur...@kursatkurt.com>
> Subject: Re: Aggregation problem.
>
> maxBy() is still a member of org.apache.flink.api.scala.GroupedDataSet in
> the current sources - what did you upgrade Flink to?
>
> Also please make sure the new version is used, or - if compiled from
> sources - try a "mvn clean install" to get rid of old intermediate files.
>
> Regards
> Nico
>
> On Sunday, 9 April 2017 00:38:23 CEST Kürşat Kurt wrote:
> > Hi,
> >
> > I have just upgraded Flink and can't use maxBy on a grouped dataset.
> > I am getting the error below:
> >
> > value maxBy is not a member of org.apache.flink.api.scala.GroupedDataSet
> >
> > From: Kürşat Kurt [mailto:kur...@kursatkurt.com]
> > Sent: Sunday, February 19, 2017 1:28 AM
> > To: user@flink.apache.org
> > Subject: RE: Aggregation problem.
> >
> > Yes, it works.
> > Thank you Yassine.
> >
> > From: Yassine MARZOUGUI [mailto:y.marzou...@mindlytix.com]
> > Sent: Saturday, February 18, 2017 2:48 PM
> > To: user@flink.apache.org
> > Subject: RE: Aggregation problem.
> >
> > Hi,
> >
> > I think this is an expected output and not necessarily a bug. To get the
> > element having the maximum value, maxBy() should be used instead of max().
> >
> > See this answer for more details:
> > http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Wrong-and-non-consistent-behavior-of-max-tp484p488.html
> >
> > Best,
> > Yassine
> >
> > On Feb 18, 2017 12:28, "Kürşat Kurt" <kur...@kursatkurt.com> wrote:
> >
> > Ok, I have opened the issue with the test case.
> > Thanks.
> >
> > https://issues.apache.org/jira/browse/FLINK-5840
> >
> > From: Fabian Hueske [mailto:fhue...@gmail.com]
> > Sent: Saturday, February 18, 2017 3:33 AM
> > To: user@flink.apache.org
> > Subject: Re: Aggregation problem.
> >
> > Hi,
> >
> > This looks like a bug to me.
> > Can you open a JIRA and maybe a small test case to reproduce the issue?
> >
> > Thank you,
> > Fabian
> >
> > 2017-02-18 1:06 GMT+01:00 Kürşat Kurt <kur...@kursatkurt.com>:
> >
> > Hi,
> >
> > I have a DataSet like this:
> >
> > (0,Auto,0.4,1,5.8317538999854194E-5)
> > (0,Computer,0.2,1,4.8828125E-5)
> > (0,Sports,0.4,2,1.7495261699956258E-4)
> > (1,Auto,0.4,1,1.7495261699956258E-4)
> > (1,Computer,0.2,1,4.8828125E-5)
> > (1,Sports,0.4,1,5.8317538999854194E-5)
> >
> > This code, ds.groupBy(0).max(4).print(), prints:
> >
> > (0,Sports,0.4,1,1.7495261699956258E-4)
> > (1,Sports,0.4,1,1.7495261699956258E-4)
> >
> > ...but I am expecting:
> >
> > (0,Sports,0.4,2,1.7495261699956258E-4)
> > (1,Auto,0.4,1,1.7495261699956258E-4)
> >
> > What is wrong with this code?