version 1.3.1

scala> auction.printSchema
root
 |-- auctionid: string (nullable = true)
 |-- bid: float (nullable = false)
 |-- bidtime: float (nullable = false)
 |-- bidder: string (nullable = true)
 |-- bidderrate: integer (nullable = true)
 |-- openbid: float (nullable = false)
 |-- price: float (nullable = false)
 |-- item: string (nullable = true)
 |-- daystolive: integer (nullable = true)

scala> auction.groupBy("auctionid", "item").count.show
auctionid  item count
3016429446 palm 10
8211851222 xbox 28

On Wed, Oct 21, 2015 at 2:38 PM, Ali Tajeldin EDU <alitedu1...@gmail.com> wrote:

> Which version of Spark are you using? I just tried the example below on
> 1.5.1 and it seems to work as expected:
>
> scala> val res = df.groupBy("key").count.agg(min("count"), avg("count"))
> res: org.apache.spark.sql.DataFrame = [min(count): bigint, avg(count): double]
>
> scala> res.show
> +----------+----------+
> |min(count)|avg(count)|
> +----------+----------+
> |         1|       1.0|
> +----------+----------+
>
> scala> res.printSchema
> root
>  |-- min(count): long (nullable = true)
>  |-- avg(count): double (nullable = true)
>
> On Oct 21, 2015, at 11:12 AM, Carol McDonald <cmcdon...@maprtech.com> wrote:
>
> This used to work:
>
> // What's the min number of bids per item? What's the average? What's the max?
> auction.groupBy("item", "auctionid").count.agg(min("count"),
>   avg("count"), max("count")).show
>
> // MIN(count) AVG(count)         MAX(count)
> // 1          16.992025518341308 75
>
> but this now gives an error:
>
> val res = auction.groupBy("item", "auctionid").count.agg(min("count"),
>   avg("count"), max("count"))
>
> <console>:42: error: Float does not take parameters
>
> val res = auction.groupBy("item", "auctionid").count.agg(min("count"),
>   avg("count"), max("count"))
>
> min and max still work.
>
> Do I need to cast the count to a float?
>
> auction.groupBy("item", "auctionid").count.agg(min("count"), max("count")).show
>
> MIN(count) MAX(count)
> 1          75
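For reference, a minimal, self-contained sketch of the same aggregation that can be pasted into spark-shell (the auctionDemo data and the F alias for org.apache.spark.sql.functions are assumptions, not taken from the thread). No cast of the "count" column should be needed; one possible cause of an error like "Float does not take parameters" is a local Float value named avg (or min/max) in the shell session shadowing the imported aggregate functions, which qualifying the calls through the functions object would rule out:

// Minimal sketch with hypothetical data, not the auction dataset from the thread.
import org.apache.spark.sql.{functions => F}
import sqlContext.implicits._   // sqlContext is the SQLContext provided by spark-shell

// Hypothetical stand-in for the auction DataFrame: (auctionid, item, bid)
val auctionDemo = Seq(
  ("3016429446", "palm", 210.0f),
  ("3016429446", "palm", 215.0f),
  ("8211851222", "xbox", 120.0f)
).toDF("auctionid", "item", "bid")

// Number of bids per (item, auctionid), then min/avg/max over those counts.
// "count" below refers to the column produced by .count, so no cast is involved.
val res = auctionDemo.groupBy("item", "auctionid").count
  .agg(F.min("count"), F.avg("count"), F.max("count"))

res.show()

If the fully qualified F.avg("count") works where avg("count") fails, the error is coming from name resolution in the shell session rather than from the DataFrame API itself.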