Which version of Spark are you using?  I just tried the example below on 1.5.1 
and it seems to work as expected:

scala> val res = df.groupBy("key").count.agg(min("count"), avg("count"))
res: org.apache.spark.sql.DataFrame = [min(count): bigint, avg(count): double]

scala> res.show
+----------+----------+
|min(count)|avg(count)|
+----------+----------+
|         1|       1.0|
+----------+----------+


scala> res.printSchema
root
 |-- min(count): long (nullable = true)
 |-- avg(count): double (nullable = true)

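For reference, the shell example above leans on the imports that spark-shell sets up for you; in a compiled job (or if those names get shadowed in the shell) you need the aggregate functions explicitly from org.apache.spark.sql.functions. A rough, self-contained sketch along those lines (the app name and toy data are just illustrative, not from your code):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
// these must resolve to the SQL column functions, not scala.math or a local value
import org.apache.spark.sql.functions.{min, avg, max}

object CountAggExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("count-agg-example"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // toy stand-in for the auction data
    val df = sc.parallelize(Seq(("xbox", "a1"), ("xbox", "a1"), ("cartier", "a2")))
               .toDF("item", "auctionid")

    // count bids per (item, auctionid), then aggregate over those per-group counts
    val res = df.groupBy("item", "auctionid").count()
                .agg(min("count"), avg("count"), max("count"))
    res.show()

    sc.stop()
  }
}

One possible cause of an error like "Float does not take parameters" is that one of min/avg/max was rebound to a plain value earlier in the session; re-importing them as above (or restarting the shell) should clear that up.
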
On Oct 21, 2015, at 11:12 AM, Carol McDonald <cmcdon...@maprtech.com> wrote:

> This used to work:
> 
> // What's the min number of bids per item? what's the average? what's the max?
> auction.groupBy("item", "auctionid").count.agg(min("count"), avg("count"), max("count")).show
> 
> // MIN(count)          AVG(count)  MAX(count)
> //          1  16.992025518341308          75
> 
> but this now gives an error
> 
> val res = auction.groupBy("item", "auctionid").count.agg(min("count"), avg("count"), max("count"))
> 
> <console>:42: error: Float does not take parameters
> 
> val res = auction.groupBy("item", "auctionid").count.agg(min("count"), avg("count"), max("count"))
> 
> min and max still work.
> 
> Do I need to cast the count to a float?
> 
> auction.groupBy("item", "auctionid").count.agg(min("count"), max("count")).show
> 
> MIN(count) MAX(count)
>          1         75
