select *
from (select *,
             rank() over (order by transactiondate desc) r
      from ll_18740868
      where transactiondescription = 'XYZ'
     ) t
where r = 1

Hi Mitch,

If using SQL is fine, you can try the code above (note the "order by ...
desc" -- rank 1 is then the most recent payment, which is what you asked
for). You will need to register ll_18740868 as a temp table first.
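A minimal sketch of that registration step, assuming ll_18740868 is already a
DataFrame in a Spark 1.x shell (window functions such as rank() need a
HiveContext rather than a plain SQLContext there):

```scala
// Make the DataFrame visible to SQL under the same name
ll_18740868.registerTempTable("ll_18740868")

// rank() over a descending order puts the latest transaction at r = 1
val lastPayment = sqlContext.sql("""
  select *
  from (select *,
               rank() over (order by transactiondate desc) r
        from ll_18740868
        where transactiondescription = 'XYZ'
       ) t
  where r = 1
""")
lastPayment.show()
```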

On Sun, Jul 31, 2016 at 6:49 AM, Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

>
> Hi,
>
> I would like to find out the last time I paid a company with my debit
> card.
>
>
> This is the way I do it.
>
> 1) Find the date when I paid last
> 2) Find the rest of details from the row(s)
>
> So
>
> var HASHTAG = "XYZ"
> scala> var maxdate =
> ll_18740868.filter(col("transactiondescription").contains(HASHTAG)).agg(max("transactiondate")).collect.apply(0)
> maxdate: org.apache.spark.sql.Row = [2015-12-15]
>
> OK so it was 2015-12-15
>
>
> Now I want to get the rest of the columns. This one works when I hard code
> the maxdate!
>
>
> scala> ll_18740868.filter(col("transactiondescription").contains(HASHTAG)
> && col("transactiondate") === "2015-12-15").select("transactiondate",
> "transactiondescription", "debitamount").show
> +---------------+----------------------+-----------+
> |transactiondate|transactiondescription|debitamount|
> +---------------+----------------------+-----------+
> |     2015-12-15|       XYZ LTD CD 4636|      10.95|
> +---------------+----------------------+-----------+
>
> Now if I want to use the var maxdate in place of "2015-12-15", how would I
> do that?
>
> I tried lit(maxdate) etc., but they all give me an error:
>
> java.lang.RuntimeException: Unsupported literal type class
> org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema
> [2015-12-15]
>
>
> Thanks
>
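On the lit(maxdate) error: collect.apply(0) returns a Row, not a plain value,
and lit() cannot build a literal from a Row -- hence the
GenericRowWithSchema exception. A sketch of the fix, assuming
transactiondate is stored as a string column (if it is a date column, use
maxdate.getDate(0) or maxdate.getAs[java.sql.Date](0) instead):

```scala
// maxdate is a Row like [2015-12-15]; extract the value before comparing
val maxdateValue = maxdate.getString(0)

ll_18740868.filter(col("transactiondescription").contains(HASHTAG)
    && col("transactiondate") === maxdateValue)
  .select("transactiondate", "transactiondescription", "debitamount")
  .show
```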



-- 
Best Regards,
Ayan Guha
