Hi Divya,

There is already a "from_unixtime" function in org.apache.spark.sql.functions;
Rabin has used it in the SQL query. If you want to use it in the DataFrame DSL
(import org.apache.spark.sql.functions._ and sqlContext.implicits._ for the $"..." syntax),
you can try something like this:

val new_df = df.select(from_unixtime($"time").as("newtime"))
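
If it helps, here is a minimal self-contained version of the same idea (just a
sketch, assuming a Spark 1.6-style shell where sc is available; the sample
timestamps and the "newtime" name are only for illustration):

import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions.from_unixtime

val sqlContext = new SQLContext(sc)
import sqlContext.implicits._                 // enables $"..." columns and .toDF

// made-up unix timestamps (seconds since the epoch), only for illustration
val df = sc.parallelize(Seq(1468995660L, 1469082060L)).toDF("time")

// default output format is "yyyy-MM-dd HH:mm:ss"; pass a pattern string as a
// second argument to from_unixtime if you need a different one
val new_df = df.select(from_unixtime($"time").as("newtime"))
new_df.show()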


Thanks,
Rishabh.

On Wed, Jul 20, 2016 at 4:21 PM, Rabin Banerjee <
dev.rabin.baner...@gmail.com> wrote:

> Hi Divya ,
>
> Try,
>
> val df = sqlContext.sql("select from_unixtime(ts,'yyyy-MM-dd') as `ts` from mr")
>
> Regards,
> Rabin
>
> On Wed, Jul 20, 2016 at 12:44 PM, Divya Gehlot <divya.htco...@gmail.com>
> wrote:
>
>> Hi,
>> Could somebody share an example of writing and calling a UDF which converts
>> a unix timestamp to a datetime?
>>
>>
>> Thanks,
>> Divya
>>
>
>
