Divya:
https://databricks.com/blog/2015/09/16/spark-1-5-dataframe-api-highlights-datetimestring-handling-time-intervals-and-udafs.html
The link gives a complete example of registering a UDAF (user-defined
aggregate function). It should give you a complete picture of how to write
and register one.
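In outline, the pattern from that post looks like the following. This is a
minimal sketch only: the SumOfSquares class, column names, and table name are
made up for illustration; the API is
org.apache.spark.sql.expressions.UserDefinedAggregateFunction, available
since Spark 1.5.

import org.apache.spark.sql.Row
import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
import org.apache.spark.sql.types._

// sum of squares of a Double column
class SumOfSquares extends UserDefinedAggregateFunction {
  // the input: a single Double column
  def inputSchema: StructType = StructType(StructField("value", DoubleType) :: Nil)
  // the intermediate state: a running sum of squares
  def bufferSchema: StructType = StructType(StructField("sum", DoubleType) :: Nil)
  // the type of the final result
  def dataType: DataType = DoubleType
  // same input always produces the same output
  def deterministic: Boolean = true

  def initialize(buffer: MutableAggregationBuffer): Unit = { buffer(0) = 0.0 }

  // fold one input row into the buffer
  def update(buffer: MutableAggregationBuffer, input: Row): Unit = {
    if (!input.isNullAt(0)) {
      val v = input.getDouble(0)
      buffer(0) = buffer.getDouble(0) + v * v
    }
  }

  // combine two partial buffers (e.g. from different partitions)
  def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit = {
    buffer1(0) = buffer1.getDouble(0) + buffer2.getDouble(0)
  }

  def evaluate(buffer: Row): Any = buffer.getDouble(0)
}

// register once, then use it from SQL like a built-in aggregate:
sqlContext.udf.register("sum_sq", new SumOfSquares)
sqlContext.sql("SELECT sum_sq(value) AS ss FROM some_table")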
On Thu, Jul 21, 2016 at 5:53 AM, Mich Talebzadeh wrote:
> Is this going to be in Scala? If so, something similar to this:
>
> def ChangeToDate(word: String): Date = {
>   // return TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(word, "dd/MM/yyyy"), "yyyy-MM-dd"))
>   val d1 = Date.valueOf(ReverseDate(word))
>   return d1
> }
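To call a function like that from a DataFrame you would wrap it in udf(). A
minimal sketch, assuming ReverseDate turns "dd/MM/yyyy" into "yyyy-MM-dd",
and assuming a hypothetical String column date_str:

import java.sql.Date
import org.apache.spark.sql.functions.udf

// java.sql.Date as the return type maps to Spark's DateType
val changeToDate = udf(ChangeToDate _)

// with import sqlContext.implicits._ in scope for the $ syntax;
// date_str is a made-up column name holding dd/MM/yyyy strings
df.select(changeToDate($"date_str").as("as_date"))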
On Thu, Jul 21, 2016 at 4:53 AM, Divya Gehlot wrote:
> To be very specific, I am looking for the UDF syntax, for example one which
> takes a String as parameter and returns an integer. How do we define the
> return type?
val f: String => Int = ???
val myUDF = udf(f)
or, with the types given explicitly (note the order is [ReturnType, ArgumentType]):
val myUDF = udf[Int, String] { ??? }
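A concrete version of the same thing, using string length as a stand-in for
whatever the real logic is (df and the name column are hypothetical):

import org.apache.spark.sql.functions.udf

// the body is just an illustration; handle null inputs explicitly
val strLen: String => Int = s => if (s == null) 0 else s.length
val strLenUDF = udf(strLen)

// with import sqlContext.implicits._ in scope for the $ syntax:
df.select(strLenUDF($"name").as("name_len"))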
On Wed, Jul 20, 2016 at 1:22 PM, Rishabh Bhardwaj wrote:
> val new_df = df.select(from_unixtime($"time").as("newtime"))
or, better yet, using the tick syntax (less typing, and it reads more like prose than code :))
df.select(from_unixtime('time) as "newtime")
(the 'time symbol-to-Column conversion needs import sqlContext.implicits._ in scope)
Jacek
>> From: Rishabh Bhardwaj
>> Date: Wednesday, July 20, 2016 at 4:22 AM
>> To: Rabin Banerjee
>> Cc: Divya Gehlot, "user @spark" <user@spark.apache.org>
>> Subject: Re: write and call UDF in spark dataframe
>>
>> Hi Divya,
>>
>> There is already a "from_unixtime" function in org.apache.spark.sql.functions;
>> Rabin has used that in the SQL query. If you want to use it in the DataFrame
>> DSL you can try like this:
>> val new_df = df.select(from_unixtime($"time").as("newtime"))
yep, something in line of
val df = sqlContext.sql("SELECT from_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss') AS time")
Note that this does not require a column from an already existing table.
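The same thing can be written with the DataFrame DSL instead of a SQL string.
A sketch, using range(1) just to get a one-row DataFrame to select against:

import org.apache.spark.sql.functions.{from_unixtime, unix_timestamp}

val timeDF = sqlContext.range(1)
  .select(from_unixtime(unix_timestamp(), "dd/MM/yyyy HH:mm:ss.ss").as("time"))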
HTH
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianr
Hi Divya,
There is already a "from_unixtime" function in org.apache.spark.sql.functions;
Rabin has used it in the SQL query. If you want to use it in the DataFrame DSL,
you can try it like this:
val new_df = df.select(from_unixtime($"time").as("newtime"))
Thanks,
Rishabh.
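(That snippet assumes df already has a time column of epoch seconds. A quick,
self-contained way to try it, with a made-up timestamp value:)

import sqlContext.implicits._
import org.apache.spark.sql.functions.from_unixtime

// one row with an arbitrary epoch-seconds value, just to see the output
val df = sqlContext.createDataFrame(Seq(Tuple1(1469000000L))).toDF("time")
val new_df = df.select(from_unixtime($"time").as("newtime"))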
On Wed, Jul 20, 2016 at 4:21 PM, Rabin Banerjee wrote:
Hi Divya,
Try:
val df = sqlContext.sql("select from_unixtime(ts, 'yyyy-MM-dd') as `ts` from mr")
Regards,
Rabin
On Wed, Jul 20, 2016 at 12:44 PM, Divya Gehlot wrote:
> Hi,
> Could somebody share an example of writing and calling a UDF which converts
> a unix timestamp to a datetime?
>
> Thanks,
> Divya