Yep, something along the lines of:

val df = sqlContext.sql("SELECT from_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.SS') AS time")
Note that this does not require a column from an already existing table.

HTH

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.

On 20 July 2016 at 12:22, Rishabh Bhardwaj <rbnex...@gmail.com> wrote:

> Hi Divya,
>
> There is already a "from_unixtime" in org.apache.spark.sql.functions;
> Rabin has used it in the SQL query. If you want to use it in the
> DataFrame DSL, you can try it like this:
>
> val new_df = df.select(from_unixtime($"time").as("newtime"))
>
> Thanks,
> Rishabh.
>
> On Wed, Jul 20, 2016 at 4:21 PM, Rabin Banerjee <
> dev.rabin.baner...@gmail.com> wrote:
>
>> Hi Divya,
>>
>> Try:
>>
>> val df = sqlContext.sql("select from_unixtime(ts, 'yyyy-MM-dd') as `ts` from mr")
>>
>> Regards,
>> Rabin
>>
>> On Wed, Jul 20, 2016 at 12:44 PM, Divya Gehlot <divya.htco...@gmail.com>
>> wrote:
>>
>>> Hi,
>>> Could somebody share an example of writing and calling a UDF which
>>> converts a unix timestamp to a datetime?
>>>
>>> Thanks,
>>> Divya
>>>
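
For completeness, since the original question asked for a handwritten UDF
rather than the built-in from_unixtime: below is a minimal sketch of such a
UDF, assuming the Spark 1.x sqlContext used upthread and a bigint column ts
holding epoch seconds (as in Rabin's mr table); the names toDateTime,
toDateTimeUdf, and to_datetime are illustrative, not from the thread.

import java.text.SimpleDateFormat
import java.util.Date
import org.apache.spark.sql.functions.udf
import sqlContext.implicits._ // for the $"col" syntax

// Format an epoch-seconds value as a datetime string.
// (SimpleDateFormat is not thread-safe, hence one instance per call.)
val toDateTime = (ts: Long) =>
  new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date(ts * 1000L))

// DataFrame DSL: wrap the function as a UDF and apply it to a column.
val toDateTimeUdf = udf(toDateTime)
val viaDsl = df.select(toDateTimeUdf($"ts").as("datetime"))

// SQL: register the same function under a name, then call it in a query.
sqlContext.udf.register("to_datetime", toDateTime)
val viaSql = sqlContext.sql("select to_datetime(ts) as datetime from mr")

In practice the built-in from_unixtime shown above is the better choice; a
custom UDF like this is only worth it when you need formatting logic the
built-in does not cover.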