Hi Vidya,

I tried recreating a similar Hive table and it worked fine. Here is a snippet from my terminal. I hope it helps.
I created a table named “testone” with three columns (“name”, “todaydate”, and “lastname”) and one row of data: “Metallica”, “05-FEB-01”, “Today”.

TERMINAL SNIPPET

hive> select * from testone;
OK
Metallica	05-FEB-01	Today
Time taken: 1.345 seconds, Fetched: 1 row(s)
hive> select from_unixtime(unix_timestamp(todaydate,'dd-MMM-yy')) from testone;
Query ID = root_20141026181818_b37aef53-4c0f-4b24-a54a-f25491a1807f
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1414362909167_0001, Tracking URL = http://sandbox.hortonworks.com:8088/proxy/application_1414362909167_0001/
Kill Command = /usr/lib/hadoop/bin/hadoop job -kill job_1414362909167_0001
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2014-10-26 18:19:57,297 Stage-1 map = 0%, reduce = 0%
2014-10-26 18:20:57,752 Stage-1 map = 0%, reduce = 0%
2014-10-26 18:21:19,374 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 5.98 sec
MapReduce Total cumulative CPU time: 5 seconds 980 msec
Ended Job = job_1414362909167_0001
MapReduce Jobs Launched:
Job 0: Map: 1   Cumulative CPU: 5.98 sec   HDFS Read: 247   HDFS Write: 20   SUCCESS
Total MapReduce CPU Time Spent: 5 seconds 980 msec
OK
2001-02-05 00:00:00
Time taken: 182.391 seconds, Fetched: 1 row(s)

(To go all the way to the MM/DD/YYYY layout you asked about originally, see the sketch after the quoted thread below.)

Harpreet Bedi

On Oct 26, 2014, at 5:11 PM, Vidya Sujeet <sjayatheer...@gmail.com> wrote:

> yes, this returns NULLs.. from_unixtime(unix_timestamp('05-FEB-01', 'dd-MMM-yy '))
>
>
> On Sun, Oct 26, 2014 at 2:09 PM, Harpreet Bedi <harpreet.be...@gmail.com> wrote:
> Hi Vidya,
>
> Have you tried using this instead?
>
> select from_unixtime(unix_timestamp('05-FEB-01', 'dd-MMM-yy '))
>
>
> On Sunday, October 26, 2014, Vidya Sujeet <sjayatheer...@gmail.com> wrote:
> Hi Harpreet,
>
> I used the statement below, but it doesn't give me the correct date. It gives me 2000-12-31 instead of 2001/02/05. What am I doing wrong?
>
> select from_unixtime(unix_timestamp('05-FEB-01', 'DD-MMM-YY'))
>
> thanks,
> Vidya
>
>
> On Sun, Oct 26, 2014 at 1:06 PM, Harpreet Singh Bedi <harpreet.be...@gmail.com> wrote:
> You’ll have to use lowercase “y” and lowercase “d” for year and day of the month.
>
>> On Oct 26, 2014, at 15:03, Harpreet Singh Bedi <harpreet.be...@gmail.com> wrote:
>>
>> Yup, that should work!
>>
>>> On Oct 26, 2014, at 15:00, Vidya Sujeet <sjayatheer...@gmail.com> wrote:
>>>
>>> Ok, the pattern should be (DD-MMM-YY), all in upper case! thanks
>>>
>>> On Sun, Oct 26, 2014 at 12:47 PM, Vidya Sujeet <sjayatheer...@gmail.com> wrote:
>>> (dd-mmm-yy) returns NULL
>>>
>>> On Sun, Oct 26, 2014 at 12:11 PM, Vidya Sujeet <sjayatheer...@gmail.com> wrote:
>>> But what should the format be if the source data comes as "25-FEB-01"? Appreciate your help.
>>>
>>> On Sun, Oct 26, 2014 at 12:01 PM, Nagarjuna Vissarapu <nagarjuna.v...@gmail.com> wrote:
>>> The Timestamp class is there to do this.
>>>
>>> On Oct 27, 2014 12:26 AM, "Vidya Sujeet" <sjayatheer...@gmail.com> wrote:
>>> Hi,
>>>
>>> The date format coming from the source is "25-FEB-01". I want to convert it to the following format: 'MM/DD/YYYY'. How can we do this in Hive?
>>>
>>> I see that, as per the documentation
>>>
>>> https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-DateFunctions
>>>
>>> I could possibly convert the string date to a Unix time stamp in seconds using the UDF below. However, what should the string pattern be if the date at the source comes this way: "25-FEB-01"?
>>> The link provided to look up the patterns does not work.
>>>
>>> Please help.
>>>
>>> Name: unix_timestamp(string date, string pattern) function.
>>> Description: Convert time string with given pattern (see
>>> [http://java.sun.com/j2se/1.4.2/docs/api/java/text/SimpleDateFormat.html])
>>> to Unix time stamp (in seconds), return 0 if fail:
>>> unix_timestamp('2009-03-20', 'yyyy-MM-dd') = 1237532400.
>>>
>>> Vidya
>
> --
> Harpreet Bedi
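
P.S. The sketch I mentioned above: per the same LanguageManual UDF page, from_unixtime() also takes an optional second argument giving the output pattern, so the parse and the reformat to the MM/DD/YYYY layout can be done in one expression. I have not re-run this against my “testone” table, so treat it as a starting point rather than verified output:

-- parse the 'dd-MMM-yy' string into epoch seconds, then render those seconds as MM/dd/yyyy
select from_unixtime(unix_timestamp(todaydate, 'dd-MMM-yy'), 'MM/dd/yyyy') from testone;

For the row above, this should print 02/05/2001.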
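
P.P.S. On why the pattern case matters: unix_timestamp() hands the pattern to Java's SimpleDateFormat, where lowercase 'y' is year and 'd' is day of month, while uppercase 'Y' and 'D' mean week year and day of year, which is presumably why 'DD-MMM-YY' parsed '05-FEB-01' into 2000-12-31 earlier in the thread. The NULLs may also come from the trailing space inside the pattern in my earlier suggestion ('dd-MMM-yy '), since the pattern then expects a trailing space in the input. A quick side-by-side, again a sketch I have not re-run here:

-- correct pattern letters: should give 2001-02-05 00:00:00, as in the terminal snippet above
select from_unixtime(unix_timestamp('05-FEB-01', 'dd-MMM-yy'));
-- wrong pattern letters: gave 2000-12-31 in the thread above
select from_unixtime(unix_timestamp('05-FEB-01', 'DD-MMM-YY'));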