Re: Changing table types from managed to external

2012-08-06 Thread Igor Tatarinov
Try ALTER TABLE SET TBLPROPERTIES('EXTERNAL'='TRUE'); It worked for me. igor decide.com On Mon, Aug 6, 2012 at 11:08 PM, Babe Ruth wrote: > Hello, > I created a managed table in HIVE when I intended for it to be external, > is it possible for me to change the table back to external? > > OR
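
Spelled out as a minimal sketch (the table name my_table below is hypothetical; upper-case 'TRUE' is the safe choice, since some Hive versions compare the value case-sensitively):

    -- Hypothetical table name; check the result with DESCRIBE EXTENDED my_table;
    ALTER TABLE my_table SET TBLPROPERTIES('EXTERNAL'='TRUE');
    -- The same mechanism can flip it back to managed later if needed:
    ALTER TABLE my_table SET TBLPROPERTIES('EXTERNAL'='FALSE');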

Re: Changing table types from managed to external

2012-08-06 Thread Jan Dolinár
Hi George, You can save yourself one copy. Just create a new external table with a different name, fill it with data (either by copying or with a query like INSERT OVERWRITE DIRECTORY '/new/table/path' SELECT * FROM oldtable), drop the old one and then rename the new one to the desired name: ALTER TABL
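
A rough sketch of that sequence, with hypothetical table names and a hypothetical two-column schema (adjust the columns and LOCATION to match the real table):

    -- Hypothetical schema and path.
    CREATE EXTERNAL TABLE newtable (col1 STRING, col2 INT)
    LOCATION '/new/table/path';
    INSERT OVERWRITE TABLE newtable SELECT * FROM oldtable;
    -- Dropping the managed table also removes its data, which is fine now that it has been copied.
    DROP TABLE oldtable;
    -- Restore the original name.
    ALTER TABLE newtable RENAME TO oldtable;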

Re: Custom UserDefinedFunction in Hive

2012-08-06 Thread Raihan Jamal
I tested that function using main and by printing it out, and it works fine. I am trying to get yesterday's date; since today's date is Aug 6th, the query should be for Aug 5th. And this works fine for me: *SELECT * FROM REALTIME where dt= '20120805' LIMIT 10;*

Re: Changing table types from managed to external

2012-08-06 Thread long
Hi, George, I think that's the only way you can do it now. -- Best Regards, longmans At 2012-08-07 14:08:09,"Babe Ruth" wrote: Hello, I created a managed table in HIVE when I intended for it to be external, is it possible for me to change the table back to external? OR do I have to copy the

Changing table types from managed to external

2012-08-06 Thread Babe Ruth
Hello, I created a managed table in HIVE when I intended for it to be external. Is it possible for me to change the table back to external? OR do I have to copy the data to a new directory, drop the table, then copy it back? Thanks, George

Re: Custom UserDefinedFunction in Hive

2012-08-06 Thread Jan Dolinár
Hi Jamal, Check if the function really returns what it should and that your data really are in yyyyMMdd format. You can do this with a simple query like this: SELECT dt, yesterdaydate('yyyyMMdd') FROM REALTIME LIMIT 1; I don't see anything wrong with the function itself, it works well for me (althou
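
For reference, a sketch of the check and of the final query, assuming the custom yesterdaydate UDF returns a string in the same yyyyMMdd format as dt (note that on a partitioned table, a UDF call in the dt filter may prevent partition pruning on older Hive versions):

    -- Sanity check: compare the stored dt against what the UDF returns.
    SELECT dt, yesterdaydate('yyyyMMdd') FROM REALTIME LIMIT 1;
    -- If the two line up, the literal date can be replaced by the UDF call.
    SELECT * FROM REALTIME WHERE dt = yesterdaydate('yyyyMMdd') LIMIT 10;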

Re: Caused by: java.io.EOFException

2012-08-06 Thread Techy Teck
Yes, I created that file manually. But the other files are fine; only that particular file is having a problem. Is there any way I can fix that file? On Mon, Aug 6, 2012 at 9:51 PM, shashwat shriparv wrote: > There is some extra information that the file system does not know about; > have you built th

Re: question on output hive table to file

2012-08-06 Thread Vinod Singh
If the output file is not too big, then ^A can be replaced by using a simple command like- $ tr "\001" "," < src_file > out_file Thanks, Vinod On Tue, Aug 7, 2012 at 10:27 AM, zuohua zhang wrote: > Thanks so much! That did work. I have 200+ columns, so it is quite > an ugly thing. No shortcut?

Re: question on output hive table to file

2012-08-06 Thread zuohua zhang
Thanks so much! That did work. I have 200+ columns, so it is quite an ugly thing. No shortcut? On Mon, Aug 6, 2012 at 9:50 PM, Vinod Singh wrote: > Change the query to something like- > > INSERT OVERWRITE DIRECTORY '/outputable.txt' > select concat(col1, ',', col2, ',', col3) from myoutp

Re: Caused by: java.io.EOFException

2012-08-06 Thread shashwat shriparv
There is some extra information that the file system does not know about; have you built that file manually? On Tue, Aug 7, 2012 at 6:01 AM, Techy Teck wrote: > Yup, that makes sense. But when I tried opening that file using- > > hadoop fs -text > /apps/hdmi-technology/b_apdpds/real-time_new/20120

Re: question on output hive table to file

2012-08-06 Thread Vinod Singh
Change the query to something like- INSERT OVERWRITE DIRECTORY '/outputable.txt' select concat(col1, ',', col2, ',', col3) from myoutputtable; That way columns will be separated by ','. Thanks, Vinod On Tue, Aug 7, 2012 at 10:16 AM, zuohua zhang wrote: > I used the following; won't that help
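
If the Hive version in use has concat_ws, a slightly shorter variant is possible. This is only a sketch; non-string columns may need an explicit CAST:

    -- concat_ws takes the separator once instead of between every pair of columns.
    INSERT OVERWRITE DIRECTORY '/outputable.txt'
    SELECT concat_ws(',', col1, CAST(col2 AS STRING), col3) FROM myoutputtable;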

Re: question on output hive table to file

2012-08-06 Thread zuohua zhang
I used the following; won't that help? ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' On Mon, Aug 6, 2012 at 9:43 PM, Vinod Singh wrote: > Columns of a Hive table are separated by the ^A character. Instead of doing a > "SELECT * ", you may like to use the concat function to have a separator of > your

Re: question on output hive table to file

2012-08-06 Thread Vinod Singh
Columns of a Hive table are separated by the ^A character. Instead of doing a "SELECT * ", you may like to use the concat function to have a separator of your choice. Thanks, Vinod On Tue, Aug 7, 2012 at 9:39 AM, zuohua zhang wrote: > I have used the following to output a hive table to a file: > DROP T

question on output hive table to file

2012-08-06 Thread zuohua zhang
I have used the following to output a hive table to a file: DROP TABLE IF EXISTS myoutputtable; CREATE TABLE myoutputtable ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' STORED AS TEXTFILE AS select * from originaltable; INSERT OVERWRITE DIRECTORY '/outputable.txt' select * from myoutputtable; then

Re: Add Yesterday's date at runtime

2012-08-06 Thread Vinod Singh
There is no built-in support for such things in Hive. You may like to explore the possibility of doing this via a shell script or something else to calculate the date dynamically. Thanks, Vinod On Tue, Aug 7, 2012 at 12:09 AM, Techy Teck wrote: > I am running *Hive 0.6* and below is the content I have in

Re: Caused by: java.io.EOFException

2012-08-06 Thread Techy Teck
Yup, that makes sense. But when I tried opening that file using- hadoop fs -text /apps/hdmi-technology/b_apdpds/real-time_new/20120731/PDS_HADOOP_REALTIME_EXPORT-part-3-2 I can see my file contents there. Then what's wrong with that file? And is there any way I can fix that error in that file usin

Re: Caused by: java.io.EOFException

2012-08-06 Thread Bejoy KS
It could be that the file corresponding to the partition dt='20120731' got corrupted. This file, as pointed to in the error logs, should be the culprit: hdfs://ares-nn/apps/hdmi-technology/b_apdpds/real-time_new/20120731/PDS_HADOOP_REALTIME_EXPORT-part-3-2 Regards Bejoy KS Sent from handheld,
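
If the file really is corrupt and can be regenerated, one option, sketched here on the assumption that dt is a partition column of table1 and the table is external (so Hive does not delete the underlying data), is to detach the bad partition, replace the file on HDFS, and re-attach it:

    -- Sketch only: drop the bad partition, fix the file, then add the partition back.
    ALTER TABLE table1 DROP PARTITION (dt='20120731');
    ALTER TABLE table1 ADD PARTITION (dt='20120731')
    LOCATION '/apps/hdmi-technology/b_apdpds/real-time_new/20120731';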

RE: (Get the current date -1) in Hive

2012-08-06 Thread carla.staeben
In the case here it is literally taking the UNIX timestamp, formatting it in yyyy-MM-dd format and then subtracting the specified integer (in this case 1). Sent from my Lumia 900 From: ext Techy Teck Sent: 8/6/2012 3:37 PM To: user@hive.apache.org Subject: Re: (Get

Caused by: java.io.EOFException

2012-08-06 Thread Techy Teck
I am writing a simple query on our hive table and I am getting an exception- select count(*) from table1 where dt='20120731'; java.io.IOException: IO error in map input file hdfs://ares-nn/apps/hdmi-technology/b_apdpds/real-time_new/20120731/PDS_HADOOP_REALTIME_EXPORT-part-3-2 at org

Re: (Get the current date -1) in Hive

2012-08-06 Thread Techy Teck
Thanks Carla for the suggestion. I am currently using Hive 0.6, and that Hive version doesn't support variable substitution with a hiveconf variable, so that is the reason I was looking for some other alternative. So you are saying, basically, that if I add your suggestion to my query like below- *select

RE: (Get the current date -1) in Hive

2012-08-06 Thread carla.staeben
If you are just using it in a query, you can do this: date_sub(FROM_UNIXTIME(UNIX_TIMESTAMP(),'yyyy-MM-dd'), 1) I generally do my date calculations in a shell script and pass them in with a hiveconf variable. Carla -Original Message- From: ext Yue Guan [mailto:pipeha...@gmail.com] Sen
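
Put together, a sketch of how that expression can sit directly in a query (some_table is a placeholder; the regexp_replace line is only needed if dt is stored without dashes, as in 20120805):

    -- Yesterday's date as a 'yyyy-MM-dd' string, computed at query time.
    SELECT date_sub(FROM_UNIXTIME(UNIX_TIMESTAMP(), 'yyyy-MM-dd'), 1) FROM some_table LIMIT 1;
    -- Dropping the dashes gives the yyyyMMdd form used by dt in the other thread.
    SELECT * FROM some_table
    WHERE dt = regexp_replace(date_sub(FROM_UNIXTIME(UNIX_TIMESTAMP(), 'yyyy-MM-dd'), 1), '-', '');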

Re: (Get the current date -1) in Hive

2012-08-06 Thread Yue Guan
I guess you can use date_sub, but you have to get today's date from some outside script. On 08/06/2012 02:10 PM, Techy Teck wrote: Is there any way to get the current date -1 in Hive, meaning yesterday's date, always?

Add Yesterday's date at runtime

2012-08-06 Thread Techy Teck
I am running *Hive 0.6* and below is the content I have in the *hivetest1.hql* file. set mapred.job.queue.name=hdmi-technology; set mapred.output.compress=true; set mapred.output.compression.type=BLOCK; set mapred.output.compression.codec=org.apache.hadoop.io.compress.LzoCodec; add jar UserDefinedFunct
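
The rest of such a script would typically register and use the UDF roughly like this; the jar and class names below are hypothetical, since the original file is truncated here, and only the yesterdaydate function name comes from the thread:

    -- Hypothetical jar path and class name.
    ADD JAR UserDefinedFunction.jar;
    CREATE TEMPORARY FUNCTION yesterdaydate AS 'com.example.udf.YesterdayDate';
    -- The function can then be called in queries, e.g. in the dt filter shown earlier in this thread.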

Re: drop table: no err on nonexistent table?

2012-08-06 Thread Keith Wiley
Oye, got it. Sorry. RTFM: hive.exec.drop.ignorenonexistent On Aug 6, 2012, at 11:06 , Keith Wiley wrote: > I'm wrapping hive in a web tool and would like to do some basic > error-checking. If an attempt is made to drop a table that doesn't exist, I > would like to show an error message. The
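
For reference, a small sketch of the two behaviours (the table name is hypothetical):

    -- With the default hive.exec.drop.ignorenonexistent=true, dropping a missing table silently succeeds.
    SET hive.exec.drop.ignorenonexistent=false;
    DROP TABLE no_such_table;            -- now raises an error if the table is missing
    DROP TABLE IF EXISTS no_such_table;  -- never errors, regardless of the setting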

(Get the current date -1) in Hive

2012-08-06 Thread Techy Teck
Is there any way to get the current date -1 in Hive, meaning yesterday's date, always?

drop table: no err on nonexistent table?

2012-08-06 Thread Keith Wiley
I'm wrapping hive in a web tool and would like to do some basic error-checking. If an attempt is made to drop a table that doesn't exist, I would like to show an error message. The problem is, hive doesn't seem to produce any sort of error when dropping a table that doesn't exist. Furthermor

Special character replaced by '?'

2012-08-06 Thread Balaraman, Anand
Hi, I am facing an issue while viewing special characters (such as é) using Hive. If I view the file in HDFS (using the hadoop fs -cat command), it is displayed correctly as 'é', but when I select the data using Hive, this character alone gets replaced by a question mark. Do we have any solut

Re: mapper is slower than hive's mapper

2012-08-06 Thread Bertrand Dechoux
If you don't want to manage Hive tables, it doesn't necessarily mean you need to use vanilla MapReduce. If your workflow is complex using Hive, it won't be that easy to maintain if everything is implemented directly in MapReduce. I would recommend you look at libraries such as Cascadin