Partitioning data in Hive is mostly about laying your data out in a well-defined
manner, so that when you access it you request only specific data by
specifying the partition columns in the WHERE clause.
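For illustration, a minimal HiveQL sketch (the table and column names here are made up, not from this thread):
{code}
-- Hypothetical table partitioned by date.
CREATE TABLE events (
  user_id BIGINT,
  action  STRING
)
PARTITIONED BY (event_date STRING);

-- Filtering on the partition column in the WHERE clause lets Hive
-- read only the matching partition directories instead of the whole table.
SELECT user_id, action
FROM events
WHERE event_date = '2013-06-01';
{code}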
To answer your question,
do you have to change your queries? Out of the b
That's far better :) ..
Please tell me a few more things. Do I have to change my query if I create the
table partitioned on date? Would the rest of the columns stay the same as they are?
Also, if I export that partitioned table to MySQL, would the schema of that table
be the same as it was before partitioning?
On Tue,
Hi,
My Hive jobs are failing with the error below. I haven't been able to figure
this one out. Can you please throw some light on this?
java.io.IOException: Split class
org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit not found
at
org.apache.hadoop.mapred.MapTask.getSplitDetails
Can you explain a bit more about what you want to achieve?
On Tue, Jun 4, 2013 at 7:06 AM, ur lops wrote:
> Hi,
> I need to protect my intermediate data from the hive query. Could
> someone help me with it.
> Thanks in advance.
> Regards
> John
>
--
Nitin Pawar
It's weird. If I use tinyint, the following exception is thrown when
attempting to create a table
NestedThrowablesStackTrace:
org.h2.jdbc.JdbcSQLException: Data conversion error converting "'N' (SDS:
IS_COMPRESSED TINYINT NOT NULL)"; SQL statement:
INSERT INTO SDS
(SD_ID,INPUT_FORMAT,IS_COMPRESSE
Well, there lies your answer. So what you might consider doing is
altering that table column and changing it to tinyint. That might be just
enough to trick it, since I think the Java code is expecting either a 0 or 1,
not 'false'.
Might be worth a try.
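If you do try that against the H2 metastore, the ALTER might look something like this (an untested sketch; the table and column come from the error above, so back up the metastore and double-check your schema first):
{code}
-- Untested sketch for an H2-backed metastore: redefine the column as TINYINT.
ALTER TABLE SDS ALTER COLUMN IS_COMPRESSED TINYINT;
{code}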
On Mon, Jun 3, 2013 at 5:47 PM, Jamal B wrote:
Hi,
I need to protect my intermediate data from the hive query. Could
someone help me with it.
Thanks in advance.
Regards
John
Hi Pavel,
I've not tried it with Cyrillic, but CREATE TABLE does fail in Hive 11 if
you use Chinese characters in either the table name or the column names. I
assume Cyrillic will fail as well. From what I've seen, any sort of Unicode
data is OK in the actual values themselves, just not in the DDL.
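A hypothetical illustration of that distinction (the table/column names and literals here are invented, not from the thread):
{code}
-- Unicode in identifiers (DDL) reportedly fails on Hive 11:
CREATE TABLE 用户 (名字 STRING);

-- Unicode in the data itself is fine:
CREATE TABLE users (name STRING);
SELECT name FROM users WHERE name = 'Пример';
{code}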
On
It treats it as a boolean.
http://www.h2database.com/html/datatypes.html#boolean_type
On Mon, Jun 3, 2013 at 8:16 PM, Stephen Sprague wrote:
> It does. So it's a bit datatype. That's a tad non-standard, I'd say. What
> does your backend store (H2) do with that?
>
> {code}
> | SDS | CREATE T
It does. So it's a bit datatype. That's a tad non-standard, I'd say. What
does your backend store (H2) do with that?
{code}
| SDS | CREATE TABLE `SDS` (
`SD_ID` bigint(20) NOT NULL,
`INPUT_FORMAT` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin
DEFAULT NULL,
`IS_COMPRESSED` bit(1) N
I gave that a shot, but it didn't work. Could you run a DESCRIBE on the table to
see if it matches this schema (assuming MySQL):
--
-- Table structure for table `SDS`
--
/*!40101 SET @saved_cs_client = @@character_set_client */;
/*!40101 SET character_set_client = utf8 */;
CREATE TABLE IF NOT EXIST
Not that this is any consolation, but on my MySQL instance, same CDH
release, I have the following for IS_COMPRESSED where you have 'false':
{code}
{metastore@etl1 root 13:13}>select SD_ID, is_compressed from SDS limit 10;
+-------+---------------+
| SD_ID | is_compressed |
+-------+---------------+
Please forgive the cross-post, but I could really use some help.
I have Hive set up using a remote metastore, backed by H2, and am able to
create tables, load data, and query them without issue. However, when I
restart the remote metastore, I can no longer query previously created
tables. 'show t
There is no delete semantic.
You either partition on the data you want to drop and use DROP PARTITION
(or DROP TABLE for the whole shebang), or you can do as Nitin suggests by
selecting the inverse of the data you want to delete and storing it back into
the table itself. Not ideal, but maybe it could
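A rough HiveQL sketch of those two approaches (the table, column, and date below are illustrative, not from the thread):
{code}
-- If the table is partitioned by date, drop the unwanted partition outright.
ALTER TABLE events DROP PARTITION (event_date = '2013-06-01');

-- Otherwise, rewrite the table keeping only the rows you want,
-- i.e. the inverse of the data you want to delete.
INSERT OVERWRITE TABLE events
SELECT * FROM events
WHERE event_date <> '2013-06-01';
{code}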
Hi,
When I run the hive test case, I keep getting the following error:
[echo] Project: serde
[javac] Compiling 36 source files to
/home/john/dev/hive-0.9.0-Intel/src/build/serde/test/classes
[javac] TestAvroSerdeUtils.java:24: cannot find symbol
[javac] symbol : class MiniDFSClus
Thanks for your response, Nitin. Does anybody else have a better solution?
On Mon, Jun 3, 2013 at 1:27 PM, Nitin Pawar wrote:
> Hive does not give you record-level deletion as of now.
>
> So unless you have partitioned, the other option is to overwrite the table
> with the data you want to keep.
> Please wa
Hive does not give you record-level deletion as of now.
So unless you have partitioned, the other option is to overwrite the table
with the data you want to keep.
Please wait for others to suggest more options. This one is just mine
and can be costly too.
On Mon, Jun 3, 2013 at 12:36 PM, Hamza Asad
No, it's not partitioned by date.
On Mon, Jun 3, 2013 at 11:19 AM, Nitin Pawar wrote:
> How is the data laid out?
> Is the data partitioned by date?
>
>
> On Mon, Jun 3, 2013 at 11:20 AM, Hamza Asad wrote:
>
>> Dear all,
>> How can I remove data of specific dates from HDFS using hi