Hi,

The table should be created as a bucketed ORC table with the transactional property set, like this:

 CREATE TABLE HiveTest (eid INT, ename STRING, desig STRING, sal INT, dept STRING)
 CLUSTERED BY (dept) INTO 3 BUCKETS
 STORED AS ORC
 TBLPROPERTIES ('transactional'='true');
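For context, once such a transactional table exists (and the ACID-related properties mentioned later in this thread are set in hive-site.xml), Hive 0.14 supports row-level DML against it. A minimal sketch, reusing the table and columns from the example above; the sample values are made up for illustration:

```sql
-- Assumes hive.txn.manager=DbTxnManager, hive.support.concurrency=true, etc.
-- are already configured, as described below in the thread.
INSERT INTO TABLE HiveTest VALUES (1, 'alice', 'engineer', 50000, 'dev');

-- Row-level UPDATE and DELETE only work on bucketed ORC tables
-- declared with TBLPROPERTIES ('transactional'='true'):
UPDATE HiveTest SET sal = 55000 WHERE eid = 1;
DELETE FROM HiveTest WHERE eid = 1;
```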

*Thanks & Regards,*

*Srinivas T*

On Tue, Feb 24, 2015 at 9:47 AM, Srinivas Thunga <srinivas.thu...@gmail.com>
wrote:

> Hi,
>
> Below are the new properties to set in hive-site.xml:
>
>  hive.support.concurrency – true
>  hive.enforce.bucketing – true
>  hive.exec.dynamic.partition.mode – nonstrict
>  hive.txn.manager – org.apache.hadoop.hive.ql.lockmgr.DbTxnManager
>  hive.compactor.initiator.on – true
>  hive.compactor.worker.threads – 1
>
> *Thanks & Regards,*
>
> *Srinivas T*
>
> On Mon, Feb 23, 2015 at 9:31 PM, Jessica Zhang <jiezh2...@gmail.com>
> wrote:
>
>> Thanks for the reply! Would you please elaborate on which new properties
>> are needed for Hive 0.14?
>>
>> Jessica
>>
>> On Feb 23, 2015, at 1:03 AM, Srinivas Thunga <srinivas.thu...@gmail.com>
>> wrote:
>>
>> Apply the new configuration properties for Hive 0.14 in the hive-site.xml
>> file, then drop or delete the table.
>>
>> On Monday, February 23, 2015, Jie Zhang <jiezh2...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I have an application using Hive, and just upgraded from 0.13.1 to
>>> 0.14.0. However, a bunch of unit test cases, which use an embedded Derby
>>> metastore, are failing to drop tables. Here is the exception stack trace.
>>> Does anyone have a clue what the problem might be and how to resolve it?
>>> Feedback is really appreciated. Thanks!
>>>
>>> ERROR 2015-02-22 22:38:26,757 [main] [StmtCacheTest] [line 308]
>>> SQLException when creating hive data in test table: [stmtcache_test]
>>>
>>> java.sql.SQLException: Error while processing statement: FAILED:
>>> Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
>>> MetaException(message:javax.jdo.JDOException: Exception thrown when
>>> executing query
>>> at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
>>> at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:275)
>>> at org.apache.hadoop.hive.metastore.ObjectStore.listMIndexes(ObjectStore.java:3133)
>>> at org.apache.hadoop.hive.metastore.ObjectStore.getIndexes(ObjectStore.java:3107)
>>> at sun.reflect.GeneratedMethodAccessor86.invoke(Unknown Source)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:98)
>>> at com.sun.proxy.$Proxy8.getIndexes(Unknown Source)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1465)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1657)
>>> at sun.reflect.GeneratedMethodAccessor85.invoke(Unknown Source)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:102)
>>> at com.sun.proxy.$Proxy9.drop_table_with_environment_context(Unknown Source)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.drop_table_with_environment_context(HiveMetaStoreClient.java:1890)
>>> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.drop_table_with_environment_context(SessionHiveMetaStoreClient.java:117)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:855)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:791)
>>> at sun.reflect.GeneratedMethodAccessor73.invoke(Unknown Source)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
>>> at com.sun.proxy.$Proxy10.dropTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:980)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:917)
>>> at org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:3673)
>>> at org.apache.hadoop.hive.ql.exec.DDLTask.dropTableOrPartitions(DDLTask.java:3608)
>>> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:320)
>>> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>>> at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>>> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
>>> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:999)
>>> at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:144)
>>> at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:69)
>>> at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:196)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>> at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:536)
>>> at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:208)
>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> - Jessica
>>>
>>
>>
>> --
>> Sent from Gmail Mobile
>>
>>
>