Correct. What doesn't work with Spark is transactions: there is a piece on the execution side that needs to send heartbeats to the metastore saying a transaction is still alive, and that hasn't been implemented for Spark. It would be fairly simple to do (see the use of ql.exec.Heartbeater in ql.exec.tez.TezJobMonitor for an example of how it would work). AFAIK everything else works just fine.

Alan.

Mich Talebzadeh <mailto:m...@peridale.co.uk>
December 22, 2015 at 13:45

Thanks for the feedback Alan

It seems that one can do INSERTs with Hive on Spark but not updates or deletes. Is this correct?

Cheers,

Mich Talebzadeh

/Sybase ASE 15 Gold Medal Award 2008/

A Winning Strategy: Running the most Critical Financial Data on ASE 15

http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of the book *"A Practitioner’s Guide to Upgrading to Sybase ASE 15"*, ISBN 978-0-9563693-0-7.

Co-author of *"Sybase Transact SQL Guidelines Best Practices"*, ISBN 978-0-9759693-0-4.

Publications due shortly:

*Complex Event Processing in Heterogeneous Environments*, ISBN: 978-0-9563693-3-8

*Oracle and Sybase, Concepts and Contrasts*, ISBN: 978-0-9563693-1-4, volume one out shortly

http://talebzadehmich.wordpress.com <http://talebzadehmich.wordpress.com/>

NOTE: The information in this email is proprietary and confidential. This message is for the designated recipient only; if you are not the intended recipient, you should destroy it immediately. Any information in this message shall not be understood as given or endorsed by Peridale Technology Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility of the recipient to ensure that this email is virus free; therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility.

*From:*Alan Gates [mailto:alanfga...@gmail.com]
*Sent:* 22 December 2015 20:39
*To:* user@hive.apache.org
*Subject:* Re: Attempt to do update or delete using transaction manager that does not support these operations. (state=42000,code=10294)

Also note that transactions only work with MR or Tez as the backend. The required work to have them work with Spark hasn't been done.

Alan.
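For reference, enabling ACID transactions on an MR or Tez backend typically involves settings along these lines (a minimal sketch; whether each property belongs in hive-site.xml or the session depends on your deployment):

```sql
-- Transactions require the DbTxnManager and an MR or Tez engine;
-- they are not supported on the Spark backend.
SET hive.execution.engine=tez;
SET hive.support.concurrency=true;
SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
-- Server-side (metastore) settings for compaction of delta files:
SET hive.compactor.initiator.on=true;
SET hive.compactor.worker.threads=1;
```

Without the DbTxnManager setting, UPDATE/DELETE statements fail with exactly the error in the subject line (state=42000, code=10294).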



Mich Talebzadeh <mailto:m...@peridale.co.uk>
December 22, 2015 at 9:14

Thanks Elliot,

Sounds like that table was created as CREATE TABLE tt AS SELECT * FROM t. Although the original table t was created as transactional (shown below), the table tt is not!
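That matches CTAS behaviour: table properties such as 'transactional' are not inherited by the new table, so a transactional copy has to be created explicitly and then populated. A minimal sketch (table and column names are illustrative, not from the thread):

```sql
-- CTAS produces a plain table; TBLPROPERTIES from t are not carried over.
-- Create the copy explicitly as an ORC, bucketed, transactional table:
CREATE TABLE tt_txn (
  object_id   BIGINT,
  object_name VARCHAR(30)
)
CLUSTERED BY (object_id) INTO 256 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');

INSERT INTO TABLE tt_txn SELECT object_id, object_name FROM t;
```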

0: jdbc:hive2://rhes564:10010/default> show create table t;

+-------------------------------------------------------------+--+
|                       createtab_stmt                        |
+-------------------------------------------------------------+--+
| CREATE TABLE `t`(                                           |
|   `owner` varchar(30),                                      |
|   `object_name` varchar(30),                                |
|   `subobject_name` varchar(30),                             |
|   `object_id` bigint,                                       |
|   `data_object_id` bigint,                                  |
|   `object_type` varchar(19),                                |
|   `created` timestamp,                                      |
|   `last_ddl_time` timestamp,                                |
|   `timestamp2` varchar(19),                                 |
|   `status` varchar(7),                                      |
|   `temporary2` varchar(1),                                  |
|   `generated` varchar(1),                                   |
|   `secondary` varchar(1),                                   |
|   `namespace` bigint,                                       |
|   `edition_name` varchar(30),                               |
|   `padding1` varchar(4000),                                 |
|   `padding2` varchar(3500),                                 |
|   `attribute` varchar(32),                                  |
|   `op_type` int,                                            |
|   `op_time` timestamp,                                      |
|   `new_col` varchar(30))                                    |
| CLUSTERED BY (                                              |
|   object_id)                                                |
| INTO 256 BUCKETS                                            |
| ROW FORMAT SERDE                                            |
|   'org.apache.hadoop.hive.ql.io.orc.OrcSerde'               |
| STORED AS INPUTFORMAT                                       |
|   'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'         |
| OUTPUTFORMAT                                                |
|   'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'        |
| LOCATION                                                    |
|   'hdfs://rhes564:9000/user/hive/warehouse/asehadoop.db/t'  |
| TBLPROPERTIES (                                             |
|   'COLUMN_STATS_ACCURATE'='false',                          |
|   'last_modified_by'='hduser',                              |
|   'last_modified_time'='1449831076',                        |
|   'numFiles'='17',                                          |
|   'numRows'='-1',                                           |
|   'orc.bloom.filter.columns'='object_id',                   |
|   'orc.bloom.filter.fpp'='0.05',                            |
|   'orc.compress'='SNAPPY',                                  |
|   'orc.create.index'='true',                                |
|   'orc.row.index.stride'='10000',                           |
|   'orc.stripe.size'='268435456',                            |
|   'rawDataSize'='-1',                                       |
|   'totalSize'='64438212',                                   |
|   'transactional'='true',                                   |
|   'transient_lastDdlTime'='1449831076')                     |
+-------------------------------------------------------------+--+

49 rows selected (0.06 seconds)

Mich Talebzadeh


*From:*Elliot West [mailto:tea...@gmail.com]
*Sent:* 22 December 2015 16:57
*To:* user@hive.apache.org
*Subject:* Re: Attempt to do update or delete using transaction manager that does not support these operations. (state=42000,code=10294)

Hi,

The input/output formats do not appear to be ORC, have you tried 'stored as orc'? Additionally you'll need to set the property 'transactional=true' on the table. Do you have the original create table statement?

Cheers - Elliot.
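Building on that advice: if the table is already stored as ORC and bucketed, the property can also be added after the fact rather than recreating the table (a sketch; whether this is safe depends on the Hive version and the table's existing file layout):

```sql
-- Mark an existing ORC, bucketed table as transactional.
-- Hypothetical table name; substitute the real one.
ALTER TABLE tt SET TBLPROPERTIES ('transactional'='true');
```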

On Tuesday, 22 December 2015, Mich Talebzadeh <m...@peridale.co.uk <mailto:m...@peridale.co.uk>> wrote:

