[ https://issues.apache.org/jira/browse/HIVE-21725?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16838947#comment-16838947 ]
Hive QA commented on HIVE-21725:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12968609/HIVE-21725.02.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 43 failed/errored test(s), 16007 tests executed

*Failed tests:*

{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[input3] (batchId=90)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[show_columns] (batchId=46)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[allow_change_col_type_par_neg] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_partition_change_col_dup_col] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_partition_change_col_nonexist] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_table_constraint_duplicate_pk] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_table_constraint_invalid_fk_col1] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_table_constraint_invalid_fk_col2] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_table_constraint_invalid_fk_tbl1] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_table_constraint_invalid_fk_tbl2] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_table_constraint_invalid_pk_col] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_table_constraint_invalid_pk_tbl] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_table_constraint_invalid_ref] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[altern1] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[avro_add_column_extschema] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[column_rename1] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[column_rename2] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[column_rename4] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[disallow_incompatible_type_change_on1] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[disallow_incompatible_type_change_on2] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[drop_invalid_constraint1] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[drop_invalid_constraint2] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[drop_invalid_constraint3] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[drop_invalid_constraint4] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[hms_using_serde_alter_table_update_columns] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_reorder_columns1] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_reorder_columns1_acid] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_reorder_columns2] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_reorder_columns2_acid] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_replace_columns1] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_replace_columns1_acid] (batchId=102)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_replace_columns2] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_replace_columns2_acid] (batchId=102)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_replace_columns3] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_replace_columns3_acid] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_type_promotion1] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_type_promotion1_acid] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_type_promotion2] (batchId=101)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_type_promotion2_acid] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_type_promotion3] (batchId=100)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[orc_type_promotion3_acid] (batchId=102)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[parquet_alter_part_table_drop_columns] (batchId=101)
org.apache.hadoop.hive.ql.parse.TestReplAcidTablesBootstrapWithJsonMessage.testBootstrapAcidTablesDuringIncrementalWithConcurrentWrites (batchId=248)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/17201/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/17201/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-17201/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 43 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12968609 - PreCommit-HIVE-Build

> Break up DDLTask - extract Column and Constraint related operations
> -------------------------------------------------------------------
>
>                 Key: HIVE-21725
>                 URL: https://issues.apache.org/jira/browse/HIVE-21725
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Hive
>    Affects Versions: 3.1.1
>            Reporter: Miklos Gergely
>            Assignee: Miklos Gergely
>            Priority: Major
>              Labels: refactor-ddl
>             Fix For: 4.0.0
>
>         Attachments: HIVE-21725.01.patch, HIVE-21725.02.patch
>
>
> DDLTask is a huge class, more than 5000 lines long. The related DDLWork is also a huge class, which has a field for each DDL operation it supports. The goal is to refactor these so that everything is cut into smaller, more manageable classes under the package org.apache.hadoop.hive.ql.exec.ddl:
> * have a separate class for each operation (see the sketch below the quoted description)
> * have a package for each operation group (database ddl, table ddl, etc.), so the number of classes under a package is more manageable
> * make all the requests (DDLDesc subclasses) immutable
> * DDLTask should be agnostic to the actual operations
> * for now, ignore the fact that some operations handled by DDLTask are not actual DDL operations (lock, unlock, desc...)
> In the interim, while there are two DDLTask and DDLWork classes in the code base, the new ones in the new package are called DDLTask2 and DDLWork2, thus avoiding the use of fully qualified class names where both the old and the new classes are in use.
> Step #9: extract all the column and constraint related operations from the old DDLTask, and move each of them into its own class under the new package.
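To make the target shape described above concrete, here is a minimal, self-contained Java sketch of the intended structure: one immutable desc per operation, one small operation class per desc, and an operation-agnostic DDLTask2 that only dispatches. The class and method names (DDLDesc, DDLOperation, DDLTask2, AlterTableAddConstraintDesc, register, execute) follow the naming used in the description but are illustrative assumptions only; they are not the actual classes or APIs introduced by the attached patches.

{code:java}
// Hypothetical sketch only -- not the actual Hive classes from this patch.
import java.util.HashMap;
import java.util.Map;

public class DdlRefactorSketch {

  /** Marker for immutable request objects; one subclass per DDL operation. */
  interface DDLDesc { }

  /** Immutable desc for a single, narrowly scoped operation (hypothetical example). */
  static final class AlterTableAddConstraintDesc implements DDLDesc {
    private final String tableName;
    private final String constraintName;

    AlterTableAddConstraintDesc(String tableName, String constraintName) {
      this.tableName = tableName;
      this.constraintName = constraintName;
    }

    String getTableName() { return tableName; }
    String getConstraintName() { return constraintName; }
  }

  /** One small executor class per operation, so each unit can be tested on its own. */
  interface DDLOperation<T extends DDLDesc> {
    int execute(T desc) throws Exception;
  }

  static final class AlterTableAddConstraintOperation
      implements DDLOperation<AlterTableAddConstraintDesc> {
    @Override
    public int execute(AlterTableAddConstraintDesc desc) {
      // A real implementation would talk to the metastore here.
      System.out.println("ALTER TABLE " + desc.getTableName()
          + " ADD CONSTRAINT " + desc.getConstraintName());
      return 0;
    }
  }

  /** Operation-agnostic dispatcher: it only knows the desc-class -> operation mapping. */
  static final class DDLTask2 {
    private final Map<Class<? extends DDLDesc>, DDLOperation<?>> operations = new HashMap<>();

    <T extends DDLDesc> void register(Class<T> descClass, DDLOperation<T> operation) {
      operations.put(descClass, operation);
    }

    @SuppressWarnings("unchecked")
    <T extends DDLDesc> int execute(T desc) throws Exception {
      DDLOperation<T> op = (DDLOperation<T>) operations.get(desc.getClass());
      if (op == null) {
        throw new IllegalStateException("No operation registered for " + desc.getClass());
      }
      return op.execute(desc);
    }
  }

  public static void main(String[] args) throws Exception {
    DDLTask2 task = new DDLTask2();
    task.register(AlterTableAddConstraintDesc.class, new AlterTableAddConstraintOperation());
    task.execute(new AlterTableAddConstraintDesc("t1", "pk_t1"));
  }
}
{code}

With registry-style dispatch of this kind, adding a new DDL operation only means adding a new desc/operation pair and registering it; the dispatcher itself never grows, which is what keeps a single 5000-line class from reappearing.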