[ https://issues.apache.org/jira/browse/HIVE-22997?focusedWorklogId=407015&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-407015 ]
ASF GitHub Bot logged work on HIVE-22997:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 20/Mar/20 11:50
            Start Date: 20/Mar/20 11:50
    Worklog Time Spent: 10m
      Work Description: anishek commented on pull request #951: HIVE-22997 : Copy external table to target during Repl Dump operation
URL: https://github.com/apache/hive/pull/951#discussion_r395586086

 ##########
 File path: ql/src/java/org/apache/hadoop/hive/ql/exec/repl/ReplDumpTask.java
 ##########

 @@ -662,15 +696,37 @@ void dumpTable(String dbName, String tblName, String validTxnList, Path dbRoot,
      replLogger.tableLog(tblName, tableSpec.tableHandle.getTableType());
      if (tableSpec.tableHandle.getTableType().equals(TableType.EXTERNAL_TABLE)
          || Utils.shouldDumpMetaDataOnly(conf)) {
 -      return;
 +      return Collections.EMPTY_LIST;
      }
 -    for (ReplPathMapping replPathMapping : replPathMappings) {
 -      Task<?> copyTask = ReplCopyTask.getLoadCopyTask(
 -          tuple.replicationSpec, replPathMapping.getSrcPath(), replPathMapping.getTargetPath(), conf, false);
 -      this.addDependentTask(copyTask);
 -      LOG.info("Scheduled a repl copy task from [{}] to [{}]",
 -          replPathMapping.getSrcPath(), replPathMapping.getTargetPath());
 +    return replPathMappings;
 +  }
 +
 +  private void intitiateDataCopyTasks() throws SemanticException {
 +    Iterator<ExternalTableCopyTaskBuilder.DirCopyWork> extCopyWorkItr = work.getDirCopyIterator();
 +    List<Task<?>> childTasks = new ArrayList<>();
 +    int maxTasks = conf.getIntVar(HiveConf.ConfVars.REPL_APPROX_MAX_LOAD_TASKS);
 +    TaskTracker taskTracker = new TaskTracker(maxTasks);
 +    while (taskTracker.canAddMoreTasks() && hasMoreCopyWork()) {
 +      if (work.replPathIteratorInitialized() && extCopyWorkItr.hasNext()) {
 +        childTasks.addAll(new ExternalTableCopyTaskBuilder(work, conf).tasks(taskTracker));
 +      } else {
 +        childTasks.addAll(ReplPathMapping.tasks(work, taskTracker, conf));
 +      }
 +    }
 +    if (!childTasks.isEmpty()) {
 +      DAGTraversal.traverse(childTasks, new AddDependencyToLeaves(TaskFactory.get(work, conf)));
 +    } else {
 +      prepareReturnValues(work.getResultValues());
 +      childTasks.add(TaskFactory.get(new ReplOperationCompleteAckWork(work.getDumpAckFile()), conf));
      }

 Review comment:
   Why is the ack work added only when the child task list is empty? Since the DAG can contain more nodes, the ack work could instead be added at the end of the whole DAG. Maybe move it out of the else clause and decide based on taskTracker.canAddMoreTasks() or something similar?
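   A minimal sketch of the restructuring the reviewer seems to be suggesting, written as a method inside ReplDumpTask and assuming the fields and helpers visible in the diff above (work, conf, hasMoreCopyWork(), prepareReturnValues()); the batching loop is elided and the exact shape is an assumption, not the final patch:

   private void intitiateDataCopyTasks() throws SemanticException {
     List<Task<?>> childTasks = new ArrayList<>();
     // ... batching loop as in the diff above: fill childTasks until the
     // TaskTracker is full or no copy work remains ...

     if (hasMoreCopyWork()) {
       // Copy work remains beyond this batch: chain another ReplDumpTask
       // after this batch's leaves so the next batch picks up where we stopped.
       DAGTraversal.traverse(childTasks, new AddDependencyToLeaves(TaskFactory.get(work, conf)));
     } else {
       // Last batch: attach the ack as the final node of the DAG, rather
       // than emitting it only when childTasks happens to be empty.
       prepareReturnValues(work.getResultValues());
       Task<?> ackTask = TaskFactory.get(new ReplOperationCompleteAckWork(work.getDumpAckFile()), conf);
       if (childTasks.isEmpty()) {
         childTasks.add(ackTask);
       } else {
         DAGTraversal.traverse(childTasks, new AddDependencyToLeaves(Collections.singletonList(ackTask)));
       }
     }
     this.childTasks = childTasks;
   }

   Under this sketch the ack always becomes the last node executed, which would address the concern that a non-empty batch currently never schedules it.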
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

Issue Time Tracking
-------------------

    Worklog Id:     (was: 407015)
    Time Spent: 7h 40m  (was: 7.5h)

> Copy external table to target during Repl Dump operation
> --------------------------------------------------------
>
>                 Key: HIVE-22997
>                 URL: https://issues.apache.org/jira/browse/HIVE-22997
>             Project: Hive
>          Issue Type: Task
>            Reporter: PRAVIN KUMAR SINHA
>            Assignee: PRAVIN KUMAR SINHA
>            Priority: Major
>              Labels: pull-request-available
>         Attachments: HIVE-22997.03.patch, HIVE-22997.04.patch, HIVE-22997.1.patch, HIVE-22997.10.patch, HIVE-22997.11.patch, HIVE-22997.12.patch, HIVE-22997.13.patch, HIVE-22997.14.patch, HIVE-22997.15.patch, HIVE-22997.2.patch, HIVE-22997.4.patch, HIVE-22997.5.patch, HIVE-22997.6.patch, HIVE-22997.7.patch, HIVE-22997.8.patch, HIVE-22997.9.patch
>
>          Time Spent: 7h 40m
>  Remaining Estimate: 0h
>

--
This message was sent by Atlassian Jira
(v8.3.4#803005)