This is an automated email from the ASF dual-hosted git repository.

zhangliang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/shardingsphere.git


The following commit(s) were added to refs/heads/master by this push:
     new 857f2bdd3d0 Add ConsistencyCheckJobConfigurationChangedProcessorTest and MigrationJobConfigurationChangedProcessorTest (#37099)
857f2bdd3d0 is described below

commit 857f2bdd3d057d4abceec3c945e909070395254c
Author: Liang Zhang <[email protected]>
AuthorDate: Fri Nov 14 18:36:16 2025 +0800

    Add ConsistencyCheckJobConfigurationChangedProcessorTest and MigrationJobConfigurationChangedProcessorTest (#37099)
    
    * Add ConsistencyCheckJobConfigurationChangedProcessorTest and MigrationJobConfigurationChangedProcessorTest
    
    * Add ConsistencyCheckJobConfigurationChangedProcessorTest and MigrationJobConfigurationChangedProcessorTest
---
 AGENTS.md                                          |  10 +-
 ...yCheckJobConfigurationChangedProcessorTest.java |  40 ++++++++
 ...rationJobConfigurationChangedProcessorTest.java | 106 +++++++++++++++++++++
 3 files changed, 154 insertions(+), 2 deletions(-)

diff --git a/AGENTS.md b/AGENTS.md
index 7b97b3ebfc1..d4751083f18 100644
--- a/AGENTS.md
+++ b/AGENTS.md
@@ -47,7 +47,8 @@ Mention which topology you target, the registry used, and any compatibility cons
 
 ## AI Execution Workflow
 1. **Intake & Clarify** — restate the ask, map affected modules, confirm sandbox/approval/network constraints.
-2. **Plan & Reason** — write a multi-step plan with checkpoints (analysis, edits, tests). Align scope with release tempo (prefer incremental fixes unless told otherwise). When the user demands precise branch coverage or “minimum test” constraints, first enumerate the target branches and map each to the single test case that will cover it before touching code.
+   - After clarifying, jot down a “constraint checklist” capturing any user-specific rules (forbidden APIs/assertions, output formats, required order of operations) plus coverage targets; revisit this list before making edits.
+2. **Plan & Reason** — write a multi-step plan with checkpoints (analysis, edits, tests). Align scope with release tempo (prefer incremental fixes unless told otherwise). When the user demands precise branch coverage or “minimum test” constraints, first enumerate the target branches and map each to the single test case that will cover it before touching code, and reply with that list (or test plan) for confirmation before modifying files whenever the user explicitly asks for it. If the u [...]
 3. **Implement** — touch only necessary files, reuse abstractions, keep ASF headers.
 4. **Validate** — choose the smallest meaningful command, announce the intent before execution, summarize exit codes afterward; if blocked (sandbox, missing deps), explain what would have run and why it matters.
 5. **Report** — lead with intent, list edited files with rationale and line references, state verification results, propose next actions.
@@ -67,12 +68,15 @@ Mention which topology you target, the registry used, and any compatibility cons
 
 ## Testing Expectations
 - Use JUnit 5 + Mockito; tests mirror production packages, are named `ClassNameTest`, and assert via `assertXxxCondition`. Keep Arrange–Act–Assert, adding separators only when clarity demands.
+- Before writing code, outline how each branch/scenario will be exercised (single test vs combined, data setup strategy) so you implement the intended coverage in one pass.
 - Mock databases/time/network; instantiate simple POJOs. Reset static caches/guards between cases if production code retains global state.
+- When pipeline tests require job parameters or data-source configs, prefer constructing them via existing swapper/helpers (e.g., `YamlMigrationJobConfigurationSwapper`, `YamlPipelineDataSourceConfigurationSwapper`) instead of hand-written YAML so production parsing paths are exercised (a condensed sketch follows this hunk).
 - Keep static mocks minimal—only stub SPI/static calls actually reached by the scenario to avoid `UnnecessaryStubbingException`.
 - Jacoco workflow: `./mvnw -pl {module} -am -Djacoco.skip=false test jacoco:report`, then inspect `{module}/target/site/jacoco/index.html`. Aggregator modules require testing concrete submodules before running `jacoco:report`. When Jacoco fails, describe uncovered branches and the new tests that cover them.
 - Static / constructor mocking: prefer `@ExtendWith(AutoMockExtension.class)` with `@StaticMockSettings`/`@ConstructionMockSettings`; avoid manual `mockStatic`/`mockConstruction`. Ensure the module `pom.xml` has the `shardingsphere-test-infra-framework` test dependency before using these annotations (see the sketch after this file's diff).
 - For coverage gating, run `./mvnw test jacoco:check@jacoco-check -Pcoverage-check` and report results. If code is truly unreachable, cite file/line and explain why, noting whether cleanup is recommended.
 - When a request calls for “minimal branch coverage” or “each branch appears only once,” list every branch up front, map each to a single test, and explicitly document any uncovered branches (file, line, reason) to avoid redundant cases.
+- If the user bans specific assertions/tools (e.g., “don’t use `assertEquals`”), add that rule to your test plan, avoid the disallowed API during implementation, and run a quick search (e.g., `rg assertEquals`) before finishing to ensure compliance.
 
 ### Test Auto-Directives
 When a task requires tests, automatically:
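A condensed sketch of the swapper-first construction recommended in the bullet above, distilled from createJobParameter() in the new MigrationJobConfigurationChangedProcessorTest added by this commit. The migrationJobConfig variable is a placeholder for a fully populated MigrationJobConfiguration; the complete, compilable version with imports appears in that file's diff below:

    // Marshal a real MigrationJobConfiguration through the production swapper instead of
    // hand-writing YAML, so the job parameter exercises the real parsing path.
    private String createJobParameter(final MigrationJobConfiguration jobConfig) {
        return YamlEngine.marshal(new YamlMigrationJobConfigurationSwapper().swapToYamlConfiguration(jobConfig));
    }
    
    // Usage: feed the marshalled parameter to the mocked ElasticJob configuration.
    JobConfiguration elasticJobConfig = mock(JobConfiguration.class);
    when(elasticJobConfig.getJobParameter()).thenReturn(createJobParameter(migrationJobConfig));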
@@ -101,6 +105,7 @@ When a task requires tests, automatically:
 | Sandbox/network block | Command denied due to sandbox/dependency fetch | State attempted command + purpose, request approval or alternative plan |
 
 - When touching a single module/class, prefer the narrowest Maven command such as `./mvnw -pl <module> -am -Dspotless.skip=true -DskipITs -Dtest=TargetTest test` (or an equivalent) for fast feedback, and cite the exact command in the report.
+- Avoid running `-Dtest=Pattern` from the repo root unless you are certain the selected modules contain matching test classes; otherwise Surefire fails fast. If in doubt, run the module’s full suite (preferred) or add `-Dsurefire.failIfNoSpecifiedTests=false`.
 
 ## Compatibility, Performance & External Systems
 - **Database/protocol support:** note targeted engines (MySQL 5.7/8.0, PostgreSQL 13+, openGauss, etc.) and ensure new behavior stays backward compatible; link to affected dialect files.
@@ -170,7 +175,8 @@ When a task requires tests, automatically:
 4. Does run/triage information cite real file paths plus log/config snippets?
 5. Does the report list touched files, verification results, known risks, and recommended next steps?
 6. For new or updated tests, did you inspect the target production code paths, enumerate the branches being covered, and explain that in your answer?
-7. Before finishing, did you re-check the latest verification command succeeded (rerun if needed) so the final state is green?
+7. Have you enforced every user-specific constraint on the checklist (e.g., forbidden assertions), including a final search/inspection to confirm compliance?
+8. Before finishing, did you re-check the latest verification command succeeded (rerun if needed) so the final state is green, and were the commands scoped to the smallest necessary modules to avoid redundant reruns?
 
 ## Brevity & Signal
 - Prefer tables/bullets over prose walls; cite file paths (`kernel/src/...`) directly.
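For the static/constructor-mocking bullet in the Testing Expectations hunk above, a minimal sketch of the annotation-driven style it prescribes. The import paths for AutoMockExtension and StaticMockSettings are assumed from the shardingsphere-test-infra-framework module and should be verified before copying; the SPI interface reuses one from this commit's tests:

    import org.apache.shardingsphere.data.pipeline.core.metadata.node.config.processor.JobConfigurationChangedProcessor;
    import org.apache.shardingsphere.infra.spi.type.typed.TypedSPILoader;
    import org.apache.shardingsphere.test.mock.AutoMockExtension;
    import org.apache.shardingsphere.test.mock.StaticMockSettings;
    import org.junit.jupiter.api.Test;
    import org.junit.jupiter.api.extension.ExtendWith;
    
    import static org.hamcrest.CoreMatchers.is;
    import static org.hamcrest.MatcherAssert.assertThat;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;
    
    // The extension opens a static mock of TypedSPILoader before each test and closes it
    // afterward, replacing a manual try-with-resources mockStatic block.
    @ExtendWith(AutoMockExtension.class)
    @StaticMockSettings(TypedSPILoader.class)
    @SuppressWarnings("rawtypes")
    class StaticMockStyleSketchTest {
        
        @Test
        void assertStubbedServiceLookup() {
            JobConfigurationChangedProcessor expected = mock(JobConfigurationChangedProcessor.class);
            when(TypedSPILoader.getService(JobConfigurationChangedProcessor.class, "MIGRATION")).thenReturn(expected);
            assertThat(TypedSPILoader.getService(JobConfigurationChangedProcessor.class, "MIGRATION"), is(expected));
        }
    }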
diff --git a/kernel/data-pipeline/scenario/consistency-check/src/test/java/org/apache/shardingsphere/data/pipeline/scenario/consistencycheck/metadata/processor/ConsistencyCheckJobConfigurationChangedProcessorTest.java b/kernel/data-pipeline/scenario/consistency-check/src/test/java/org/apache/shardingsphere/data/pipeline/scenario/consistencycheck/metadata/processor/ConsistencyCheckJobConfigurationChangedProcessorTest.java
new file mode 100644
index 00000000000..bace94df822
--- /dev/null
+++ b/kernel/data-pipeline/scenario/consistency-check/src/test/java/org/apache/shardingsphere/data/pipeline/scenario/consistencycheck/metadata/processor/ConsistencyCheckJobConfigurationChangedProcessorTest.java
@@ -0,0 +1,40 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.shardingsphere.data.pipeline.scenario.consistencycheck.metadata.processor;
+
+import org.apache.shardingsphere.data.pipeline.core.metadata.node.config.processor.JobConfigurationChangedProcessor;
+import org.apache.shardingsphere.data.pipeline.scenario.consistencycheck.ConsistencyCheckJob;
+import org.apache.shardingsphere.data.pipeline.scenario.consistencycheck.config.ConsistencyCheckJobConfiguration;
+import org.apache.shardingsphere.infra.spi.type.typed.TypedSPILoader;
+import org.junit.jupiter.api.Test;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.isA;
+import static org.mockito.Mockito.mock;
+
+class ConsistencyCheckJobConfigurationChangedProcessorTest {
+    
+    @SuppressWarnings("rawtypes")
+    private final JobConfigurationChangedProcessor processor = TypedSPILoader.getService(JobConfigurationChangedProcessor.class, "CONSISTENCY_CHECK");
+    
+    @SuppressWarnings("unchecked")
+    @Test
+    void assertCreateJobAndGetType() {
+        assertThat(processor.createJob(mock(ConsistencyCheckJobConfiguration.class)), isA(ConsistencyCheckJob.class));
+    }
+}
diff --git a/kernel/data-pipeline/scenario/migration/core/src/test/java/org/apache/shardingsphere/data/pipeline/scenario/migration/metadata/processor/MigrationJobConfigurationChangedProcessorTest.java b/kernel/data-pipeline/scenario/migration/core/src/test/java/org/apache/shardingsphere/data/pipeline/scenario/migration/metadata/processor/MigrationJobConfigurationChangedProcessorTest.java
new file mode 100644
index 00000000000..e4df8df0217
--- /dev/null
+++ b/kernel/data-pipeline/scenario/migration/core/src/test/java/org/apache/shardingsphere/data/pipeline/scenario/migration/metadata/processor/MigrationJobConfigurationChangedProcessorTest.java
@@ -0,0 +1,106 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.shardingsphere.data.pipeline.scenario.migration.metadata.processor;
+
+import org.apache.shardingsphere.data.pipeline.api.PipelineDataSourceConfiguration;
+import org.apache.shardingsphere.data.pipeline.api.type.StandardPipelineDataSourceConfiguration;
+import org.apache.shardingsphere.data.pipeline.core.datanode.JobDataNodeEntry;
+import org.apache.shardingsphere.data.pipeline.core.datanode.JobDataNodeLine;
+import org.apache.shardingsphere.data.pipeline.core.metadata.node.config.processor.JobConfigurationChangedProcessor;
+import org.apache.shardingsphere.data.pipeline.core.preparer.incremental.IncrementalTaskPositionManager;
+import org.apache.shardingsphere.data.pipeline.scenario.migration.MigrationJob;
+import org.apache.shardingsphere.data.pipeline.scenario.migration.config.MigrationJobConfiguration;
+import org.apache.shardingsphere.data.pipeline.scenario.migration.config.yaml.swapper.YamlMigrationJobConfigurationSwapper;
+import org.apache.shardingsphere.database.connector.core.type.DatabaseType;
+import org.apache.shardingsphere.elasticjob.api.JobConfiguration;
+import org.apache.shardingsphere.infra.datanode.DataNode;
+import org.apache.shardingsphere.infra.spi.type.typed.TypedSPILoader;
+import org.apache.shardingsphere.infra.util.yaml.YamlEngine;
+import org.junit.jupiter.api.Test;
+import org.mockito.MockedConstruction;
+
+import java.sql.SQLException;
+import java.util.Collections;
+import java.util.LinkedHashMap;
+import java.util.Map;
+import java.util.concurrent.atomic.AtomicInteger;
+
+import static org.hamcrest.CoreMatchers.is;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.isA;
+import static org.junit.jupiter.api.Assertions.assertDoesNotThrow;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.eq;
+import static org.mockito.Mockito.doThrow;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.mockConstruction;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.when;
+
+class MigrationJobConfigurationChangedProcessorTest {
+    
+    private final DatabaseType databaseType = TypedSPILoader.getService(DatabaseType.class, "Fixture");
+    
+    @SuppressWarnings("rawtypes")
+    private final JobConfigurationChangedProcessor processor = TypedSPILoader.getService(JobConfigurationChangedProcessor.class, "MIGRATION");
+    
+    @SuppressWarnings("unchecked")
+    @Test
+    void assertCreateJob() {
+        assertThat(processor.createJob(mock(MigrationJobConfiguration.class)), isA(MigrationJob.class));
+    }
+    
+    @Test
+    void assertClean() throws SQLException {
+        JobConfiguration jobConfig = mock(JobConfiguration.class);
+        when(jobConfig.getJobParameter()).thenReturn(createJobParameter());
+        AtomicInteger constructionIndex = new AtomicInteger();
+        try (
+                MockedConstruction<IncrementalTaskPositionManager> mockedConstruction = mockConstruction(IncrementalTaskPositionManager.class,
+                        (mock, context) -> {
+                            if (1 == constructionIndex.getAndIncrement()) {
+                                doThrow(SQLException.class).when(mock).destroyPosition(eq("job-branches"), any(PipelineDataSourceConfiguration.class));
+                            }
+                        })) {
+            assertDoesNotThrow(() -> processor.clean(jobConfig));
+            assertThat(mockedConstruction.constructed().size(), is(2));
+            verify(mockedConstruction.constructed().get(0)).destroyPosition(eq("job-branches"), any(PipelineDataSourceConfiguration.class));
+            verify(mockedConstruction.constructed().get(1)).destroyPosition(eq("job-branches"), any(PipelineDataSourceConfiguration.class));
+        }
+    }
+    
+    private String createJobParameter() {
+        Map<String, PipelineDataSourceConfiguration> sources = new LinkedHashMap<>(2, 1F);
+        sources.put("ds_0", createPipelineDataSourceConfiguration("source_db_0"));
+        sources.put("ds_1", createPipelineDataSourceConfiguration("source_db_1"));
+        PipelineDataSourceConfiguration target = createPipelineDataSourceConfiguration("target_db");
+        JobDataNodeLine jobDataNodeLine = new JobDataNodeLine(Collections.singletonList(new JobDataNodeEntry("t_order", Collections.singletonList(new DataNode("ds_0.t_order")))));
+        MigrationJobConfiguration jobConfig = new MigrationJobConfiguration("job-branches", "logic_db", databaseType, databaseType, sources, target,
+                Collections.singletonList("t_order"), Collections.singletonMap("t_order", "public"), jobDataNodeLine, Collections.singletonList(jobDataNodeLine), 1, 1);
+        return YamlEngine.marshal(new YamlMigrationJobConfigurationSwapper().swapToYamlConfiguration(jobConfig));
+    }
+    
+    private PipelineDataSourceConfiguration createPipelineDataSourceConfiguration(final String databaseName) {
+        Map<String, Object> props = new LinkedHashMap<>(4, 1F);
+        props.put("url", String.format("jdbc:mysql://localhost:3306/%s", 
databaseName));
+        props.put("username", "root");
+        props.put("password", "pwd");
+        props.put("dataSourceClassName", "com.zaxxer.hikari.HikariDataSource");
+        return new StandardPipelineDataSourceConfiguration(props);
+    }
+}
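A note on the MockedConstruction pattern in assertClean above: the AtomicInteger distinguishes the first construction from the second so that only the second manager's destroyPosition throws. Mockito's MockedConstruction.Context offers a built-in alternative, getCount(), which is 1-based; a sketch of the same initializer without the external counter, assuming the same imports as the test above:

    // Stub only the second constructed manager; Context.getCount() returns 1 for the
    // first construction and 2 for the second, so no AtomicInteger is needed.
    try (
            MockedConstruction<IncrementalTaskPositionManager> mockedConstruction = mockConstruction(IncrementalTaskPositionManager.class,
                    (mock, context) -> {
                        if (2 == context.getCount()) {
                            doThrow(SQLException.class).when(mock).destroyPosition(eq("job-branches"), any(PipelineDataSourceConfiguration.class));
                        }
                    })) {
        assertDoesNotThrow(() -> processor.clean(jobConfig));
    }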
