This is an automated email from the ASF dual-hosted git repository.
zhangliang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/shardingsphere.git
The following commit(s) were added to refs/heads/master by this push:
new 323a667010f Add PostgreSQLAggregatedCommandPacketTest (#37129)
323a667010f is described below
commit 323a667010f216ca91df62b0c8fdaf6b1f0061c7
Author: Liang Zhang <[email protected]>
AuthorDate: Mon Nov 17 21:56:48 2025 +0800
Add PostgreSQLAggregatedCommandPacketTest (#37129)
---
CLAUDE.md | 82 ++++++++++++
.../PostgreSQLAggregatedCommandPacketTest.java | 142 +++++++++++++++++++++
2 files changed, 224 insertions(+)
diff --git a/CLAUDE.md b/CLAUDE.md
index d9f638d6e4a..70d69f4ae7b 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -132,12 +132,49 @@ Report format:
Let me confirm before skipping coverage requirements for these codes.
```
+**SQL Generation Class Testing**:
+```
+Please write comprehensive tests for [SQLGeneratorClass] in [module], with the following requirements:
+1. Use efficient test development workflow:
+ - Analyze existing test Mock patterns first (DatabaseTypedSPILoader + TypedSPILoader)
+ - Write most complex scenario test first to verify feasibility
+ - Run test to get actual SQL output, then correct expected values
+ - Batch copy verified patterns to other simple scenarios
+
+2. SQL syntax verification:
+ - For Oracle: verify MERGE INTO, ROWNUM, NVL syntax formats
+ - For MySQL: verify LIMIT, IFNULL, REPLACE syntax formats
+ - Use database-specific official docs for syntax validation
+
+3. Mock dependency reuse:
+ - 100% reuse existing test Mock configuration patterns
+ - Avoid redesigning Mock chains for SPI loaders
+ - Use DatabaseTypedSPILoader.getService(DialectPipelineSQLBuilder.class, databaseType) pattern
+
+4. Branch coverage strategy:
+ - Merge simple methods into single tests where possible
+ - Focus independent tests on truly complex conditional branches
+ - Ensure each boolean/enum branch has at least one dedicated test
+
+5. Quality assurance:
+ - Run complete test suite in one batch after all tests written
+ - Use Jacoco to verify 100% branch coverage
+ - Minimize iterative modifications during development
+```
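The run-then-pin loop in requirement 1 can be sketched in plain Java with a toy generator (`ValidationFirstDemo` and its SQL shape are hypothetical illustrations, not ShardingSphere code): run the test once to see the actual SQL, then pin the observed output as the expected value.

```java
// Toy stand-in for a dialect SQL generator; names and SQL shape are illustrative only.
final class ValidationFirstDemo {
    
    static String buildPageQuery(final String table, final int offset, final int rows) {
        // MySQL-style paging; a real builder would branch per dialect.
        return "SELECT * FROM " + table + " LIMIT " + offset + ", " + rows;
    }
    
    public static void main(final String[] args) {
        // Step 1: run once and print the actual output.
        String actual = buildPageQuery("t_order", 10, 20);
        System.out.println(actual);
        // Step 2: pin the observed output as the expected value in the test.
        String expected = "SELECT * FROM t_order LIMIT 10, 20";
        if (!expected.equals(actual)) {
            throw new AssertionError(actual);
        }
    }
}
```

Once the most complex scenario passes this way, the same expected-value pinning can be batch-copied to the simpler scenarios.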
+
#### 3. Key Points for Best Results
- **Provide complete context**: Tell me the specific class path, module information, and related dependencies
- **Clarify complexity**: If the class is particularly complex, specify which part to test first
- **Progressive development**: For complex features, request step-by-step implementation
- **Quality verification**: Ask me to run actual test commands to verify pass rates
+#### 4. Critical Success Factors for Testing Tasks
+- **Pattern Analysis First**: Always analyze existing test patterns before writing new tests
+- **Validation-Driven Development**: Write tests to discover actual behavior before defining expectations
+- **Mock Reuse Principle**: Never redesign Mock configurations when existing patterns work
+- **Branch Efficiency**: Focus on conditional branches, not method count, for test coverage
+- **Batch Validation**: Run complete test suites at once, avoid frequent incremental checks
+
## 🤖 AI Usage Guidelines
### Core Principles
@@ -415,12 +452,37 @@ public final class ${SPIName}Impl implements ${SPIName}SPI {
6. **Complete Validation** → Ensure all quality checks pass
### Test Task Steps
+
+#### Standard Testing Workflow
1. **Analyze Test Scenarios** → Identify branches that need testing
2. **Mock Configuration** → Use Mock Configuration Template
3. **Write Tests** → Apply Test Method Template
4. **Verify Coverage** → Ensure complete branch coverage
5. **Assertion Validation** → Use correct assertion patterns
+#### SQL Generation Class Workflow (High-Efficiency)
+**Phase 1: Comprehensive Analysis (One-time)**
+```
+Task agent analysis should include:
+- Complete class structure and method listing
+- Complexity and branch count for each method
+- All existing test patterns and configurations
+- Dependency relationships and Mock requirements
+- Identification of any non-coverable code
+```
+
+**Phase 2: Validation-First Development**
+1. **Select most complex scenario** and write one test first
+2. **Verify Mock configuration and syntax expectations** by running the test
+3. **Batch copy verified patterns** to other simple scenarios
+4. **Write dedicated tests only for truly complex conditional branches**
+
+**Phase 3: Quality Assurance**
+- Run complete test suite and coverage checks in one batch
+- Avoid frequent iterative modifications
+- Use Jacoco to verify 100% branch coverage
+
### Documentation Task Steps
1. **Content Review** → Check accuracy and formatting
2. **Link Validation** → Ensure all links are valid
@@ -544,6 +606,26 @@ error_recovery_index:
## 🛠️ Common Issues & Solutions
+### SQL Generation Class Issues
+
+#### Expected SQL Syntax Errors
+- **Issue**: Expected SQL assertion doesn't match actual generated SQL
+- **Solution**: Run test first to get actual output, then correct expected values
+- **Prevention**: Use database official documentation to verify syntax formats
+- **Oracle Common Issues**: MERGE INTO ON clause format, ROWNUM positioning, NVL parameter order
+
+#### Mock Configuration Complexity
+- **Issue**: Over-engineering Mock configurations for SPI loaders
+- **Solution**: 100% reuse existing test Mock patterns instead of redesigning
+- **Pattern**: Use `DatabaseTypedSPILoader.getService(DialectPipelineSQLBuilder.class, databaseType)` + `TypedSPILoader.getService(DatabaseType.class, "Oracle")` combination
+- **Prevention**: Analyze existing tests completely before writing new ones
+
+#### Branch Coverage Strategy Inefficiency
+- **Issue**: Writing too many granular tests for simple methods
+- **Solution**: Merge simple method tests, focus independent tests on complex branches only
+- **Guideline**: One test per conditional branch, not one test per method
+- **Example**: Test all simple SQL formatting methods in one focused test
+
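The one-test-per-branch guideline above can be illustrated with a toy formatter (`SqlFormatDemo` and its helpers are hypothetical, not project code): three branch-free methods share one combined check, so only genuinely conditional logic would earn a dedicated test.

```java
// Toy formatter with simple, branch-free helpers; names are illustrative only.
final class SqlFormatDemo {
    
    static String quote(final String identifier) {
        return "\"" + identifier + "\"";
    }
    
    static String lowerKeyword(final String keyword) {
        return keyword.toLowerCase();
    }
    
    static String withSchema(final String schema, final String table) {
        return schema + "." + table;
    }
    
    public static void main(final String[] args) {
        // One focused test covers all three trivial helpers instead of three separate tests.
        if (!"\"t_order\"".equals(quote("t_order"))
                || !"select".equals(lowerKeyword("SELECT"))
                || !"public.t_order".equals(withSchema("public", "t_order"))) {
            throw new AssertionError("simple formatting methods failed");
        }
        System.out.println("ok");
    }
}
```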
### Coverage Problems
- **Issue**: Mock configuration incomplete, branches not executed
- **Solution**: Use MockedConstruction, create dedicated test methods for each branch
diff --git a/database/protocol/dialect/postgresql/src/test/java/org/apache/shardingsphere/database/protocol/postgresql/packet/command/query/extended/PostgreSQLAggregatedCommandPacketTest.java b/database/protocol/dialect/postgresql/src/test/java/org/apache/shardingsphere/database/protocol/postgresql/packet/command/query/extended/PostgreSQLAggregatedCommandPacketTest.java
new file mode 100644
index 00000000000..1af164c64a8
--- /dev/null
+++ b/database/protocol/dialect/postgresql/src/test/java/org/apache/shardingsphere/database/protocol/postgresql/packet/command/query/extended/PostgreSQLAggregatedCommandPacketTest.java
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.shardingsphere.database.protocol.postgresql.packet.command.query.extended;
+
+import org.apache.shardingsphere.database.protocol.postgresql.packet.command.query.extended.bind.PostgreSQLComBindPacket;
+import org.apache.shardingsphere.database.protocol.postgresql.packet.command.query.extended.execute.PostgreSQLComExecutePacket;
+import org.apache.shardingsphere.database.protocol.postgresql.packet.command.query.extended.parse.PostgreSQLComParsePacket;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+
+import java.util.Arrays;
+
+import static org.hamcrest.CoreMatchers.is;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.junit.jupiter.api.Assertions.assertDoesNotThrow;
+import static org.junit.jupiter.api.Assertions.assertFalse;
+import static org.junit.jupiter.api.Assertions.assertTrue;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.when;
+
+@ExtendWith(MockitoExtension.class)
+class PostgreSQLAggregatedCommandPacketTest {
+
+    @Mock
+    private PostgreSQLComParsePacket parsePacket;
+
+    @Mock
+    private PostgreSQLComBindPacket bindPacket;
+
+    @Mock
+    private PostgreSQLComExecutePacket executePacket;
+
+    @Test
+    void assertNewInstanceWithMultipleParsePackets() {
+        when(parsePacket.getStatementId()).thenReturn("parse_statement");
+        PostgreSQLAggregatedCommandPacket actual = new PostgreSQLAggregatedCommandPacket(Arrays.asList(parsePacket, mock(PostgreSQLComParsePacket.class)));
+        assertThat(actual.getPackets().size(), is(2));
+        assertFalse(actual.isContainsBatchedStatements());
+        assertThat(actual.getBatchPacketBeginIndex(), is(-1));
+        assertThat(actual.getBatchPacketEndIndex(), is(-1));
+    }
+
+    @Test
+    void assertNewInstanceWithDifferentStatementIdsInParseAndBind() {
+        when(parsePacket.getStatementId()).thenReturn("parse_statement");
+        when(bindPacket.getStatementId()).thenReturn("bind_statement");
+        PostgreSQLAggregatedCommandPacket actual = new PostgreSQLAggregatedCommandPacket(Arrays.asList(parsePacket, bindPacket));
+        assertThat(actual.getPackets().size(), is(2));
+        assertFalse(actual.isContainsBatchedStatements());
+        assertThat(actual.getBatchPacketBeginIndex(), is(1));
+        assertThat(actual.getBatchPacketEndIndex(), is(-1));
+    }
+
+    @Test
+    void assertNewInstanceWithDifferentPortalsInBindAndExecute() {
+        when(bindPacket.getStatementId()).thenReturn("parse_statement");
+        when(bindPacket.getPortal()).thenReturn("portal_1");
+        when(executePacket.getPortal()).thenReturn("portal_2");
+        PostgreSQLAggregatedCommandPacket actual = new PostgreSQLAggregatedCommandPacket(Arrays.asList(bindPacket, executePacket));
+        assertThat(actual.getPackets().size(), is(2));
+        assertFalse(actual.isContainsBatchedStatements());
+        assertThat(actual.getBatchPacketBeginIndex(), is(0));
+        assertThat(actual.getBatchPacketEndIndex(), is(1));
+    }
+
+    @Test
+    void assertNewInstanceWithValidBatchPackets() {
+        when(parsePacket.getStatementId()).thenReturn("parse_statement");
+        when(bindPacket.getStatementId()).thenReturn("parse_statement");
+        when(bindPacket.getPortal()).thenReturn("portal_1");
+        when(executePacket.getPortal()).thenReturn("portal_1");
+        PostgreSQLComBindPacket mockBindPacket2 = mock(PostgreSQLComBindPacket.class);
+        PostgreSQLComBindPacket mockBindPacket3 = mock(PostgreSQLComBindPacket.class);
+        PostgreSQLComExecutePacket mockExecutePacket2 = mock(PostgreSQLComExecutePacket.class);
+        PostgreSQLComExecutePacket mockExecutePacket3 = mock(PostgreSQLComExecutePacket.class);
+        when(mockBindPacket2.getStatementId()).thenReturn("parse_statement");
+        when(mockBindPacket2.getPortal()).thenReturn("portal_1");
+        when(mockBindPacket3.getStatementId()).thenReturn("parse_statement");
+        when(mockBindPacket3.getPortal()).thenReturn("portal_1");
+        when(mockExecutePacket2.getPortal()).thenReturn("portal_1");
+        when(mockExecutePacket3.getPortal()).thenReturn("portal_1");
+        PostgreSQLAggregatedCommandPacket actual = new PostgreSQLAggregatedCommandPacket(
+                Arrays.asList(parsePacket, bindPacket, mockBindPacket2, mockBindPacket3, executePacket, mockExecutePacket2, mockExecutePacket3));
+        assertThat(actual.getPackets().size(), is(7));
+        assertTrue(actual.isContainsBatchedStatements());
+        assertThat(actual.getBatchPacketBeginIndex(), is(1));
+        assertThat(actual.getBatchPacketEndIndex(), is(6));
+    }
+
+    @Test
+    void assertNewInstanceWithDifferentPortalsInBinds() {
+        when(bindPacket.getStatementId()).thenReturn("parse_statement");
+        when(bindPacket.getPortal()).thenReturn("portal_1");
+        PostgreSQLComBindPacket mockBindPacket2 = mock(PostgreSQLComBindPacket.class);
+        when(mockBindPacket2.getStatementId()).thenReturn("parse_statement");
+        when(mockBindPacket2.getPortal()).thenReturn("portal_2");
+        PostgreSQLAggregatedCommandPacket actual = new PostgreSQLAggregatedCommandPacket(Arrays.asList(bindPacket, mockBindPacket2));
+        assertThat(actual.getPackets().size(), is(2));
+        assertFalse(actual.isContainsBatchedStatements());
+        assertThat(actual.getBatchPacketBeginIndex(), is(0));
+        assertThat(actual.getBatchPacketEndIndex(), is(-1));
+    }
+
+    @Test
+    void assertNewInstanceWithBindThenParseWithSameStatementId() {
+        when(bindPacket.getStatementId()).thenReturn("parse_statement");
+        when(bindPacket.getPortal()).thenReturn("portal_1");
+        when(parsePacket.getStatementId()).thenReturn("parse_statement");
+        PostgreSQLAggregatedCommandPacket actual = new PostgreSQLAggregatedCommandPacket(Arrays.asList(bindPacket, parsePacket));
+        assertThat(actual.getPackets().size(), is(2));
+        assertFalse(actual.isContainsBatchedStatements());
+        assertThat(actual.getBatchPacketBeginIndex(), is(0));
+        assertThat(actual.getBatchPacketEndIndex(), is(-1));
+    }
+
+    @Test
+    void assertWrite() {
+        assertDoesNotThrow(() -> new PostgreSQLAggregatedCommandPacket(Arrays.asList(bindPacket, parsePacket)).write(mock()));
+    }
+
+    @Test
+    void assertGetIdentifier() {
+        assertThat(new PostgreSQLAggregatedCommandPacket(Arrays.asList(bindPacket, parsePacket)).getIdentifier().getValue(), is('?'));
+    }
+}