kaxil commented on code in PR #61077:
URL: https://github.com/apache/airflow/pull/61077#discussion_r2886825571


##########
airflow-core/tests/unit/cli/commands/test_dag_command.py:
##########
@@ -976,6 +977,7 @@ class TestCliDagsReserialize:
         "bundle1": TEST_DAGS_FOLDER / "test_example_bash_operator.py",
         "bundle2": TEST_DAGS_FOLDER / "test_sensor.py",
         "bundle3": TEST_DAGS_FOLDER / "test_dag_with_no_tags.py",
+        "bundle4": TEST_DAGS_FOLDER / "test_dag_reserialize.py",

Review Comment:
   Adding `bundle4` to the shared class-level `test_bundles_config` means 
`test_reserialize` now also needs to assert on the new DAG. This couples the 
tests — if the test DAG file is renamed or removed, two tests break instead of 
one.
   
   Consider using a local bundles config in 
`test_reserialize_should_make_equal_hash_with_dag_processor` instead:
   
   ```python
   def test_reserialize_should_make_equal_hash_with_dag_processor(self, configure_dag_bundles, session):
       bundles = {"bundle_reserialize": TEST_DAGS_FOLDER / "test_dag_reserialize.py"}
       with configure_dag_bundles(bundles):
           dag_command.dag_reserialize(
               self.parser.parse_args(["dags", "reserialize", "--bundle-name", "bundle_reserialize"])
           )
       ...
   ```
   
   That way you can also revert the assertion change on lines 997-1001.



##########
airflow-core/tests/unit/cli/commands/test_dag_command.py:
##########
@@ -25,6 +25,7 @@
 from unittest import mock
 from unittest.mock import MagicMock
 
+import msgspec
 import pendulum

Review Comment:
   nit: This import is only used in `test_reserialize_should_make_equal_hash_with_dag_processor`. Consider moving it into that test method to keep this file's module-level imports clean.
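A minimal sketch of the function-scoped import pattern, using stdlib `json` as a stand-in for `msgspec` (the test name here is hypothetical; the same shape applies to the real test):

```python
def test_round_trip_with_scoped_import():
    # The dependency is imported only when this test runs, keeping the
    # module namespace free of test-specific imports.
    import json  # stand-in for msgspec in this sketch

    payload = {"dag_id": "example_dag", "fileloc": "/dags/example_dag.py"}
    assert json.loads(json.dumps(payload)) == payload
```

The import cost is paid only when the test actually executes, and readers of the module header no longer see a dependency that is used in a single place.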



##########
airflow-core/tests/unit/dags/test_dag_reserialize.py:
##########
@@ -0,0 +1,37 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from datetime import datetime
+from time import sleep
+
+from airflow.providers.standard.operators.python import PythonOperator
+from airflow.sdk import DAG
+
+
+def sleep_task():
+    sleep(1)
+

Review Comment:
   The `sleep(1)` serves no purpose for testing serialization hash consistency. 
A no-op would suffice:
   
   ```suggestion
   def noop_task():
       pass
   ```
   
   (and drop the `from time import sleep` import)



##########
airflow-core/tests/unit/cli/commands/test_dag_command.py:
##########
@@ -1020,3 +1027,32 @@ def test_reserialize_should_support_multiple_bundle_name_arguments(self, configu
 
         serialized_dag_ids = set(session.execute(select(SerializedDagModel.dag_id)).scalars())
         assert serialized_dag_ids == {"test_example_bash_operator", "test_sensor"}
+
+    @conf_vars({("core", "load_examples"): "false"})
+    def test_reserialize_should_make_equal_hash_with_dag_processor(self, configure_dag_bundles, session):
+        from airflow.dag_processing.processor import DagFileParsingResult, DagFileProcessorProcess
+        from airflow.sdk.execution_time.comms import _RequestFrame, _ResponseFrame
+        from airflow.serialization.serialized_objects import DagSerialization, LazyDeserializedDAG
+
+        with configure_dag_bundles(self.test_bundles_config):
+            dag_command.dag_reserialize(
+                self.parser.parse_args(["dags", "reserialize", "--bundle-name", "bundle4"])
+            )
+

Review Comment:
   This test manually reconstructs the msgpack frame protocol 
(`_ResponseFrame`, `_RequestFrame`, frame slicing) to simulate what the dag 
processor does. This works, but it's tightly coupled to internal comms protocol 
details.
   
   A simpler unit test on the hash function directly would be more robust:
   
   ```python
   def test_sort_handles_tuples_same_as_lists(self):
       data_with_tuple = {"dag": {"task_group": {"children": {"bear": ("operator", "bear")}}}}
       data_with_list = {"dag": {"task_group": {"children": {"bear": ["operator", "bear"]}}}}
       assert SerializedDagModel.hash(data_with_tuple) == SerializedDagModel.hash(data_with_list)
   ```
   
   That said, if you want to keep the integration-style test to prove the exact 
production scenario end-to-end, that's reasonable too — just noting the 
trade-off.
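For intuition, here is a self-contained toy stand-in for the hash check (a sketch only: it assumes a canonical-JSON-plus-digest scheme, which demonstrates the behavior the suggested assertion expects, not Airflow's actual implementation). Since `json.dumps` writes tuples and lists with identical syntax, the two payloads digest equally by construction:

```python
import hashlib
import json


def toy_dag_hash(data: dict) -> str:
    # Toy stand-in for SerializedDagModel.hash: canonical JSON, then md5.
    # json.dumps encodes tuples and lists identically, so containers that
    # differ only in tuple-vs-list type produce the same digest.
    return hashlib.md5(json.dumps(data, sort_keys=True).encode()).hexdigest()


data_with_tuple = {"dag": {"task_group": {"children": {"bear": ("operator", "bear")}}}}
data_with_list = {"dag": {"task_group": {"children": {"bear": ["operator", "bear"]}}}}
assert toy_dag_hash(data_with_tuple) == toy_dag_hash(data_with_list)
```

The real production hash goes through Airflow's serialization layer, which is exactly why the tuple/list normalization being tested matters there.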



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
