harshmotw-db commented on code in PR #49450:
URL: https://github.com/apache/spark/pull/49450#discussion_r1912748974


##########
python/pyspark/sql/tests/test_types.py:
##########
@@ -2240,6 +2240,11 @@ def test_variant_type(self):
             PySparkValueError, lambda: str(VariantVal(bytes([32, 10, 1, 0, 0, 0]), metadata))
         )
 
+        # check parse_json
+        for key, json_str, obj in expected_values:
+            self.assertEqual(VariantVal.parseJson(json_str).toJson(), json_str)
+            self.assertEqual(VariantVal.parseJson(json_str).toPython(), obj)

Review Comment:
   ```suggestion
           for key, json, obj in expected_values:
               self.assertEqual(VariantVal.parseJson(json).toJson(), json)
               self.assertEqual(VariantVal.parseJson(json).toPython(), obj)
   
           parse_json_spark_output = variants[0]
           parse_json_python_output = VariantVal.parseJson(json_str)
           self.assertEqual(parse_json_spark_output.value, parse_json_python_output.value)
           self.assertEqual(parse_json_spark_output.metadata, parse_json_python_output.metadata)
   ```
   
   This added test also confirms that the Variant binaries generated by the Python parse_json match the Spark Scala implementation. I've confirmed locally that it passes (and fails without the `apend_int` fix). Note that `json_str`, defined above, contains all of the `expected_values` values.
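   The round-trip pattern the suggestion relies on can be sketched with the stdlib `json` module standing in for `VariantVal.parseJson`/`toJson`/`toPython` (the `expected_values` triples below are illustrative, not the ones defined in `test_types.py`):

   ```python
   import json

   # Illustrative (key, json_str, obj) triples in the same shape as the
   # test's expected_values; the real list lives in test_types.py.
   expected_values = [
       ("int", "1", 1),
       ("str", '"hello"', "hello"),
       ("list", "[1,2,3]", [1, 2, 3]),
   ]

   for key, json_str, obj in expected_values:
       # Parse then re-serialize: the output should equal the input string
       # (compact separators keep the stdlib output byte-identical here).
       round_tripped = json.dumps(json.loads(json_str), separators=(",", ":"))
       assert round_tripped == json_str
       # Parsing should also yield the expected Python object.
       assert json.loads(json_str) == obj
   ```

   The extra `value`/`metadata` comparison in the suggestion goes one step further: it checks not just JSON-level equivalence but byte-for-byte equality of the Variant binary produced by Python against the one produced by Spark's Scala parse_json.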



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

