mihailom-db commented on code in PR #50967: URL: https://github.com/apache/spark/pull/50967#discussion_r2104012389
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/higherOrderFunctions.scala:
##########
@@ -1124,23 +1124,40 @@ case class MapZipWith(left: Expression, right: Expression, function: Expression)

   private def getKeysWithIndexesFast(keys1: ArrayData, keys2: ArrayData) = {
     val hashMap = new mutable.LinkedHashMap[Any, Array[Option[Int]]]
+    val NaNTracker = Array[Option[Int]](None, None)
     for ((z, array) <- Array((0, keys1), (1, keys2))) {
       var i = 0
       while (i < array.numElements()) {
         val key = array.get(i, keyType)
-        hashMap.get(key) match {
-          case Some(indexes) =>
-            if (indexes(z).isEmpty) {
-              indexes(z) = Some(i)
+        keyType match {
+          case FloatType if key.asInstanceOf[Float].isNaN =>
+            NaNTracker(z) = Some(i)
+          case DoubleType if key.asInstanceOf[Double].isNaN =>
+            NaNTracker(z) = Some(i)
+          case _ =>
+            hashMap.get(key) match {
+              case Some(indexes) =>
+                if (indexes(z).isEmpty) {
+                  indexes(z) = Some(i)
+                }
+              case None =>
+                val indexes = Array[Option[Int]](None, None)
+                indexes(z) = Some(i)
+                hashMap.put(key, indexes)
             }
-          case None =>
-            val indexes = Array[Option[Int]](None, None)
-            indexes(z) = Some(i)
-            hashMap.put(key, indexes)

Review Comment:
   The problem is how we construct the indexes: we first insert the value from the first map, and only then check whether the value is already in the map. The hash map does not use proper `equals` semantics for NaN keys, so we run into the same problem.

   <img width="1090" alt="Screenshot 2025-05-23 at 09 40 50" src="https://github.com/user-attachments/assets/a29c9f3b-849e-4ea6-945c-e10c5f316f77" />

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
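The `equals` problem the review comment describes can be reproduced outside Spark. A minimal sketch, assuming Scala 2.13 collections: Scala's `==` on boxed floating-point keys compares numerically, and `NaN == NaN` is `false`, so a `NaN` key stored in a `mutable.LinkedHashMap[Any, _]` can never be looked up again, and every further `NaN` insert creates a fresh entry. (`NaNKeyDemo` is a hypothetical name for illustration, not part of the PR.)

```scala
import scala.collection.mutable

object NaNKeyDemo {
  def main(args: Array[String]): Unit = {
    val hashMap = new mutable.LinkedHashMap[Any, Int]

    // First insert of a NaN key succeeds like any other key.
    hashMap.put(Double.NaN, 0)

    // Lookup misses: the map compares keys with Scala's ==, which for
    // boxed doubles is a numeric comparison, and NaN != NaN.
    println(hashMap.get(Double.NaN)) // None

    // A second put cannot find the existing entry either, so the map
    // silently accumulates one entry per NaN occurrence.
    hashMap.put(Double.NaN, 1)
    println(hashMap.size) // 2

    // Tracking NaN positions outside the map, as the patched code does
    // with its NaNTracker array, sidesteps the broken equality entirely.
  }
}
```

This is why checking `hashMap.get(key)` before/after the first insert cannot fix the issue on its own: the lookup itself is what fails for NaN.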
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org