joan38 commented on code in PR #50474:
URL: https://github.com/apache/spark/pull/50474#discussion_r2024489759
##########
core/src/main/scala/org/apache/spark/storage/ShuffleBlockFetcherIterator.scala:
##########
@@ -1014,39 +1014,44 @@ final class ShuffleBlockFetcherIterator(
           // a SuccessFetchResult or a FailureFetchResult.
           result = null

-        case PushMergedLocalMetaFetchResult(
-            shuffleId, shuffleMergeId, reduceId, bitmaps, localDirs) =>
-          // Fetch push-merged-local shuffle block data as multiple shuffle chunks
-          val shuffleBlockId = ShuffleMergedBlockId(shuffleId, shuffleMergeId, reduceId)
-          try {
-            val bufs: Seq[ManagedBuffer] = blockManager.getLocalMergedBlockData(shuffleBlockId,
-              localDirs)
-            // Since the request for local block meta completed successfully, numBlocksToFetch
-            // is decremented.
-            numBlocksToFetch -= 1
-            // Update total number of blocks to fetch, reflecting the multiple local shuffle
-            // chunks.
-            numBlocksToFetch += bufs.size
-            bufs.zipWithIndex.foreach { case (buf, chunkId) =>
-              buf.retain()
-              val shuffleChunkId = ShuffleBlockChunkId(shuffleId, shuffleMergeId, reduceId,
-                chunkId)
-              pushBasedFetchHelper.addChunk(shuffleChunkId, bitmaps(chunkId))
-              results.put(SuccessFetchResult(shuffleChunkId, SHUFFLE_PUSH_MAP_ID,
-                pushBasedFetchHelper.localShuffleMergerBlockMgrId, buf.size(), buf,
-                isNetworkReqDone = false))
-            }
-          } catch {
-            case e: Exception =>
-              // If we see an exception with reading push-merged-local index file, we fallback
-              // to fetch the original blocks. We do not report block fetch failure
-              // and will continue with the remaining local block read.
-              logWarning("Error occurred while reading push-merged-local index, " +
-                "prepare to fetch the original blocks", e)
-              pushBasedFetchHelper.initiateFallbackFetchForPushMergedBlock(
-                shuffleBlockId, pushBasedFetchHelper.localShuffleMergerBlockMgrId)
+        case PushMergedLocalMetaFetchResult(

Review Comment:
   @LuciferYang before involving more people, I played a bit more with the compiler options and came to a few more conclusions:
   * With `-Xsource:3` and `-Xsource-features:v2.13.15,-case-companion-function` I am able to exclude the "feature" where case-class companion objects no longer implement `FunctionN`. That way we keep binary compatibility, and we can take that break later, when we eventually cross-compile for Scala 3 (see the build sketch after this comment).
   * There is another change as well: the constructor modifiers of a case class are now applied to the synthesized `apply` (and to `tupled` and `curried`). This means a case class like `org.apache.spark.paths.SparkPath`:
     ```scala
     case class SparkPath private (private val underlying: String)
     ```
     will no longer have any `def apply` in its companion object, whereas until now it had one that was publicly accessible. Was this intentional? I don't think so.

   I pushed a commit where I went ahead and added the MiMa exceptions (please review them; an illustrative sketch follows below). However, if you tell me "no, we cannot break binary compatibility here, and people need to keep creating `SparkPath` via the `apply` function", then another solution is to make the constructor public. A third solution is to keep the constructor private and write our own public `def apply`, so that we don't break binary compatibility (also sketched below). Let me know.
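For concreteness, here is a minimal sbt sketch of the flags discussed in the first bullet. The project layout and Scala version are illustrative, not Spark's actual build definition:

```scala
// build.sbt -- illustrative sketch only; Spark's real build wires these
// flags through its own project definitions.
lazy val core = project
  .settings(
    scalaVersion := "2.13.15",
    scalacOptions ++= Seq(
      // Adopt Scala 3 source semantics where 2.13 supports them.
      "-Xsource:3",
      // Enable the 2.13.15 feature set, but withdraw the change that stops
      // case-class companions from extending FunctionN, preserving binary
      // compatibility until we cross-compile for Scala 3.
      "-Xsource-features:v2.13.15,-case-companion-function"
    )
  )
```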
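And a sketch of the third option: keep the constructor private but define the public `apply` by hand. Under the Scala 3 rules that `-Xsource:3` adopts, an explicitly defined `apply` takes the place of the synthesized one, so the old public signature stays on the companion. This uses a simplified stand-in for `SparkPath`; the real class has more members:

```scala
package org.apache.spark.paths

// Simplified stand-in for the real SparkPath; illustrative only.
case class SparkPath private (private val underlying: String)

object SparkPath {
  // Hand-written public apply. With -Xsource:3 the synthesized apply would
  // inherit the constructor's `private`, so defining it explicitly keeps
  // the method public and binary compatible with previous releases.
  def apply(underlying: String): SparkPath = new SparkPath(underlying)
}
```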
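If we instead accept the break, the MiMa exceptions would look roughly like the following. This is an illustrative sketch with an assumed sequence name; the real rules belong in Spark's `MimaExcludes` and must match what MiMa actually reports:

```scala
// project/MimaExcludes.scala -- illustrative only.
import com.typesafe.tools.mima.core._

val sparkPathExcludes = Seq(
  // apply/tupled/curried on SparkPath's companion pick up the private
  // constructor modifier, so the old public methods disappear.
  ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.paths.SparkPath.apply"),
  ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.paths.SparkPath.tupled"),
  ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.paths.SparkPath.curried")
)
```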
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org