anishshri-db commented on code in PR #50742:
URL: https://github.com/apache/spark/pull/50742#discussion_r2067620345
##########
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStore.scala:
##########
@@ -565,6 +582,29 @@ trait StateStoreProvider {
       version: Long,
       stateStoreCkptId: Option[String] = None): StateStore

+  /**
+   * Creates a writable store from an existing read-only store for the specified version.
+   *
+   * This method enables an important optimization pattern for stateful operations where
+   * the same state store needs to be accessed for both reading and writing within a task.
+   * Instead of opening two separate state store instances (which can cause contention issues),
+   * this method converts an existing read-only store to a writable store that can commit changes.
+   *
+   * This approach is particularly beneficial when:
+   * - A stateful operation needs to first read the existing state, then update it
+   * - The state store has locking mechanisms that prevent concurrent access
+   * - Multiple state store connections would cause unnecessary resource duplication
+   *
+   * @param readStore The existing read-only store instance to convert to a writable store
+   * @param version The version of the state store (must match the read store's version)
+   * @param uniqueId Optional unique identifier for checkpointing
+   * @return A writable StateStore instance that can be used to update and commit changes
+   */
+  def getWriteStoreFromReadStore(

Review Comment:
   nit: should we rename this to `upgradeReadStoreToWriteStore`?
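The read-then-upgrade pattern described in the Scaladoc above can be sketched with simplified stand-ins for the state store interfaces. The classes below (`ReadStore`, `WriteStore`, `Provider`) are illustrative assumptions, not Spark's real API: the key idea is that the write store reuses the read store's already-open underlying state instead of opening a second instance.

```scala
import scala.collection.mutable

// Simplified stand-ins for Spark's state store interfaces; the real
// StateStoreProvider API differs (key/value schemas, checkpoint ids,
// RocksDB-backed stores, etc.).
object StateStoreUpgradeDemo {

  class ReadStore(
      val version: Long,
      private[StateStoreUpgradeDemo] val data: mutable.Map[String, String]) {
    def get(key: String): Option[String] = data.get(key)
  }

  // The write store shares the read store's underlying data, so no
  // second store instance (and no extra lock acquisition) is needed.
  class WriteStore(version: Long, data: mutable.Map[String, String])
      extends ReadStore(version, data) {
    def put(key: String, value: String): Unit = data(key) = value
    def commit(): Long = version + 1 // pretend to persist; return the new version
  }

  class Provider {
    private val state = mutable.Map("count" -> "1")

    def getReadStore(version: Long): ReadStore = new ReadStore(version, state)

    // Upgrade the existing read store instead of opening a second instance.
    def getWriteStoreFromReadStore(readStore: ReadStore, version: Long): WriteStore = {
      require(readStore.version == version,
        "version must match the read store's version")
      new WriteStore(version, readStore.data)
    }
  }

  def main(args: Array[String]): Unit = {
    val provider  = new Provider
    val readStore = provider.getReadStore(0L)                // first pass: read state
    val previous  = readStore.get("count").map(_.toInt).getOrElse(0)
    val writeStore =
      provider.getWriteStoreFromReadStore(readStore, 0L)     // upgrade, don't reopen
    writeStore.put("count", (previous + 1).toString)         // second pass: update
    println(writeStore.commit())                             // prints 1
    println(writeStore.get("count").get)                     // prints 2
  }
}
```

In this sketch the version check mirrors the "@param version" requirement in the Scaladoc; in a real provider with per-store locking (e.g. RocksDB), reusing the open instance is what avoids the contention the comment mentions.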