dongjoon-hyun commented on code in PR #108:
URL: https://github.com/apache/spark-connect-swift/pull/108#discussion_r2072448724


##########
Sources/SparkConnect/SparkSession.swift:
##########
@@ -314,6 +314,27 @@ public actor SparkSession {
     await client.clearTags()
   }
 
+  /// Request to interrupt all currently running operations of this session.
+  /// - Returns: Sequence of operation IDs requested to be interrupted.
+  @discardableResult
+  public func interruptAll() async throws -> [String] {
+    return try await client.interruptAll()
+  }
+
+  /// Request to interrupt all currently running operations of this session with the given job tag.
+  /// - Returns: Sequence of operation IDs requested to be interrupted.
+  @discardableResult
+  public func interruptTag(_ tag: String) async throws -> [String] {
+    return try await client.interruptTag(tag)
+  }
+
+  /// Request to interrupt an operation of this session, given its operation ID.
+  /// - Returns: Sequence of operation IDs requested to be interrupted.
+  @discardableResult
+  public func interruptOperation(_ operationId: String) async throws -> [String] {

Review Comment:
   Yes, these Apache Spark 4.0.0 APIs return `Seq[String]` per the Spark-side definition.
   
   ```scala
     def interruptAll(): Seq[String]
     def interruptTag(tag: String): Seq[String]
     def interruptOperation(operationId: String): Seq[String]
   ```
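
   On the Swift side, a minimal usage sketch of the new APIs (not part of the PR: the helper name, the `"etl"` tag, and the pre-existing `spark` session are illustrative assumptions; only the methods added in this diff are called):

   ```swift
   import SparkConnect

   // Illustrative helper; `spark` is assumed to be an already-created SparkSession.
   func cancelTaggedWork(_ spark: SparkSession) async throws {
     // Ask the server to interrupt every running operation carrying the "etl" tag.
     let interrupted = try await spark.interruptTag("etl")
     print("Requested interruption of \(interrupted.count) operation(s): \(interrupted)")

     // Thanks to @discardableResult, the returned IDs can also be ignored.
     try await spark.interruptAll()
   }
   ```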



