cloud-fan commented on code in PR #49946:
URL: https://github.com/apache/spark/pull/49946#discussion_r1957015538


##########
launcher/src/main/java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java:
##########
@@ -384,10 +390,9 @@ private List<String> buildPySparkShellCommand(Map<String, String> env) throws IO
     if (remoteStr != null) {
       env.put("SPARK_REMOTE", remoteStr);
       env.put("SPARK_CONNECT_MODE_ENABLED", "1");
-    } else if (conf.getOrDefault(
-        SparkLauncher.SPARK_API_MODE, "classic").toLowerCase(Locale.ROOT).equals("connect") &&
-        masterStr != null) {
-      env.put("SPARK_REMOTE", masterStr);
+    } else if (isRemote) {
+      // If `remoteStr` is not specified but isRemote is true, it means the API mode is connect.
+      env.put("MASTER", firstNonEmpty(masterStr, "local"));

Review Comment:
   I think this is the right change: for the Spark Connect API mode, we should respect `--master`. However, when I tried the same change in my PR, it broke the `pyspark` shell. Can you test the `pyspark` shell with `--remote` or with `--conf spark.api.mode=connect` to make sure your PR doesn't break it?
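   The manual check being requested could look roughly like the following sketch. It assumes a Spark build checkout with `./bin/pyspark` available; the `local` / `local[*]` values are illustrative choices, not part of the PR:

   ```shell
   # Start the PySpark shell in Spark Connect mode via --remote;
   # "local" asks the launcher to spin up a local Connect server.
   ./bin/pyspark --remote local

   # Start it in Connect mode via the API-mode conf instead,
   # with an explicit --master that this change should now respect.
   ./bin/pyspark --master "local[*]" --conf spark.api.mode=connect
   ```

   In both cases the shell should come up with `spark` bound to a Connect session rather than a classic `SparkSession`.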



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

