cloud-fan commented on code in PR #49107:
URL: https://github.com/apache/spark/pull/49107#discussion_r1942368959


##########
docs/app-dev-spark-connect.md:
##########
@@ -58,6 +58,59 @@ to the client via the blue box as part of the Spark Connect API. The client uses
 alongside PySpark or the Spark Scala client, making it easy for Spark client applications to work
 with the custom logic/library. 
 
+## Spark API Mode: Spark Client and Spark Classic
+
+Spark provides the API mode configuration (`spark.api.mode`) that lets Spark applications
+seamlessly use Spark Connect. Based on the `spark.api.mode` configuration, the application
+is executed in either Spark Classic or Spark Connect mode. See the example below:
+
+{% highlight python %}
+from pyspark.sql import SparkSession
+
+SparkSession.builder.config("spark.api.mode", "connect").master("...").getOrCreate()
+{% endhighlight %}
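
+One way to sanity-check which mode is active (a minimal sketch; the `local[*]` master URL is
+illustrative) is to inspect the class of the returned session:

+{% highlight python %}
+from pyspark.sql import SparkSession

+spark = SparkSession.builder.config("spark.api.mode", "connect").master("local[*]").getOrCreate()

+# Under Spark Connect, getOrCreate() returns the Connect session implementation
+# rather than the classic pyspark.sql.SparkSession.
+print(type(spark))  # <class 'pyspark.sql.connect.session.SparkSession'>
+{% endhighlight %}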
+
+
+This configuration can also be set for both Scala and PySpark applications via Spark submission:
+
+{% highlight bash %}
+spark-submit --master "..." --conf spark.api.mode=connect
+{% endhighlight %}
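
+A complete submission also passes the application itself; for example (a sketch; the cluster
+URL and the `app.py` file name are illustrative):

+{% highlight bash %}
+spark-submit --master "spark://host:7077" --conf spark.api.mode=connect app.py
+{% endhighlight %}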
+
+In addition, Spark Connect provides a convenient special case for local testing: `spark.remote`
+can be set to `local[...]` or `local-cluster[...]`. In this case, Spark creates a locally running
+Spark Connect server and provides the user with a Spark Connect session. This is similar to
+setting `--conf spark.api.mode=connect` and `--master ...` together. However, `spark.remote`
+and `--remote` only support `local*` URLs, whereas `--conf spark.api.mode=connect` and
+`--master ...` allow other Spark cluster URLs such as `spark://` for better compatibility.
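
+For example, a local test session can be created as follows (a minimal sketch; the `local[4]`
+URL is illustrative):

+{% highlight python %}
+from pyspark.sql import SparkSession

+# Starts a local Spark Connect server and returns a Spark Connect session against it.
+spark = SparkSession.builder.remote("local[4]").getOrCreate()
+{% endhighlight %}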
+
+Spark provides the API mode, `spark.api.mode` configuration, enabling Spark Classic
+applications to seamlessly switch to Spark Connect. Depending on the value of `spark.api.mode`,
+the application can run in either Spark Classic or Spark Connect mode. Here is an example:
+
+{% highlight python %}
+from pyspark.sql import SparkSession
+
+SparkSession.builder.config("spark.api.mode", "connect").master("...").getOrCreate()
+{% endhighlight %}
+
+You can also apply this configuration to both Scala and PySpark applications when submitting them:
+
+{% highlight bash %}
+spark-submit --master "..." --conf spark.api.mode=connect

Review Comment:
   some contents seem to be duplicated.


