This is an automated email from the ASF dual-hosted git repository.

jiayu pushed a commit to branch branch-1.6.1
in repository https://gitbox.apache.org/repos/asf/sedona.git

commit a66d4e7cbb4f0cab8527c9dc28ab9a14b34a65e1
Author: Kristin Cowalcijk <[email protected]>
AuthorDate: Sat Aug 31 10:57:48 2024 +0800

    [DOCS] Fix minor issues of Databricks setup guide based on my recent experience (#1568)
---
 docs/setup/databricks.md | 9 ++++++---
 1 file changed, 6 insertions(+), 3 deletions(-)

diff --git a/docs/setup/databricks.md b/docs/setup/databricks.md
index 431430e82..77d3eb518 100644
--- a/docs/setup/databricks.md
+++ b/docs/setup/databricks.md
@@ -48,6 +48,9 @@ In Spark 3.2, `org.apache.spark.sql.catalyst.expressions.Generator` class added
 * ST_MakeValid
 * ST_SubDivideExplode
 
+!!!note
+    The following steps use DBR including Apache Spark 3.4.x as an example. Please change the Spark version according to your DBR version.
+
 ### Download Sedona jars
 
 Download the Sedona jars to a DBFS location. You can do that manually via UI or from a notebook by executing this code in a cell:
@@ -102,9 +105,9 @@ spark.kryo.registrator org.apache.sedona.core.serde.SedonaKryoRegistrator
 
 From your cluster configuration (`Cluster` -> `Edit` -> `Configuration` -> `Advanced options` -> `Init Scripts`) add the newly created `Workspace` init script
 
-```
-/Workspace/sedona/sedona-init.sh
-```
+| Type | File path |
+|------|-----------|
+| Workspace | /Shared/sedona/sedona-init.sh |
 
 For enabling python support, from the Libraries tab install from PyPI
 
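The "Download Sedona jars" step touched by this patch refers to a notebook cell whose body lies outside the hunk. As a hedged illustration only, here is a sketch of how the Maven Central URL for the shaded Sedona jar might be assembled for the Spark 3.4.x / Sedona 1.6.1 combination named in the patch; the artifact coordinates and the Scala version are assumptions not taken from the diff and should be verified against Maven Central.

```python
# Sketch only: assemble the assumed Maven Central URL of the
# sedona-spark-shaded jar. Coordinates are inferred from the branch
# (1.6.1) and the "Spark 3.4.x" note added by this patch.
MAVEN = "https://repo1.maven.org/maven2"

def sedona_shaded_jar_url(spark: str = "3.4",
                          scala: str = "2.12",
                          sedona: str = "1.6.1") -> str:
    """Return the assumed download URL for the shaded Sedona jar."""
    artifact = f"sedona-spark-shaded-{spark}_{scala}"
    return (f"{MAVEN}/org/apache/sedona/{artifact}/{sedona}/"
            f"{artifact}-{sedona}.jar")

print(sedona_shaded_jar_url())
```

In a Databricks notebook, a URL like this would typically be fetched to the DBFS location the guide mentions (for example via a `%sh` cell), as the alternative to uploading the jar manually through the UI.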
