This is an automated email from the ASF dual-hosted git repository.

wyf pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-doris.git


The following commit(s) were added to refs/heads/master by this push:
     new 9d4e6d8  [Spark-Doris-Connector] fixed some spark-doris-connector doc typo
9d4e6d8 is described below

commit 9d4e6d8362f1c65190076865982015e00adb7030
Author: luzhijing <82810928+luzhij...@users.noreply.github.com>
AuthorDate: Tue Oct 26 18:23:53 2021 +0800

    [Spark-Doris-Connector] fixed some spark-doris-connector doc typo
---
 docs/zh-CN/extending-doris/spark-doris-connector.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/zh-CN/extending-doris/spark-doris-connector.md b/docs/zh-CN/extending-doris/spark-doris-connector.md
index 92d13d3..3ede5ac 100644
--- a/docs/zh-CN/extending-doris/spark-doris-connector.md
+++ b/docs/zh-CN/extending-doris/spark-doris-connector.md
@@ -52,7 +52,7 @@ The Spark Doris Connector supports reading data stored in Doris via Spark
 
 ```bash
 sh build.sh 3  ## spark 3.x version, defaults to 3.1.2
-sh build.sh 2  ## soark 2.x version, defaults to 2.3.4
+sh build.sh 2  ## spark 2.x version, defaults to 2.3.4
 ```
 
 After a successful build, the file `doris-spark-1.0.0-SNAPSHOT.jar` is generated in the `output/` directory. Copy this file into `Spark`'s `ClassPath` to use the `Spark-Doris-Connector`. For example, for `Spark` running in `Local` mode, put this file into the `jars/` folder; for `Spark` running in `Yarn` cluster mode, put it into the pre-deployment package.
@@ -221,4 +221,4 @@ kafkaSource.selectExpr("CAST(key AS STRING)", "CAST(value as STRING)")
 | TIME       | DataTypes.DoubleType             |
 | HLL        | Unsupported datatype             |
 
-* Note: in the Connector, `DATE` and `DATETIME` are mapped to `String`. Due to the processing logic of `Doris`'s underlying storage engine, the time range covered when the time types are used directly cannot meet the needs, so the corresponding human-readable time text is returned directly as a `String`.
\ No newline at end of file
+* Note: in the Connector, `DATE` and `DATETIME` are mapped to `String`. Due to the processing logic of `Doris`'s underlying storage engine, the time range covered when the time types are used directly cannot meet the needs, so the corresponding human-readable time text is returned directly as a `String`.
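For context, the deployment step that the patched doc describes (build, then copy the jar into Spark's ClassPath) can be sketched as follows. This is a minimal sketch, not part of the patch: the temp directories stand in for the connector's `output/` directory and a local Spark install, and the jar is `touch`ed rather than built so the sketch is self-contained.

```shell
#!/bin/sh
# Sketch of the deployment described in the doc: after `sh build.sh`,
# copy the produced jar into Spark's ClassPath (jars/ for Local mode).
# Stand-in paths (assumptions, not from the patch) keep this runnable.
OUTPUT_DIR=$(mktemp -d)    # stands in for the connector's output/ dir
SPARK_HOME=$(mktemp -d)    # stands in for a local Spark installation
mkdir -p "$SPARK_HOME/jars"
# In a real build this file is produced by `sh build.sh 2` or `sh build.sh 3`.
touch "$OUTPUT_DIR/doris-spark-1.0.0-SNAPSHOT.jar"
cp "$OUTPUT_DIR/doris-spark-1.0.0-SNAPSHOT.jar" "$SPARK_HOME/jars/"
ls "$SPARK_HOME/jars/"    # prints doris-spark-1.0.0-SNAPSHOT.jar
```

For a `Yarn` cluster deployment, the doc instead has you place the jar in the pre-deployment package rather than a local `jars/` folder.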
