morrySnow commented on code in PR #1909:
URL: https://github.com/apache/doris-website/pull/1909#discussion_r1926397529
##########
docs/sql-manual/sql-statements/data-modification/load-and-export/OUTFILE.md:
##########
@@ -26,124 +26,125 @@ under the License.
-->
+
## Description
This statement is used to export query results to a file using the `SELECT
INTO OUTFILE` command. Currently, it supports exporting to remote storage, such
as HDFS, S3, BOS, COS (Tencent Cloud), through the Broker process, S3 protocol,
or HDFS protocol.
-
-#### grammar:
+## Syntax
```sql
-query_stmt
-INTO OUTFILE "file_path"
-[format_as]
-[properties]
+<query_stmt>
+INTO OUTFILE "<file_path>"
+[<format_as>]
+[<properties>]
```
-#### illustrate:
+## Required Parameters
-1. file_path
+**1. `<query_stmt>`**
- file_path points to the path where the file is stored and the file prefix.
Such as `hdfs://path/to/my_file_`.
+ The query statement must be a valid SQL statement. Please refer to the
[query statement documentation](../../data-query/SELECT.md).
- ```
- The final filename will consist of `my_file_`, the file number and the file
format suffix. The file serial number starts from 0, and the number is the
number of files to be divided. Such as:
-
- my_file_abcdefg_0.csv
- my_file_abcdefg_1.csv
- my_file_abcdegf_2.csv
- ```
+**2. `<file_path>`**
- You can also omit the file prefix and specify only the file directory, such
as: `hdfs://path/to/`
+ file_path points to the path where the file is stored and the file prefix.
Such as `hdfs://path/to/my_file_`.
-2. format_as
+ The final filename will consist of `my_file_`, the file number and the file
format suffix. The file serial number starts from 0, and the number is the
number of files to be divided. Such as:
+ - my_file_abcdefg_0.csv
+ - my_file_abcdefg_1.csv
+ - my_file_abcdegf_2.csv
- ```
- FORMAT AS CSV
- ```
+ You can also omit the file prefix and specify only the file directory, such
as: `hdfs://path/to/`
+
+## Optional Parameters
+
+**1. `<format_as>`**
+
+```sql
+FORMAT AS CSV
+```
Specifies the export format. Supported formats include CSV, PARQUET,
CSV_WITH_NAMES, CSV_WITH_NAMES_AND_TYPES and ORC. Default is CSV.
Review Comment:
Use an unordered list to enumerate all the supported formats.
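
A possible rendering of that unordered list, using only the formats already named in the paragraph under review:

```markdown
Supported export formats:

- CSV (default)
- CSV_WITH_NAMES
- CSV_WITH_NAMES_AND_TYPES
- PARQUET
- ORC
```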
##########
docs/sql-manual/sql-statements/data-modification/load-and-export/OUTFILE.md:
##########
@@ -26,124 +26,125 @@ under the License.
-->
+
## Description
This statement is used to export query results to a file using the `SELECT
INTO OUTFILE` command. Currently, it supports exporting to remote storage, such
as HDFS, S3, BOS, COS (Tencent Cloud), through the Broker process, S3 protocol,
or HDFS protocol.
-
-#### grammar:
+## Syntax
```sql
-query_stmt
-INTO OUTFILE "file_path"
-[format_as]
-[properties]
+<query_stmt>
+INTO OUTFILE "<file_path>"
+[<format_as>]
Review Comment:
```suggestion
[ FORMAT AS <format> ]
```
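
For context, the suggested `[ FORMAT AS <format> ]` placeholder corresponds to a concrete statement roughly like the following; the table name and HDFS path here are hypothetical, and remote exports typically also need a `PROPERTIES` clause:

```sql
SELECT * FROM example_tbl
INTO OUTFILE "hdfs://path/to/result_"
FORMAT AS PARQUET;
```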
##########
docs/sql-manual/sql-statements/data-modification/load-and-export/SHOW-EXPORT.md:
##########
@@ -26,95 +26,94 @@ under the License.
+
## Description
-This statement is used to display the execution of the specified export task
+This statement is used to display the execution status of a specified export
job.
-grammar:
+## Syntax
```sql
SHOW EXPORT
-[FROM db_name]
+[FROM <db_name>]
[
WHERE
- [ID=your_job_id]
+ [ID = <job_id>]
[STATE = ["PENDING"|"EXPORTING"|"FINISHED"|"CANCELLED"]]
Review Comment:
```suggestion
[ STATE = { "PENDING" | "EXPORTING" | "FINISHED" | "CANCELLED" } ]
```
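
Applying the suggested brace notation, a concrete query would look like this (the database name is hypothetical):

```sql
SHOW EXPORT
FROM example_db
WHERE STATE = "FINISHED";
```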
##########
docs/sql-manual/sql-statements/data-modification/load-and-export/OUTFILE.md:
##########
@@ -26,124 +26,125 @@ under the License.
-->
+
## Description
This statement is used to export query results to a file using the `SELECT
INTO OUTFILE` command. Currently, it supports exporting to remote storage, such
as HDFS, S3, BOS, COS (Tencent Cloud), through the Broker process, S3 protocol,
or HDFS protocol.
-
-#### grammar:
+## Syntax
```sql
-query_stmt
-INTO OUTFILE "file_path"
-[format_as]
-[properties]
+<query_stmt>
+INTO OUTFILE "<file_path>"
+[<format_as>]
+[<properties>]
```
-#### illustrate:
+## Required Parameters
-1. file_path
+**1. `<query_stmt>`**
- file_path points to the path where the file is stored and the file prefix.
Such as `hdfs://path/to/my_file_`.
+ The query statement must be a valid SQL statement. Please refer to the
[query statement documentation](../../data-query/SELECT.md).
- ```
- The final filename will consist of `my_file_`, the file number and the file
format suffix. The file serial number starts from 0, and the number is the
number of files to be divided. Such as:
-
- my_file_abcdefg_0.csv
- my_file_abcdefg_1.csv
- my_file_abcdegf_2.csv
- ```
+**2. `<file_path>`**
- You can also omit the file prefix and specify only the file directory, such
as: `hdfs://path/to/`
+ file_path points to the path where the file is stored and the file prefix.
Such as `hdfs://path/to/my_file_`.
-2. format_as
+ The final filename will consist of `my_file_`, the file number and the file
format suffix. The file serial number starts from 0, and the number is the
number of files to be divided. Such as:
+ - my_file_abcdefg_0.csv
+ - my_file_abcdefg_1.csv
+ - my_file_abcdegf_2.csv
- ```
- FORMAT AS CSV
- ```
+ You can also omit the file prefix and specify only the file directory, such
as: `hdfs://path/to/`
+
+## Optional Parameters
+
+**1. `<format_as>`**
+
+```sql
+FORMAT AS CSV
+```
Specifies the export format. Supported formats include CSV, PARQUET,
CSV_WITH_NAMES, CSV_WITH_NAMES_AND_TYPES and ORC. Default is CSV.
> Note: PARQUET, CSV_WITH_NAMES, CSV_WITH_NAMES_AND_TYPES, and ORC are
supported starting in version 1.2 .
-3. properties
+**2. `<properties>`**
- Specify related properties. Currently exporting via the Broker process, S3
protocol, or HDFS protocol is supported.
+```sql
+[PROPERTIES ("key"="value", ...)]
Review Comment:
```suggestion
[ PROPERTIES ("<key>"="<value>" [, ... ]) ]
```
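
The suggested `[ PROPERTIES ("<key>"="<value>" [, ... ]) ]` form corresponds to a clause such as the sketch below; the key names are taken from the broker export example elsewhere in this page, not from the suggestion itself:

```sql
PROPERTIES
(
    "broker.name" = "my_broker",
    "column_separator" = ",",
    "line_delimiter" = "\n",
    "max_file_size" = "100MB"
)
```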
##########
docs/sql-manual/sql-statements/data-modification/load-and-export/OUTFILE.md:
##########
@@ -187,105 +188,136 @@ Parquet and ORC file formats have their own data types.
The export function of D
-## Example
-1. Use the broker method to export, and export the simple query results to the
file `hdfs://path/to/result.txt`. Specifies that the export format is CSV. Use
`my_broker` and set kerberos authentication information. Specify the column
separator as `,` and the row separator as `\n`.
+### Export data volume and export efficiency
- ```sql
- SELECT * FROM tbl
- INTO OUTFILE "hdfs://path/to/result_"
- FORMAT AS CSV
- PROPERTIES
- (
- "broker.name" = "my_broker",
- "broker.hadoop.security.authentication" = "kerberos",
- "broker.kerberos_principal" = "[email protected]",
- "broker.kerberos_keytab" = "/home/doris/my.keytab",
- "column_separator" = ",",
- "line_delimiter" = "\n",
- "max_file_size" = "100MB"
- );
- ```
+ This function essentially executes an SQL query command. The final result
is a single-threaded output. Therefore, the time-consuming of the entire export
includes the time-consuming of the query itself and the time-consuming of
writing the final result set. If the query is large, you need to set the
session variable `query_timeout` to appropriately extend the query timeout.
- If the final generated file is not larger than 100MB, it will be:
`result_0.csv`.
- If larger than 100MB, it may be `result_0.csv, result_1.csv, ...`.
+### Management of export files
-2. Export the simple query results to the file
`hdfs://path/to/result.parquet`. Specify the export format as PARQUET. Use
`my_broker` and set kerberos authentication information.
+ Doris does not manage exported files. Including the successful export, or
the remaining files after the export fails, all need to be handled by the user.
+
+### 导出到本地文件
Review Comment:
An English heading is needed here (the Chinese heading reads "export to a local file").
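
One possible English rendering of the flagged heading, matching the `###` level used in the surrounding diff:

```markdown
### Export to a Local File
```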
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]