This is an automated email from the ASF dual-hosted git repository.

kassiez pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new 2babf360221 [fix] make create table and load data procedure easily (#1603)
2babf360221 is described below

commit 2babf36022152ebebacdde38f243b4518bcc7612
Author: feiniaofeiafei <moail...@selectdb.com>
AuthorDate: Fri Dec 27 19:27:33 2024 +0800

    [fix] make create table and load data procedure easily (#1603)
    
    ## Versions
    
    - [x] dev
    - [x] 3.0
    - [x] 2.1
    - [ ] 2.0
    
    ## Languages
    
    - [x] Chinese
    - [x] English
    
    ## Docs Checklist
    
    - [ ] Checked by AI
    - [ ] Test Cases Built
---
 docs/query-data/window-function.md                    | 19 ++++++++++++++-----
 .../current/query-data/window-function.md             | 19 ++++++++++++++-----
 .../version-2.1/query-data/window-function.md         | 19 ++++++++++++++-----
 .../version-3.0/query-data/window-function.md         | 19 ++++++++++++++-----
 .../version-2.1/query-data/window-function.md         | 19 ++++++++++++++-----
 .../version-3.0/query-data/window-function.md         | 19 ++++++++++++++-----
 6 files changed, 84 insertions(+), 30 deletions(-)

diff --git a/docs/query-data/window-function.md b/docs/query-data/window-function.md
index d1b09006f26..5df4ab55ede 100644
--- a/docs/query-data/window-function.md
+++ b/docs/query-data/window-function.md
@@ -583,9 +583,12 @@ For more information on analytic functions, refer to the Oracle official documen
 
 ## Reference
 
-Create table and load data :
+The table creation statement used in the example is as follows:
 
 ```sql
+CREATE DATABASE IF NOT EXISTS doc_tpcds;
+USE doc_tpcds;
+
 CREATE TABLE IF NOT EXISTS item (
     i_item_sk bigint not null,
     i_item_id char(16) not null,
@@ -703,33 +706,39 @@ DISTRIBUTED BY HASH(ca_address_sk) BUCKETS 12
 PROPERTIES (
   "replication_num" = "1"
 );
+```
+
+Execute the following command on the terminal to download the data to the local computer and load the data into the table using the Stream Load method:
+
+```shell
+curl -L https://cdn.selectdb.com/static/doc_ddl_dir_d27a752a7b.tar -o - | tar -Jxf -
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: i_item_sk, i_item_id, i_rec_start_date, i_rec_end_date, 
i_item_desc, i_current_price, i_wholesale_cost, i_brand_id, i_brand, 
i_class_id, i_class, i_category_id, i_category, i_manufact_id, i_manufact, 
i_size, i_formulation, i_color, i_units, i_container, i_manager_id, 
i_product_name" \
--T "/path/to/data/item_1_10.dat" \
+-T "doc_ddl_dir/item_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/item/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: d_date_sk, d_date_id, d_date, d_month_seq, d_week_seq, 
d_quarter_seq, d_year, d_dow, d_moy, d_dom, d_qoy, d_fy_year, d_fy_quarter_seq, 
d_fy_week_seq, d_day_name, d_quarter_name, d_holiday, d_weekend, 
d_following_holiday, d_first_dom, d_last_dom, d_same_day_ly, d_same_day_lq, 
d_current_day, d_current_week, d_current_month, d_current_quarter, 
d_current_year" \
--T "/path/to/data/date_dim_1_10.dat" \
+-T "doc_ddl_dir/date_dim_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/date_dim/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: ss_sold_date_sk, ss_sold_time_sk, ss_item_sk, ss_customer_sk, 
ss_cdemo_sk, ss_hdemo_sk, ss_addr_sk, ss_store_sk, ss_promo_sk, 
ss_ticket_number, ss_quantity, ss_wholesale_cost, ss_list_price, 
ss_sales_price, ss_ext_discount_amt, ss_ext_sales_price, ss_ext_wholesale_cost, 
ss_ext_list_price, ss_ext_tax, ss_coupon_amt, ss_net_paid, ss_net_paid_inc_tax, 
ss_net_profit" \
--T "/path/to/data/store_sales.csv" \
+-T "doc_ddl_dir/store_sales.csv" \
 http://127.0.0.1:8030/api/doc_tpcds/store_sales/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "ca_address_sk, ca_address_id, ca_street_number, ca_street_name, 
ca_street_type, ca_suite_number, ca_city, ca_county, ca_state, ca_zip, 
ca_country, ca_gmt_offset, ca_location_type" \
--T "/path/to/data/customer_address_1_10.dat" \
+-T "doc_ddl_dir/customer_address_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/customer_address/_stream_load
 ```
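
An editorial side note on the new download step (not part of the commit): if the piped `curl ... | tar` form is inconvenient, the same archive can be fetched to disk first and extracted separately. This is a minimal sketch assuming the same CDN URL and a writable working directory; the file names come from the `-T` paths used in the doc.

```shell
# Illustrative two-step variant of the piped download above (not part of this commit).
# Fetch the archive to disk, then extract it into ./doc_ddl_dir.
curl -L -o doc_ddl_dir.tar https://cdn.selectdb.com/static/doc_ddl_dir_d27a752a7b.tar
tar -xf doc_ddl_dir.tar

# The extracted files referenced by the Stream Load commands in the doc:
ls doc_ddl_dir/item_1_10.dat doc_ddl_dir/date_dim_1_10.dat \
   doc_ddl_dir/store_sales.csv doc_ddl_dir/customer_address_1_10.dat
```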
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/query-data/window-function.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/query-data/window-function.md
index a74ab33daf4..175b7b472b1 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/query-data/window-function.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/query-data/window-function.md
@@ -575,9 +575,12 @@ FROM
 
 ## Appendix
 
-Create table and load data:
+The table creation statement used in the example is as follows:
 
 ```sql
+CREATE DATABASE IF NOT EXISTS doc_tpcds;
+USE doc_tpcds;
+
 CREATE TABLE IF NOT EXISTS item (
     i_item_sk bigint not null,
     i_item_id char(16) not null,
@@ -695,33 +698,39 @@ DISTRIBUTED BY HASH(ca_address_sk) BUCKETS 12
 PROPERTIES (
   "replication_num" = "1"
 );
+```
+
+Execute the following command on the terminal to download the data to the local computer and load the data into the table using the Stream Load method:
+
+```shell
+curl -L https://cdn.selectdb.com/static/doc_ddl_dir_d27a752a7b.tar -o - | tar -Jxf -
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: i_item_sk, i_item_id, i_rec_start_date, i_rec_end_date, 
i_item_desc, i_current_price, i_wholesale_cost, i_brand_id, i_brand, 
i_class_id, i_class, i_category_id, i_category, i_manufact_id, i_manufact, 
i_size, i_formulation, i_color, i_units, i_container, i_manager_id, 
i_product_name" \
--T "/path/to/data/item_1_10.dat" \
+-T "doc_ddl_dir/item_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/item/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: d_date_sk, d_date_id, d_date, d_month_seq, d_week_seq, 
d_quarter_seq, d_year, d_dow, d_moy, d_dom, d_qoy, d_fy_year, d_fy_quarter_seq, 
d_fy_week_seq, d_day_name, d_quarter_name, d_holiday, d_weekend, 
d_following_holiday, d_first_dom, d_last_dom, d_same_day_ly, d_same_day_lq, 
d_current_day, d_current_week, d_current_month, d_current_quarter, 
d_current_year" \
--T "/path/to/data/date_dim_1_10.dat" \
+-T "doc_ddl_dir/date_dim_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/date_dim/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: ss_sold_date_sk, ss_sold_time_sk, ss_item_sk, ss_customer_sk, 
ss_cdemo_sk, ss_hdemo_sk, ss_addr_sk, ss_store_sk, ss_promo_sk, 
ss_ticket_number, ss_quantity, ss_wholesale_cost, ss_list_price, 
ss_sales_price, ss_ext_discount_amt, ss_ext_sales_price, ss_ext_wholesale_cost, 
ss_ext_list_price, ss_ext_tax, ss_coupon_amt, ss_net_paid, ss_net_paid_inc_tax, 
ss_net_profit" \
--T "/path/to/data/store_sales.csv" \
+-T "doc_ddl_dir/store_sales.csv" \
 http://127.0.0.1:8030/api/doc_tpcds/store_sales/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "ca_address_sk, ca_address_id, ca_street_number, ca_street_name, 
ca_street_type, ca_suite_number, ca_city, ca_county, ca_state, ca_zip, 
ca_country, ca_gmt_offset, ca_location_type" \
--T "/path/to/data/customer_address_1_10.dat" \
+-T "doc_ddl_dir/customer_address_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/customer_address/_stream_load
 ```
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/query-data/window-function.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/query-data/window-function.md
index a74ab33daf4..175b7b472b1 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/query-data/window-function.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/query-data/window-function.md
@@ -575,9 +575,12 @@ FROM
 
 ## Appendix
 
-Create table and load data:
+The table creation statement used in the example is as follows:
 
 ```sql
+CREATE DATABASE IF NOT EXISTS doc_tpcds;
+USE doc_tpcds;
+
 CREATE TABLE IF NOT EXISTS item (
     i_item_sk bigint not null,
     i_item_id char(16) not null,
@@ -695,33 +698,39 @@ DISTRIBUTED BY HASH(ca_address_sk) BUCKETS 12
 PROPERTIES (
   "replication_num" = "1"
 );
+```
+
+Execute the following command on the terminal to download the data to the local computer and load the data into the table using the Stream Load method:
+
+```shell
+curl -L https://cdn.selectdb.com/static/doc_ddl_dir_d27a752a7b.tar -o - | tar -Jxf -
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: i_item_sk, i_item_id, i_rec_start_date, i_rec_end_date, 
i_item_desc, i_current_price, i_wholesale_cost, i_brand_id, i_brand, 
i_class_id, i_class, i_category_id, i_category, i_manufact_id, i_manufact, 
i_size, i_formulation, i_color, i_units, i_container, i_manager_id, 
i_product_name" \
--T "/path/to/data/item_1_10.dat" \
+-T "doc_ddl_dir/item_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/item/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: d_date_sk, d_date_id, d_date, d_month_seq, d_week_seq, 
d_quarter_seq, d_year, d_dow, d_moy, d_dom, d_qoy, d_fy_year, d_fy_quarter_seq, 
d_fy_week_seq, d_day_name, d_quarter_name, d_holiday, d_weekend, 
d_following_holiday, d_first_dom, d_last_dom, d_same_day_ly, d_same_day_lq, 
d_current_day, d_current_week, d_current_month, d_current_quarter, 
d_current_year" \
--T "/path/to/data/date_dim_1_10.dat" \
+-T "doc_ddl_dir/date_dim_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/date_dim/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: ss_sold_date_sk, ss_sold_time_sk, ss_item_sk, ss_customer_sk, 
ss_cdemo_sk, ss_hdemo_sk, ss_addr_sk, ss_store_sk, ss_promo_sk, 
ss_ticket_number, ss_quantity, ss_wholesale_cost, ss_list_price, 
ss_sales_price, ss_ext_discount_amt, ss_ext_sales_price, ss_ext_wholesale_cost, 
ss_ext_list_price, ss_ext_tax, ss_coupon_amt, ss_net_paid, ss_net_paid_inc_tax, 
ss_net_profit" \
--T "/path/to/data/store_sales.csv" \
+-T "doc_ddl_dir/store_sales.csv" \
 http://127.0.0.1:8030/api/doc_tpcds/store_sales/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "ca_address_sk, ca_address_id, ca_street_number, ca_street_name, 
ca_street_type, ca_suite_number, ca_city, ca_county, ca_state, ca_zip, 
ca_country, ca_gmt_offset, ca_location_type" \
--T "/path/to/data/customer_address_1_10.dat" \
+-T "doc_ddl_dir/customer_address_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/customer_address/_stream_load
 ```
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/query-data/window-function.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/query-data/window-function.md
index a74ab33daf4..175b7b472b1 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/query-data/window-function.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/query-data/window-function.md
@@ -575,9 +575,12 @@ FROM
 
 ## Appendix
 
-Create table and load data:
+The table creation statement used in the example is as follows:
 
 ```sql
+CREATE DATABASE IF NOT EXISTS doc_tpcds;
+USE doc_tpcds;
+
 CREATE TABLE IF NOT EXISTS item (
     i_item_sk bigint not null,
     i_item_id char(16) not null,
@@ -695,33 +698,39 @@ DISTRIBUTED BY HASH(ca_address_sk) BUCKETS 12
 PROPERTIES (
   "replication_num" = "1"
 );
+```
+
+Execute the following command on the terminal to download the data to the local computer and load the data into the table using the Stream Load method:
+
+```shell
+curl -L https://cdn.selectdb.com/static/doc_ddl_dir_d27a752a7b.tar -o - | tar -Jxf -
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: i_item_sk, i_item_id, i_rec_start_date, i_rec_end_date, 
i_item_desc, i_current_price, i_wholesale_cost, i_brand_id, i_brand, 
i_class_id, i_class, i_category_id, i_category, i_manufact_id, i_manufact, 
i_size, i_formulation, i_color, i_units, i_container, i_manager_id, 
i_product_name" \
--T "/path/to/data/item_1_10.dat" \
+-T "doc_ddl_dir/item_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/item/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: d_date_sk, d_date_id, d_date, d_month_seq, d_week_seq, 
d_quarter_seq, d_year, d_dow, d_moy, d_dom, d_qoy, d_fy_year, d_fy_quarter_seq, 
d_fy_week_seq, d_day_name, d_quarter_name, d_holiday, d_weekend, 
d_following_holiday, d_first_dom, d_last_dom, d_same_day_ly, d_same_day_lq, 
d_current_day, d_current_week, d_current_month, d_current_quarter, 
d_current_year" \
--T "/path/to/data/date_dim_1_10.dat" \
+-T "doc_ddl_dir/date_dim_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/date_dim/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: ss_sold_date_sk, ss_sold_time_sk, ss_item_sk, ss_customer_sk, 
ss_cdemo_sk, ss_hdemo_sk, ss_addr_sk, ss_store_sk, ss_promo_sk, 
ss_ticket_number, ss_quantity, ss_wholesale_cost, ss_list_price, 
ss_sales_price, ss_ext_discount_amt, ss_ext_sales_price, ss_ext_wholesale_cost, 
ss_ext_list_price, ss_ext_tax, ss_coupon_amt, ss_net_paid, ss_net_paid_inc_tax, 
ss_net_profit" \
--T "/path/to/data/store_sales.csv" \
+-T "doc_ddl_dir/store_sales.csv" \
 http://127.0.0.1:8030/api/doc_tpcds/store_sales/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "ca_address_sk, ca_address_id, ca_street_number, ca_street_name, 
ca_street_type, ca_suite_number, ca_city, ca_county, ca_state, ca_zip, 
ca_country, ca_gmt_offset, ca_location_type" \
--T "/path/to/data/customer_address_1_10.dat" \
+-T "doc_ddl_dir/customer_address_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/customer_address/_stream_load
 ```
 
diff --git a/versioned_docs/version-2.1/query-data/window-function.md b/versioned_docs/version-2.1/query-data/window-function.md
index 89e6d63e160..a071c2f3d11 100644
--- a/versioned_docs/version-2.1/query-data/window-function.md
+++ b/versioned_docs/version-2.1/query-data/window-function.md
@@ -582,9 +582,12 @@ This results in a consistent query output:
 
 ## Reference
 
-Create table and load data :
+The table creation statement used in the example is as follows:
 
 ```sql
+CREATE DATABASE IF NOT EXISTS doc_tpcds;
+USE doc_tpcds;
+
 CREATE TABLE IF NOT EXISTS item (
     i_item_sk bigint not null,
     i_item_id char(16) not null,
@@ -702,33 +705,39 @@ DISTRIBUTED BY HASH(ca_address_sk) BUCKETS 12
 PROPERTIES (
   "replication_num" = "1"
 );
+```
+
+Execute the following command on the terminal to download the data to the local computer and load the data into the table using the Stream Load method:
+
+```shell
+curl -L https://cdn.selectdb.com/static/doc_ddl_dir_d27a752a7b.tar -o - | tar -Jxf -
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: i_item_sk, i_item_id, i_rec_start_date, i_rec_end_date, 
i_item_desc, i_current_price, i_wholesale_cost, i_brand_id, i_brand, 
i_class_id, i_class, i_category_id, i_category, i_manufact_id, i_manufact, 
i_size, i_formulation, i_color, i_units, i_container, i_manager_id, 
i_product_name" \
--T "/path/to/data/item_1_10.dat" \
+-T "doc_ddl_dir/item_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/item/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: d_date_sk, d_date_id, d_date, d_month_seq, d_week_seq, 
d_quarter_seq, d_year, d_dow, d_moy, d_dom, d_qoy, d_fy_year, d_fy_quarter_seq, 
d_fy_week_seq, d_day_name, d_quarter_name, d_holiday, d_weekend, 
d_following_holiday, d_first_dom, d_last_dom, d_same_day_ly, d_same_day_lq, 
d_current_day, d_current_week, d_current_month, d_current_quarter, 
d_current_year" \
--T "/path/to/data/date_dim_1_10.dat" \
+-T "doc_ddl_dir/date_dim_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/date_dim/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: ss_sold_date_sk, ss_sold_time_sk, ss_item_sk, ss_customer_sk, 
ss_cdemo_sk, ss_hdemo_sk, ss_addr_sk, ss_store_sk, ss_promo_sk, 
ss_ticket_number, ss_quantity, ss_wholesale_cost, ss_list_price, 
ss_sales_price, ss_ext_discount_amt, ss_ext_sales_price, ss_ext_wholesale_cost, 
ss_ext_list_price, ss_ext_tax, ss_coupon_amt, ss_net_paid, ss_net_paid_inc_tax, 
ss_net_profit" \
--T "/path/to/data/store_sales.csv" \
+-T "doc_ddl_dir/store_sales.csv" \
 http://127.0.0.1:8030/api/doc_tpcds/store_sales/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "ca_address_sk, ca_address_id, ca_street_number, ca_street_name, 
ca_street_type, ca_suite_number, ca_city, ca_county, ca_state, ca_zip, 
ca_country, ca_gmt_offset, ca_location_type" \
--T "/path/to/data/customer_address_1_10.dat" \
+-T "doc_ddl_dir/customer_address_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/customer_address/_stream_load
 ```
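
Another editorial aside (not part of the commit): each curl call above prints the FE's Stream Load response, a JSON document whose `Status` field reports whether the load succeeded. A minimal sketch of checking that from a script, assuming `jq` is installed and reusing the `item` load as the example:

```shell
# Illustrative wrapper, not part of this commit: run one Stream Load and stop
# unless the FE reports Status == "Success". Assumes jq is available locally.
resp=$(curl -s --location-trusted \
  -u "root:" \
  -H "column_separator:|" \
  -H "columns: i_item_sk, i_item_id, i_rec_start_date, i_rec_end_date, i_item_desc, i_current_price, i_wholesale_cost, i_brand_id, i_brand, i_class_id, i_class, i_category_id, i_category, i_manufact_id, i_manufact, i_size, i_formulation, i_color, i_units, i_container, i_manager_id, i_product_name" \
  -T "doc_ddl_dir/item_1_10.dat" \
  http://127.0.0.1:8030/api/doc_tpcds/item/_stream_load)

echo "$resp" | jq -e '.Status == "Success"' > /dev/null || {
  echo "item load failed: $resp" >&2
  exit 1
}
echo "item loaded $(echo "$resp" | jq -r '.NumberLoadedRows') rows"
```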
 
diff --git a/versioned_docs/version-3.0/query-data/window-function.md b/versioned_docs/version-3.0/query-data/window-function.md
index d1b09006f26..5df4ab55ede 100644
--- a/versioned_docs/version-3.0/query-data/window-function.md
+++ b/versioned_docs/version-3.0/query-data/window-function.md
@@ -583,9 +583,12 @@ For more information on analytic functions, refer to the Oracle official documen
 
 ## Reference
 
-Create table and load data :
+The table creation statement used in the example is as follows:
 
 ```sql
+CREATE DATABASE IF NOT EXISTS doc_tpcds;
+USE doc_tpcds;
+
 CREATE TABLE IF NOT EXISTS item (
     i_item_sk bigint not null,
     i_item_id char(16) not null,
@@ -703,33 +706,39 @@ DISTRIBUTED BY HASH(ca_address_sk) BUCKETS 12
 PROPERTIES (
   "replication_num" = "1"
 );
+```
+
+Execute the following command on the terminal to download the data to the local computer and load the data into the table using the Stream Load method:
+
+```shell
+curl -L https://cdn.selectdb.com/static/doc_ddl_dir_d27a752a7b.tar -o - | tar -Jxf -
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: i_item_sk, i_item_id, i_rec_start_date, i_rec_end_date, 
i_item_desc, i_current_price, i_wholesale_cost, i_brand_id, i_brand, 
i_class_id, i_class, i_category_id, i_category, i_manufact_id, i_manufact, 
i_size, i_formulation, i_color, i_units, i_container, i_manager_id, 
i_product_name" \
--T "/path/to/data/item_1_10.dat" \
+-T "doc_ddl_dir/item_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/item/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: d_date_sk, d_date_id, d_date, d_month_seq, d_week_seq, 
d_quarter_seq, d_year, d_dow, d_moy, d_dom, d_qoy, d_fy_year, d_fy_quarter_seq, 
d_fy_week_seq, d_day_name, d_quarter_name, d_holiday, d_weekend, 
d_following_holiday, d_first_dom, d_last_dom, d_same_day_ly, d_same_day_lq, 
d_current_day, d_current_week, d_current_month, d_current_quarter, 
d_current_year" \
--T "/path/to/data/date_dim_1_10.dat" \
+-T "doc_ddl_dir/date_dim_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/date_dim/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "columns: ss_sold_date_sk, ss_sold_time_sk, ss_item_sk, ss_customer_sk, 
ss_cdemo_sk, ss_hdemo_sk, ss_addr_sk, ss_store_sk, ss_promo_sk, 
ss_ticket_number, ss_quantity, ss_wholesale_cost, ss_list_price, 
ss_sales_price, ss_ext_discount_amt, ss_ext_sales_price, ss_ext_wholesale_cost, 
ss_ext_list_price, ss_ext_tax, ss_coupon_amt, ss_net_paid, ss_net_paid_inc_tax, 
ss_net_profit" \
--T "/path/to/data/store_sales.csv" \
+-T "doc_ddl_dir/store_sales.csv" \
 http://127.0.0.1:8030/api/doc_tpcds/store_sales/_stream_load
 
 curl --location-trusted \
 -u "root:" \
 -H "column_separator:|" \
 -H "ca_address_sk, ca_address_id, ca_street_number, ca_street_name, 
ca_street_type, ca_suite_number, ca_city, ca_county, ca_state, ca_zip, 
ca_country, ca_gmt_offset, ca_location_type" \
--T "/path/to/data/customer_address_1_10.dat" \
+-T "doc_ddl_dir/customer_address_1_10.dat" \
 http://127.0.0.1:8030/api/doc_tpcds/customer_address/_stream_load
 ```
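
A last editorial aside (not part of the commit): after the four Stream Load calls complete, row counts in `doc_tpcds` give a quick end-to-end check. A minimal sketch, assuming the FE's MySQL-protocol port is the default 9030 and the root user has an empty password, as in the curl examples above:

```shell
# Illustrative only, not part of this commit: confirm each table in doc_tpcds
# received data once the Stream Load jobs finish. Assumes the default query
# port 9030 on the same FE used for the HTTP loads.
mysql -h 127.0.0.1 -P 9030 -u root -e "
  SELECT 'item'              AS table_name, COUNT(*) AS row_count FROM doc_tpcds.item
  UNION ALL SELECT 'date_dim',         COUNT(*) FROM doc_tpcds.date_dim
  UNION ALL SELECT 'store_sales',      COUNT(*) FROM doc_tpcds.store_sales
  UNION ALL SELECT 'customer_address', COUNT(*) FROM doc_tpcds.customer_address;"
```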
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org
