worker24h commented on a change in pull request #3553:
URL: https://github.com/apache/incubator-doris/pull/3553#discussion_r427115410
##########
File path: docs/zh-CN/sql-reference/sql-statements/Data Manipulation/STREAM LOAD.md
##########
@@ -67,13 +67,28 @@ under the License.
For example, to load into partitions p1 and p2: -H "partitions: p1, p2"
timeout: Specifies the timeout for this load, in seconds. The default is 600 seconds; the configurable range is 1 second to 259200 seconds.
-
+
strict_mode: Specifies whether strict mode is enabled for this load. Enabled by default; disable it with -H "strict_mode: false".
timezone: Specifies the time zone used for this load. Defaults to GMT+8. This parameter affects the results of all time-zone-related functions involved in the load.
-
+
exec_mem_limit: Memory limit for the load. Defaults to 2GB. The unit is bytes.
+ format: Specifies the format of the load data. Defaults to csv; the json format is also supported.
+
+ jsonpaths: json loading has two modes: simple mode and precise mode.
+ Simple mode: used when the jsonpaths parameter is not set. In this mode the json data must be an object, e.g.:
+ {"k1":1, "k2":2, "k3":"hello"}, where k1, k2, k3 are column names.
+
+ Precise mode: for relatively complex json data, where the jsonpaths parameter is needed to extract the corresponding values.
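The simple/precise distinction above can be sketched in Python: simple mode maps top-level keys to columns directly, while precise mode picks values out with `$.field`-style jsonpaths expressions. This is an illustrative sketch only, not Doris code, and the mini-resolver handles only dotted field access.

```python
import json

def extract_simple(doc: str) -> dict:
    """Simple mode: the json must be a flat object; its keys are column names."""
    obj = json.loads(doc)
    if not isinstance(obj, dict):
        raise ValueError("simple mode requires a json object")
    return obj

def extract_with_jsonpaths(doc: str, jsonpaths: list) -> list:
    """Precise-mode sketch: resolve '$.a.b'-style paths against the object.
    Real jsonpath supports far more; this handles dotted field access only."""
    obj = json.loads(doc)
    row = []
    for path in jsonpaths:
        cur = obj
        for field in path.lstrip("$.").split("."):
            cur = cur[field]
        row.append(cur)
    return row

# Simple mode: column names come from the object's keys.
print(extract_simple('{"k1":1, "k2":2, "k3":"hello"}'))

# Precise mode: column values come out in jsonpaths order.
print(extract_with_jsonpaths('{"k1":1, "k2":2, "k3":"hello"}', ["$.k1", "$.k3"]))
```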
Review comment:
ok
##########
File path: docs/zh-CN/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD.md
##########
@@ -309,6 +321,84 @@ under the License.
"property.ssl.key.password" = "abcdefg",
"property.client.id" = "my_client_id"
);
+ 4. Load json in simple mode
+ CREATE ROUTINE LOAD example_db.test_json_label_1 ON table1
+ COLUMNS(category,price,author)
+ PROPERTIES
+ (
+ "desired_concurrent_number"="3",
+ "max_batch_interval" = "20",
+ "max_batch_rows" = "300000",
+ "max_batch_size" = "209715200",
+ "strict_mode" = "false",
+ "format" = "json"
+ )
+ FROM KAFKA
+ (
+ "kafka_broker_list" = "broker1:9092,broker2:9092,broker3:9092",
+ "kafka_topic" = "my_topic",
+ "kafka_partitions" = "0,1,2",
+ "kafka_offsets" = "0,0,0"
+ );
+ Two json data formats are supported:
+ 1){"category":"a9jadhx","author":"test","price":895}
+ 2){
+ "RECORDS":[
+ {"category":"a9jadhx","author":"test","price":895},
+ {"category":"axdfa1","author":"EvelynWaugh","price":1299}
+ ]
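Both accepted shapes, a single object or an object whose "RECORDS" key wraps an array of objects, normalize to the same rows. A hedged Python sketch of that normalization (illustrative only, not Doris code):

```python
import json

def normalize_rows(doc: str) -> list:
    """Flatten either supported shape into a list of row dicts:
    1) a single object: {"category": ..., "author": ..., "price": ...}
    2) an object whose "RECORDS" key holds an array of such objects."""
    obj = json.loads(doc)
    if isinstance(obj, dict) and isinstance(obj.get("RECORDS"), list):
        return obj["RECORDS"]
    return [obj]

single = '{"category":"a9jadhx","author":"test","price":895}'
batch = ('{"RECORDS":['
         '{"category":"a9jadhx","author":"test","price":895},'
         '{"category":"axdfa1","author":"EvelynWaugh","price":1299}]}')
print(normalize_rows(single))   # one row
print(normalize_rows(batch))    # two rows
```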
Review comment:
I changed it.
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]