[GitHub] [incubator-doris] BiteTheDDDDt commented on pull request #8019: [chore] Fix memory problems in agg_test.cpp.

2022-02-11 Thread GitBox


BiteThet commented on pull request #8019:
URL: https://github.com/apache/incubator-doris/pull/8019#issuecomment-1035960109


   A simpler solution is to just add a `delete[]` at the end of the function.
   ```cpp
   if (place != nullptr) {
       delete[] place;
   }
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] adonis0147 commented on pull request #8019: [chore] Fix memory problems in agg_test.cpp.

2022-02-11 Thread GitBox


adonis0147 commented on pull request #8019:
URL: https://github.com/apache/incubator-doris/pull/8019#issuecomment-1035966349


   @BiteThet Yes, but I want to follow the RAII principle of modern C++ and 
avoid calling `delete` explicitly.
   I will use another simple way to do it.
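
   A minimal sketch of what an RAII-style cleanup could look like (illustrative 
only; `state_size` and the buffer usage are assumptions, not the actual 
agg_test.cpp code):
   ```cpp
   #include <cstddef>
   #include <memory>

   // Sketch: let a smart pointer own the aggregate-state buffer so it is freed
   // on every return path, including early exits from ASSERT_* macros.
   void example_test_body(std::size_t state_size) {
       std::unique_ptr<char[]> memory(new char[state_size]);
       char* place = memory.get();
       // ... create the aggregate state on `place`, run the checks, destroy it ...
       // No explicit delete[] is needed: `memory` releases the buffer when it
       // goes out of scope.
   }
   ```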


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] qzsee commented on pull request #7952: [Bug] fix using clause npe

2022-02-11 Thread GitBox


qzsee commented on pull request #7952:
URL: https://github.com/apache/incubator-doris/pull/7952#issuecomment-1035968340


   > Hi @qzsee, could you please add the case you described in #7953 to 
QueryPlanTest.java?
   
   done


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] BiteTheDDDDt commented on pull request #8019: [chore] Fix memory problems in agg_test.cpp.

2022-02-11 Thread GitBox


BiteThet commented on pull request #8019:
URL: https://github.com/apache/incubator-doris/pull/8019#issuecomment-1035971102


   LGTM


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] awakeljw commented on a change in pull request #7972: [vectorized] [join] eliminate branch prediction & opt hash join performance

2022-02-11 Thread GitBox


awakeljw commented on a change in pull request #7972:
URL: https://github.com/apache/incubator-doris/pull/7972#discussion_r804448139



##
File path: be/src/vec/exec/join/vhash_join_node.cpp
##
@@ -168,46 +177,98 @@ struct ProcessHashTableProbe {
 // the output block struct is same with mutable block. we can do more opt on it and simplify
 // the logic of probe
 // TODO: opt the visited here to reduce the size of hash table
+template
 Status do_process(HashTableContext& hash_table_ctx, ConstNullMapPtr null_map,
                   MutableBlock& mutable_block, Block* output_block) {
     using KeyGetter = typename HashTableContext::State;
     using Mapped = typename HashTableContext::Mapped;
 
     KeyGetter key_getter(_probe_raw_ptrs, _join_node->_probe_key_sz, nullptr);
-
+
     std::vector items_counts(_probe_rows);
     auto& mcol = mutable_block.mutable_columns();
-
-    int right_col_idx = _join_node->_is_right_semi_anti ? 0 : _left_table_data_types.size();
-    int right_col_len = _right_table_data_types.size();
     int current_offset = 0;
 
     for (; _probe_index < _probe_rows;) {
-        // ignore null rows
         if constexpr (ignore_null) {
             if ((*null_map)[_probe_index]) {
                 items_counts[_probe_index++] = 0;
                 continue;
             }
         }
-
         int repeat_count = 0;
-        auto find_result =
-                (*null_map)[_probe_index]
+        if constexpr (is_inner_join) {
+            if (!(*null_map)[_probe_index]) {
+                auto find_result = key_getter.find_key(hash_table_ctx.hash_table, _probe_index, _arena);
+
+                if (find_result.is_found()) {
+                    auto& mapped = find_result.get_mapped();
+
+                    // TODO: Iterators are currently considered to be a heavy operation and have a certain impact on performance.
+                    // We should rethink whether to use this iterator mode in the future. Now just opt the one row case
+                    if (mapped.get_row_count() == 1) {
+                        mapped.visited = true;
+                        // right semi/anti join should dispose the data in hash table
+                        // after probe data eof
+                        ++repeat_count;
+                        for (size_t j = 0; j < _right_col_len; ++j) {
+                            auto& column = *mapped.block->get_by_position(j).column;
+                            mcol[j + _right_col_idx]->insert_from(column, mapped.row_num);
+                        }
+                    } else {
+                        if (_probe_index + 2 < _probe_rows)
+                            key_getter.prefetch(hash_table_ctx.hash_table, _probe_index + 2, _arena);
+                        for (auto it = mapped.begin(); it.ok(); ++it) {
+                            // right semi/anti join should dispose the data in hash table
+                            // after probe data eof
+                            ++repeat_count;
+                            for (size_t j = 0; j < _right_col_len; ++j) {
+                                auto& column = *it->block->get_by_position(j).column;
+                                // TODO: interface insert from cause serious performance problems
+                                //  when column is nullable. Try to make more effective way
+                                mcol[j + _right_col_idx]->insert_from(column, it->row_num);
+                            }
+                            it->visited = true;
+                        }
+                    }
+                }
+            }
+        } else if constexpr (is_left_anti_join) {
+            if ((*null_map)[_probe_index]) {
+                ++repeat_count;
+            } else {
+                auto find_result = key_getter.find_key(hash_table_ctx.hash_table, _probe_index, _arena);
+                if (find_result.is_found()) {
+                    // do nothing
+                } else {
+                    ++repeat_count;
+                }
+            }
+        } else if constexpr (is_left_semi_join) {
+            int repeat_count = 0;
+            if (!(*null_map)[_probe_index]) {
+                auto find_result = key_getter.find_key(hash_table_ctx.hash_table, _probe_index, _arena);
+                if (find_result.is_found()) {
+                    ++repeat_count;
+                }
+            }
+            items_counts[_probe_index++] = repeat_count;

Review comment:
   I agree, and I have deleted it.
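
   For context, the new code dispatches on the join type at compile time with 
`if constexpr` over template flags such as `is_inner_join`, `is_left_anti_join` 
and `is_left_semi_join`. A minimal standalone sketch of that pattern (simplified 
names, not the actual Doris code):
   ```cpp
   #include <cstdio>

   // Each instantiation keeps only the branch for its own join type; the other
   // branches are discarded at compile time, so there is no runtime dispatch.
   template <bool is_inner_join, bool is_left_anti_join, bool is_left_semi_join>
   int rows_emitted_for_probe_row(bool key_found) {
       int repeat_count = 0;
       if constexpr (is_inner_join) {
           if (key_found) ++repeat_count;   // emit matched rows
       } else if constexpr (is_left_anti_join) {
           if (!key_found) ++repeat_count;  // emit only unmatched rows
       } else if constexpr (is_left_semi_join) {
           if (key_found) ++repeat_count;   // emit at most one row per match
       }
       return repeat_count;
   }

   int main() {
       std::printf("%d %d\n", rows_emitted_for_probe_row<true, false, false>(true),
                   rows_emitted_for_probe_row<false, true, false>(true));
       return 0;
   }
   ```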




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use th

[GitHub] [incubator-doris] BiteTheDDDDt opened a new issue #8020: [Bug] Fix segmentation fault at unaligned address cast to int128

2022-02-11 Thread GitBox


BiteThet opened a new issue #8020:
URL: https://github.com/apache/incubator-doris/issues/8020


   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/incubator-doris/issues?q=is%3Aissue) and 
found no similar issues.
   
   
   ### Version
   
   master
   
   ### What's Wrong?
   
   Fix segmentation fault at unaligned address cast to int128
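
   A hedged sketch of the general pattern behind this kind of crash (illustrative 
only, not the actual Doris code or necessarily the fix that was applied): 
dereferencing a `__int128*` obtained by casting an arbitrary `char*` is undefined 
behaviour when the address is not suitably aligned, and the compiler may emit an 
aligned 16-byte load that faults, while `memcpy` into an aligned local is always 
safe.
   ```cpp
   #include <cstring>

   // May crash: the compiler is allowed to assume 16-byte alignment here and
   // emit an aligned SIMD load even when p is not actually aligned.
   __int128 read_int128_unsafe(const char* p) {
       return *reinterpret_cast<const __int128*>(p);
   }

   // Alignment-safe: copy the 16 bytes into a properly aligned local first.
   __int128 read_int128_safe(const char* p) {
       __int128 v;
       std::memcpy(&v, p, sizeof(v));
       return v;
   }
   ```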
   
   ### What You Expected?
   
   fix it
   
   ### How to Reproduce?
   
   _No response_
   
   ### Anything Else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] BiteTheDDDDt opened a new pull request #8021: [Bug] Fix segmentation fault at unaligned address cast to int128

2022-02-11 Thread GitBox


BiteThet opened a new pull request #8021:
URL: https://github.com/apache/incubator-doris/pull/8021


   # Proposed changes
   
   Issue Number: close #8020
   
   ## Problem Summary:
   
   Describe the overview of changes.
   
   ## Checklist(Required)
   
   1. Does it affect the original behavior: (Yes/No/I Don't know)
   2. Has unit tests been added: (Yes/No/No Need)
   3. Has document been added or modified: (Yes/No/No Need)
   4. Does it need to update dependencies: (Yes/No)
   5. Are there any changes that cannot be rolled back: (Yes/No)
   
   ## Further comments
   
   If this is a relatively large or complex change, kick off the discussion at 
[d...@doris.apache.org](mailto:d...@doris.apache.org) by explaining why you 
chose the solution you did and what alternatives you considered, etc...
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] yangzhg opened a new pull request #8022: [refactor] remove some unused IR code

2022-02-11 Thread GitBox


yangzhg opened a new pull request #8022:
URL: https://github.com/apache/incubator-doris/pull/8022


   # Proposed changes
   
   Issue Number: close #xxx
   
   ## Problem Summary:
   
   remove some unused IR code
   
   ## Checklist(Required)
   
   1. Does it affect the original behavior: (No)
   2. Has unit tests been added: (No Need)
   3. Has document been added or modified: (No Need)
   4. Does it need to update dependencies: (No)
   5. Are there any changes that cannot be rolled back: (Yes)
   
   ## Further comments
   
   If this is a relatively large or complex change, kick off the discussion at 
[d...@doris.apache.org](mailto:d...@doris.apache.org) by explaining why you 
chose the solution you did and what alternatives you considered, etc...
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] tianhui5 opened a new issue #8025: [Feature] Support load binlog from MySQL directly instead of Canal

2022-02-11 Thread GitBox


tianhui5 opened a new issue #8025:
URL: https://github.com/apache/incubator-doris/issues/8025


   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/incubator-doris/issues?q=is%3Aissue) and 
found no similar issues.
   
   
   ### Description
   
   The current binlog load architecture, shown below, relies on the Canal service.
   MySQL --Binlog--> Canal Server <--Get/Ack--> FE Sync Job
                                                +-- Canal Client
                                                    +-- Receiver
                                                    +-- Consumer
   This undermines Doris' independence: users have to start a Canal service before 
they can use binlog load. It may not be the best design, because it is hard to use, 
especially when there is no Canal service in the production environment, let alone 
the impact on availability and data consistency.
   
   ### Use case
   
   Load MySQL binlog to Doris.
   
   ### Related issues
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] yangzhg merged pull request #8012: [deps] change docker image add perf tools and simdjson

2022-02-11 Thread GitBox


yangzhg merged pull request #8012:
URL: https://github.com/apache/incubator-doris/pull/8012


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] yangzhg merged pull request #8001: [Docs]Fixed typo in `CREATE TABLE`

2022-02-11 Thread GitBox


yangzhg merged pull request #8001:
URL: https://github.com/apache/incubator-doris/pull/8001


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris] branch master updated (a4e7c76 -> 789472a)

2022-02-11 Thread yangzhg
This is an automated email from the ASF dual-hosted git repository.

yangzhg pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-doris.git.


from a4e7c76  [Enhancement] use std::search to replace custom search (#7999)
 add 789472a  [build] change docker image add perf tools and simdjson 
(#8012)

No new revisions were added by this update.

Summary of changes:
 build.sh  |  2 +-
 docker/Dockerfile |  4 ++--
 thirdparty/CHANGELOG.md   |  4 
 thirdparty/build-thirdparty.sh| 17 +
 thirdparty/download-thirdparty.sh |  2 +-
 thirdparty/vars.sh|  9 -
 6 files changed, 33 insertions(+), 5 deletions(-)

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris] branch master updated (789472a -> 0b1b937)

2022-02-11 Thread yangzhg
This is an automated email from the ASF dual-hosted git repository.

yangzhg pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-doris.git.


from 789472a  [build] change docker image add perf tools and simdjson 
(#8012)
 add 0b1b937  [docs] Fixed typo in CREATE TABLE (#8001)

No new revisions were added by this update.

Summary of changes:
 docs/zh-CN/sql-reference/sql-statements/Data Definition/CREATE TABLE.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] yangzhg commented on a change in pull request #7984: [Docs] add rpc function document

2022-02-11 Thread GitBox


yangzhg commented on a change in pull request #7984:
URL: https://github.com/apache/incubator-doris/pull/7984#discussion_r804545557



##
File path: docs/.vuepress/sidebar/en.js
##
@@ -255,7 +255,8 @@ module.exports = [
 directoryPath: "udf/",
 children: [
   "contribute-udf",
-  "user-defined-function",
+  "user-defined-function-native",

Review comment:
   ```suggestion
 "user-defined-function-cpp",
   ```




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] yangzhg commented on a change in pull request #7984: [Docs] add rpc function document

2022-02-11 Thread GitBox


yangzhg commented on a change in pull request #7984:
URL: https://github.com/apache/incubator-doris/pull/7984#discussion_r804545701



##
File path: docs/.vuepress/sidebar/zh-CN.js
##
@@ -256,7 +256,8 @@ module.exports = [
 directoryPath: "udf/",
 children: [
   "contribute-udf",
-  "user-defined-function",
+  "user-defined-function-native",

Review comment:
   ```suggestion
 "user-defined-function-cpp",
   ```




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] yangzhg commented on a change in pull request #7984: [Docs] add rpc function document

2022-02-11 Thread GitBox


yangzhg commented on a change in pull request #7984:
URL: https://github.com/apache/incubator-doris/pull/7984#discussion_r804548912



##
File path: docs/zh-CN/extending-doris/udf/user-defined-function-rpc.md
##
@@ -0,0 +1,98 @@
+---
+{
+"title": "User Defined Function Rpc",
+"language": "zh-CN"
+}
+---
+
+
+
+# User Defined Function Rpc
+
+Function logic can be invoked over RPC, with data transferred via protobuf; many languages such as Java/C++/Python/Ruby/Go/PHP/JavaScript are supported.
+
+## Writing the UDF
+
+### Copy the proto files
+
+Copy gensrc/proto/function_service.proto and gensrc/proto/types.proto into the RPC service.
+
+- function_service.proto
+  - PFunctionCallRequest
+    - function_name: the function name, corresponding to the symbol specified when the function was created
+    - args: the arguments passed to the method
+    - context: query context information
+  - PFunctionCallResponse
+    - result: the result
+    - status: the status, 0 means success
+  - PCheckFunctionRequest
+    - function: information about the function
+    - match_type: the match type
+  - PCheckFunctionResponse
+    - status: the status, 0 means success
+
+### Generate the interface
+
+Generate the code with protoc; see protoc -h for the available options.
+
+### Implement the interface
+The following three methods all need to be implemented:
+- fnCall: implements the computation logic
+- checkFn: used for validation when creating the UDF, checking that the function name/arguments/return type are valid
+- handShake: used as a health check of the interface
+
+## Creating the UDF
+
+UDAF and UDTF are not supported yet.
+
+```
+CREATE FUNCTION 
+   name ([,...])
+   [RETURNS] rettype
+   PROPERTIES (["key"="value"][,...])
+   
+```
+Notes:
+
+1. `symbol` in PROPERTIES is the method name passed in the RPC call; this parameter must be set.
+2. `object_file` in PROPERTIES is the RPC service address; currently only a single address is supported. This parameter must be set.

Review comment:
   ```suggestion
   2. `object_file` in PROPERTIES is the RPC service address; it currently supports a single address or a brpc-compatible cluster address. For how to connect to a cluster, see the [format description](https://github.com/apache/incubator-brpc/blob/master/docs/cn/client.md#%E8%BF%9E%E6%8E%A5%E6%9C%8D%E5%8A%A1%E9%9B%86%E7%BE%A4)
   ```




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] yangzhg commented on pull request #7984: [Docs] add rpc function document

2022-02-11 Thread GitBox


yangzhg commented on pull request #7984:
URL: https://github.com/apache/incubator-doris/pull/7984#issuecomment-1036089780


   Please add an English version


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-connectors] branch master updated: [community](license) Add license file for dependencies (#5)

2022-02-11 Thread jiafengzheng
This is an automated email from the ASF dual-hosted git repository.

jiafengzheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-doris-connectors.git


The following commit(s) were added to refs/heads/master by this push:
 new f7f995a  [community](license) Add license file for dependencies (#5)
f7f995a is described below

commit f7f995a91a0cb90b800579a56fd48a2c3c276570
Author: Mingyu Chen 
AuthorDate: Fri Feb 11 21:03:21 2022 +0800

[community](license) Add license file for dependencies (#5)
---
 .asf.yaml|  5 +++
 LICENSE-flink-connector-dependencies.txt | 27 ++
 LICENSE-spark-connector-dependencies.txt | 64 
 flink-doris-connector/pom.xml| 14 +++
 spark-doris-connector/pom.xml| 13 +++
 5 files changed, 123 insertions(+)

diff --git a/.asf.yaml b/.asf.yaml
index b9656d8..7ca5cc2 100644
--- a/.asf.yaml
+++ b/.asf.yaml
@@ -31,5 +31,10 @@ github:
 squash:  true
 merge:   false
 rebase:  false
+  protected_branches:
+master:
+  required_pull_request_reviews:
+dismiss_stale_reviews: true
+required_approving_review_count: 1
   notifications:
 pullrequests_status:  commits@doris.apache.org
diff --git a/LICENSE-flink-connector-dependencies.txt 
b/LICENSE-flink-connector-dependencies.txt
new file mode 100644
index 000..55c84e2
--- /dev/null
+++ b/LICENSE-flink-connector-dependencies.txt
@@ -0,0 +1,27 @@
+This file contains all third-party dependencies that are not part of the APLv2 
protocol.
+Generated by license-maven-plugin.
+
+Each time you build flink-connector, ./target/classes/THIRD-PARTY.txt is 
generated.
+If new dependencies are added, we need to update this file.
+==
+
+(New BSD License) Kryo (com.esotericsoftware.kryo:kryo:2.24.0 - 
https://github.com/EsotericSoftware/kryo)
+(New BSD License) MinLog (com.esotericsoftware.minlog:minlog:1.2 - 
http://code.google.com/p/minlog/)
+(MIT License) scopt (com.github.scopt:scopt_2.12:3.5.0 - 
https://github.com/scopt/scopt)
+(CDDL + GPLv2 with classpath exception) javax.annotation API 
(javax.annotation:javax.annotation-api:1.3.2 - 
http://jcp.org/en/jsr/detail?id=250)
+(Common Public License Version 1.0) JUnit (junit:junit:4.11 - http://junit.org)
+(BSD) grizzled-slf4j (org.clapper:grizzled-slf4j_2.12:1.3.2 - 
http://software.clapper.org/grizzled-slf4j/)
+(New BSD License) commons-compiler (org.codehaus.janino:commons-compiler:3.0.9 
- http://janino-compiler.github.io/commons-compiler/)
+(New BSD License) janino (org.codehaus.janino:janino:3.0.9 - 
http://janino-compiler.github.io/janino/)
+(New BSD License) Hamcrest Core (org.hamcrest:hamcrest-core:1.3 - 
https://github.com/hamcrest/JavaHamcrest/hamcrest-core)
+(The MIT License) mockito-core (org.mockito:mockito-core:2.27.0 - 
https://github.com/mockito/mockito)
+(MIT) mockito-scala (org.mockito:mockito-scala_2.12:1.4.7 - 
https://github.com/mockito/mockito-scala)
+(CC0) reactive-streams (org.reactivestreams:reactive-streams:1.0.2 - 
http://www.reactive-streams.org/)
+(The New BSD License) (WTFPL) Reflections (org.reflections:reflections:0.9.10 
- http://github.com/ronmamo/reflections)
+(BSD 3-Clause) Scala Compiler (org.scala-lang:scala-compiler:2.12.7 - 
http://www.scala-lang.org/)
+(BSD 3-clause) scala-java8-compat 
(org.scala-lang.modules:scala-java8-compat_2.12:0.8.0 - 
http://www.scala-lang.org/)
+(BSD 3-clause) scala-parser-combinators 
(org.scala-lang.modules:scala-parser-combinators_2.12:1.1.1 - 
http://www.scala-lang.org/)
+(BSD 3-clause) scala-xml (org.scala-lang.modules:scala-xml_2.12:1.0.6 - 
http://www.scala-lang.org/)
+(MIT License) SLF4J API Module (org.slf4j:slf4j-api:1.7.25 - 
http://www.slf4j.org)
+(MIT License) SLF4J LOG4J-12 Binding (org.slf4j:slf4j-log4j12:1.7.25 - 
http://www.slf4j.org)
+(The MIT License) generics-resolver (ru.vyarus:generics-resolver:3.0.0 - 
https://github.com/xvik/generics-resolver)
diff --git a/LICENSE-spark-connector-dependencies.txt 
b/LICENSE-spark-connector-dependencies.txt
new file mode 100644
index 000..9306e2a
--- /dev/null
+++ b/LICENSE-spark-connector-dependencies.txt
@@ -0,0 +1,64 @@
+This file contains all third-party dependencies that are not part of the APLv2 
protocol.
+Generated by license-maven-plugin.
+
+Each time you build spark-connector, ./target/classes/THIRD-PARTY.txt is 
generated.
+If new dependencies are added, we need to update this file.
+==
+
+(New BSD License) Kryo Shaded (com.esotericsoftware:kryo-shaded:3.0.3 - 
https://github.com/EsotericSoftware/kryo/kryo-shaded)
+(New BSD License) MinLog (com.esotericsoftware:minlog:1.3.0 - 
https://github.com/EsotericSoftware/minlog)
+(BSD 2-Clause License) zstd-jni (com.github.luben:zstd-jni:1.3.2-2 - 
https://github.com/luben/

[GitHub] [incubator-doris] yiguolei opened a new pull request #8026: [Refactor] Remove snapshot converter and unused Protobuf Definitions

2022-02-11 Thread GitBox


yiguolei opened a new pull request #8026:
URL: https://github.com/apache/incubator-doris/pull/8026


   1. remove snapshot converter
   2. remove unused protobuf definitions
   3. move some macro as const variables


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] bluemasion commented on issue #6530: Build failed in centos

2022-02-11 Thread GitBox


bluemasion commented on issue #6530:
URL: 
https://github.com/apache/incubator-doris/issues/6530#issuecomment-1036204167


   Hi, I met the same issue, and also updated GCC to version 8, but still got the 
issue. Could you give me some advice on how to solve it?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] hf200012 merged pull request #8009: [docs] correct mysql jdbc auto-retry url format

2022-02-11 Thread GitBox


hf200012 merged pull request #8009:
URL: https://github.com/apache/incubator-doris/pull/8009


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris] branch master updated (0b1b937 -> 7648303)

2022-02-11 Thread jiafengzheng
This is an automated email from the ASF dual-hosted git repository.

jiafengzheng pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-doris.git.


from 0b1b937  [docs] Fixed typo in CREATE TABLE (#8001)
 add 7648303  [docs](config) correct mysql jdbc auto-retry url format 
(#8009)

No new revisions were added by this update.

Summary of changes:
 docs/en/getting-started/advance-usage.md| 2 +-
 docs/zh-CN/getting-started/advance-usage.md | 4 ++--
 2 files changed, 3 insertions(+), 3 deletions(-)

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] morningman closed pull request #7993: [fix](spark connector) fix spark connector unsupport STRING type.

2022-02-11 Thread GitBox


morningman closed pull request #7993:
URL: https://github.com/apache/incubator-doris/pull/7993


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] morningman commented on pull request #7880: [New feature](statistics) Step1: Statistics collection framework

2022-02-11 Thread GitBox


morningman commented on pull request #7880:
URL: https://github.com/apache/incubator-doris/pull/7880#issuecomment-1036248906


   Compile failed


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] github-actions[bot] commented on pull request #8019: [chore] Fix memory problems in agg_test.cpp.

2022-02-11 Thread GitBox


github-actions[bot] commented on pull request #8019:
URL: https://github.com/apache/incubator-doris/pull/8019#issuecomment-1036272233






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] github-actions[bot] commented on pull request #8008: [Refactor] remove plugin folder in be since it is useless

2022-02-11 Thread GitBox


github-actions[bot] commented on pull request #8008:
URL: https://github.com/apache/incubator-doris/pull/8008#issuecomment-1036272917






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] github-actions[bot] commented on pull request #8026: [Refactor] Remove snapshot converter and unused Protobuf Definitions

2022-02-11 Thread GitBox


github-actions[bot] commented on pull request #8026:
URL: https://github.com/apache/incubator-doris/pull/8026#issuecomment-1036274153






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] github-actions[bot] commented on pull request #8022: [refactor] remove some unused IR code

2022-02-11 Thread GitBox


github-actions[bot] commented on pull request #8022:
URL: https://github.com/apache/incubator-doris/pull/8022#issuecomment-1036278808






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] github-actions[bot] commented on pull request #7989: [fix] (grouping set) fix Unexpected exception: bitIndex < 0: -1

2022-02-11 Thread GitBox


github-actions[bot] commented on pull request #7989:
URL: https://github.com/apache/incubator-doris/pull/7989#issuecomment-1036279899






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 03/32: [Log] Fix a mistake in DorisDynamicOutputFormat.java (#5963)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit d07c904c8dc7c20311325f45f20ce4b86ccb0ad3
Author: zhangboya1 <49148006+zhangbo...@users.noreply.github.com>
AuthorDate: Sun Jun 6 22:06:57 2021 +0800

[Log] Fix a mistake in DorisDynamicOutputFormat.java (#5963)

Fix a mistake DorisDynamicOutputFormat.java
---
 .../java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git 
a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java 
b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
index baa7b07..44880b5 100644
--- a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
+++ b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
@@ -70,7 +70,7 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat  {
 options.getTableIdentifier().split("\\.")[1],
 options.getUsername(),
 options.getPassword());
-LOG.info("Steamload BE:{}",dorisStreamLoad.getLoadUrlStr());
+LOG.info("Streamload BE:{}",dorisStreamLoad.getLoadUrlStr());
 }
 
 @Override

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 06/32: [Feature]:Flink-connector supports streamload parameters (#6243)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit ba29ce0d7c8ba8427c1d70d13b37d1592f1279df
Author: wudi <676366...@qq.com>
AuthorDate: Mon Aug 9 22:12:46 2021 +0800

[Feature]:Flink-connector supports streamload parameters (#6243)

Flink-connector supports streamload parameters
#6199
---
 .../apache/doris/flink/backend/BackendClient.java  |   5 +-
 .../doris/flink/cfg/DorisConnectionOptions.java|   2 +-
 .../doris/flink/cfg/DorisExecutionOptions.java |  23 +-
 .../org/apache/doris/flink/cfg/DorisOptions.java   |   2 +-
 .../apache/doris/flink/cfg/DorisReadOptions.java   |   8 +-
 .../flink/datastream/DorisSourceFunction.java  |  16 +-
 .../SimpleListDeserializationSchema.java   |   3 +-
 .../doris/flink/exception/DorisException.java  |   8 +-
 .../exception/ShouldNeverHappenException.java  |   3 +-
 .../doris/flink/exception/StreamLoadException.java |   4 +
 .../doris/flink/rest/PartitionDefinition.java  |   2 +-
 .../org/apache/doris/flink/rest/RestService.java   |  95 +++--
 .../org/apache/doris/flink/rest/SchemaUtils.java   |   3 +-
 .../org/apache/doris/flink/rest/models/Field.java  |   3 +-
 .../apache/doris/flink/serialization/RowBatch.java |   8 +-
 .../flink/table/DorisDynamicOutputFormat.java  |  57 +--
 .../flink/table/DorisDynamicTableFactory.java  | 427 +++--
 .../doris/flink/table/DorisDynamicTableSink.java   |   2 +-
 .../doris/flink/table/DorisDynamicTableSource.java |  90 ++---
 .../doris/flink/table/DorisRowDataInputFormat.java | 358 -
 .../apache/doris/flink/table/DorisStreamLoad.java  |  35 +-
 .../doris/flink/table/DorisTableInputSplit.java|   8 +-
 22 files changed, 624 insertions(+), 538 deletions(-)

diff --git a/src/main/java/org/apache/doris/flink/backend/BackendClient.java 
b/src/main/java/org/apache/doris/flink/backend/BackendClient.java
index 93b353c..40bb5c9 100644
--- a/src/main/java/org/apache/doris/flink/backend/BackendClient.java
+++ b/src/main/java/org/apache/doris/flink/backend/BackendClient.java
@@ -112,6 +112,7 @@ public class BackendClient {
 
 /**
  * Open a scanner for reading Doris data.
+ *
  * @param openParams thrift struct to required by request
  * @return scan open result
  * @throws ConnectedFailedException throw if cannot connect to Doris BE
@@ -147,6 +148,7 @@ public class BackendClient {
 
 /**
  * get next row batch from Doris BE
+ *
  * @param nextBatchParams thrift struct to required by request
  * @return scan batch result
  * @throws ConnectedFailedException throw if cannot connect to Doris BE
@@ -161,7 +163,7 @@ public class BackendClient {
 for (int attempt = 0; attempt < retries; ++attempt) {
 logger.debug("Attempt {} to getNext {}.", attempt, routing);
 try {
-result  = client.get_next(nextBatchParams);
+result = client.get_next(nextBatchParams);
 if (result == null) {
 logger.warn("GetNext result from {} is null.", routing);
 continue;
@@ -189,6 +191,7 @@ public class BackendClient {
 
 /**
  * close an scanner.
+ *
  * @param closeParams thrift struct to required by request
  */
 public void closeScanner(TScanCloseParams closeParams) {
diff --git 
a/src/main/java/org/apache/doris/flink/cfg/DorisConnectionOptions.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisConnectionOptions.java
index 619ce74..9b2187c 100644
--- a/src/main/java/org/apache/doris/flink/cfg/DorisConnectionOptions.java
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisConnectionOptions.java
@@ -21,7 +21,7 @@ import org.apache.flink.util.Preconditions;
 import java.io.Serializable;
 
 /**
- *  Doris connection options.
+ * Doris connection options.
  */
 public class DorisConnectionOptions implements Serializable {
 
diff --git 
a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
index 330cbc9..3d035ab 100644
--- a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
@@ -21,22 +21,29 @@ import org.apache.flink.util.Preconditions;
 
 import java.io.Serializable;
 import java.time.Duration;
+import java.util.Properties;
 
 /**
  * JDBC sink batch options.
  */
-public class DorisExecutionOptions  implements Serializable {
+public class DorisExecutionOptions implements Serializable {
 private static final long serialVersionUID = 1L;
 
 private final Integer batchSize;
 private final Integer maxRetries;
 private final Long batchIntervalMs;
 
-public DorisExecutionOptions(Integer batchSize, Integer maxRetries,Long 
batchIntervalMs) {
+/**
+ *

[incubator-doris-flink-connector] 16/32: [Build]Compile and output the jar file, add Spark, Flink version and Scala version (#7051)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 86d3e56a397667f7ac8b7f1d71d121d62e6469b5
Author: jiafeng.zhang 
AuthorDate: Tue Nov 9 10:02:08 2021 +0800

[Build]Compile and output the jar file, add Spark, Flink version and Scala 
version (#7051)

The jar files compiled for the Flink and Spark connectors now carry the
corresponding Flink/Spark version and Scala version used at compile time,
so that users can tell whether the version numbers match when using them.

Example of output file name: doris-spark-1.0.0-spark-3.2.0_2.12.jar
---
 build.sh | 6 ++
 pom.xml  | 2 +-
 2 files changed, 3 insertions(+), 5 deletions(-)

diff --git a/build.sh b/build.sh
index 70f4e96..3be10a0 100644
--- a/build.sh
+++ b/build.sh
@@ -45,14 +45,12 @@ if ! ${MVN_CMD} --version; then
 exit 1
 fi
 export MVN_CMD
-
+rm -rf output/
 ${MVN_CMD} clean package
 
 
 mkdir -p output/
-cp target/doris-flink-1.0-SNAPSHOT.jar ./output/
-cp target/doris-flink-1.0-SNAPSHOT-javadoc.jar ./output/
-cp target/doris-flink-1.0-SNAPSHOT-sources.jar ./output/
+cp target/doris-flink-*.jar ./output/
 
 echo "*"
 echo "Successfully build Flink-Doris-Connector"
diff --git a/pom.xml b/pom.xml
index ffe6784..c5c3905 100644
--- a/pom.xml
+++ b/pom.xml
@@ -6,7 +6,7 @@
 
 org.apache
 doris-flink
-1.0-SNAPSHOT
+1.0.0-flink-${flink.version}_${scala.version}
 
 
 2.12

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 21/32: [fix](flink-connector) Connector should visit the surviving BE nodes (#7435)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 03120e51d121794327ccc05c7dbb3c1515195731
Author: Heng Zhao 
AuthorDate: Tue Dec 21 11:05:42 2021 +0800

[fix](flink-connector) Connector should visit the surviving BE nodes (#7435)
---
 src/main/java/org/apache/doris/flink/rest/RestService.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/doris/flink/rest/RestService.java 
b/src/main/java/org/apache/doris/flink/rest/RestService.java
index 82e01e0..0c4264f 100644
--- a/src/main/java/org/apache/doris/flink/rest/RestService.java
+++ b/src/main/java/org/apache/doris/flink/rest/RestService.java
@@ -86,7 +86,7 @@ public class RestService implements Serializable {
 private static final String QUERY_PLAN = "_query_plan";
 @Deprecated
 private static final String BACKENDS = "/rest/v1/system?path=//backends";
-private static final String BACKENDS_V2 = "/api/backends?is_aliva=true";
+private static final String BACKENDS_V2 = "/api/backends?is_alive=true";
 private static final String FE_LOGIN = "/rest/v1/login";
 
 /**

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 22/32: [improvement](flink-connector) flush data without multi httpclients (#7329) (#7450)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 6c0627affeab6702c56f3727872ec8ffea018baa
Author: Heng Zhao 
AuthorDate: Fri Dec 24 21:28:35 2021 +0800

[improvement](flink-connector) flush data without multi httpclients (#7329) 
(#7450)

reuse http client to flush data
---
 .../flink/table/DorisDynamicOutputFormat.java  | 22 +++--
 .../apache/doris/flink/table/DorisStreamLoad.java  | 36 ++
 2 files changed, 36 insertions(+), 22 deletions(-)

diff --git 
a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java 
b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
index f4f49bd..2a1cec4 100644
--- a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
+++ b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
@@ -94,7 +94,7 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 this.readOptions = readOptions;
 this.executionOptions = executionOptions;
 
-Properties streamLoadProp=executionOptions.getStreamLoadProp();
+Properties streamLoadProp = executionOptions.getStreamLoadProp();
 
 boolean ifEscape = 
Boolean.parseBoolean(streamLoadProp.getProperty(ESCAPE_DELIMITERS_KEY, 
ESCAPE_DELIMITERS_DEFAULT));
 if (ifEscape) {
@@ -121,16 +121,16 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 }
 }
 
-private String escapeString( String s) {
-Pattern p = Pattern.compile("x(\\d{2})");
-Matcher m = p.matcher(s);
+private String escapeString(String s) {
+Pattern p = Pattern.compile("x(\\d{2})");
+Matcher m = p.matcher(s);
 
-StringBuffer buf = new StringBuffer();
-while (m.find()) {
-m.appendReplacement(buf, String.format("%s", (char) Integer.parseInt(m.group(1))));
-}
-m.appendTail(buf);
-return buf.toString();
+StringBuffer buf = new StringBuffer();
+while (m.find()) {
+m.appendReplacement(buf, String.format("%s", (char) Integer.parseInt(m.group(1))));
+}
+m.appendTail(buf);
+return buf.toString();
 }
 
 @Override
@@ -220,6 +220,8 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 } catch (Exception e) {
 LOG.warn("Writing records to doris failed.", e);
 throw new RuntimeException("Writing records to doris failed.", 
e);
+} finally {
+this.dorisStreamLoad.close();
 }
 }
 checkFlushException();
diff --git a/src/main/java/org/apache/doris/flink/table/DorisStreamLoad.java 
b/src/main/java/org/apache/doris/flink/table/DorisStreamLoad.java
index b897ff2..9c05b83 100644
--- a/src/main/java/org/apache/doris/flink/table/DorisStreamLoad.java
+++ b/src/main/java/org/apache/doris/flink/table/DorisStreamLoad.java
@@ -64,6 +64,15 @@ public class DorisStreamLoad implements Serializable {
 private String tbl;
 private String authEncoding;
 private Properties streamLoadProp;
+private final HttpClientBuilder httpClientBuilder = HttpClients
+.custom()
+.setRedirectStrategy(new DefaultRedirectStrategy() {
+@Override
+protected boolean isRedirectable(String method) {
+return true;
+}
+});
+private CloseableHttpClient httpClient;
 
 public DorisStreamLoad(String hostPort, String db, String tbl, String 
user, String passwd, Properties streamLoadProp) {
 this.hostPort = hostPort;
@@ -74,6 +83,7 @@ public class DorisStreamLoad implements Serializable {
 this.loadUrlStr = String.format(loadUrlPattern, hostPort, db, tbl);
 this.authEncoding = basicAuthHeader(user, passwd);
 this.streamLoadProp = streamLoadProp;
+this.httpClient = httpClientBuilder.build();
 }
 
 public String getLoadUrlStr() {
@@ -94,7 +104,7 @@ public class DorisStreamLoad implements Serializable {
 try {
 RespContent respContent = 
OBJECT_MAPPER.readValue(loadResponse.respContent, RespContent.class);
 if (!DORIS_SUCCESS_STATUS.contains(respContent.getStatus())) {
-String errMsg=String.format("stream load error: %s, see 
more in %s",respContent.getMessage(),respContent.getErrorURL());
+String errMsg = String.format("stream load error: %s, see 
more in %s", respContent.getMessage(), respContent.getErrorURL());
 throw new StreamLoadException(errMsg);
 }
 } catch (IOException e) {
@@ -112,16 +122,7 @@ public class DorisStreamLoad implements Serializable {
 UUID.randomUUID().

[incubator-doris-flink-connector] 17/32: [Feature] Support Flink and Spark connector support String type (#7075)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 8b0b9c67428c4799e8cf0b19e8565fa72c09c66b
Author: wudi <676366...@qq.com>
AuthorDate: Sat Nov 13 17:10:22 2021 +0800

[Feature] Support Flink and Spark connector support String type (#7075)

Support String type for Flink and Spark connector
---
 src/main/java/org/apache/doris/flink/serialization/RowBatch.java | 1 +
 src/main/thrift/doris/Types.thrift   | 8 +++-
 2 files changed, 8 insertions(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/doris/flink/serialization/RowBatch.java 
b/src/main/java/org/apache/doris/flink/serialization/RowBatch.java
index 00c699b..3337637 100644
--- a/src/main/java/org/apache/doris/flink/serialization/RowBatch.java
+++ b/src/main/java/org/apache/doris/flink/serialization/RowBatch.java
@@ -251,6 +251,7 @@ public class RowBatch {
 case "DATETIME":
 case "CHAR":
 case "VARCHAR":
+case "STRING":
 
Preconditions.checkArgument(mt.equals(Types.MinorType.VARCHAR),
 typeMismatchMessage(currentType, mt));
 VarCharVector varCharVector = (VarCharVector) 
curFieldVector;
diff --git a/src/main/thrift/doris/Types.thrift 
b/src/main/thrift/doris/Types.thrift
index 2d902ba..44ce606 100644
--- a/src/main/thrift/doris/Types.thrift
+++ b/src/main/thrift/doris/Types.thrift
@@ -73,7 +73,13 @@ enum TPrimitiveType {
   VARCHAR,
   HLL,
   DECIMALV2,
-  TIME
+  TIME,
+  OBJECT,
+  ARRAY,
+  MAP,
+  STRUCT,
+  STRING,
+  ALL
 }
 
 enum TTypeNodeType {

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 27/32: Flink / Spark connector compilation problem (#7725)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 06f58f2eb0a24b784574609c707ef27bcaba0be7
Author: jiafeng.zhang 
AuthorDate: Fri Jan 14 22:14:48 2022 +0800

Flink / Spark connector compilation problem (#7725)

Flink / Spark connector compilation problem
---
 build.sh |  8 +---
 pom.xml  | 26 ++
 2 files changed, 11 insertions(+), 23 deletions(-)

diff --git a/build.sh b/build.sh
index d363dae..83d26d2 100644
--- a/build.sh
+++ b/build.sh
@@ -62,19 +62,13 @@ if [[ -n ${CUSTOM_MVN} ]]; then
 MVN_CMD=${CUSTOM_MVN}
 fi
 
-if [ -z "$1" ]; then
-export FLINK_VERSION="$1"
-fi
-if [ -z "$2" ]; then
-export SCALA_VERSION="$2"
-fi
 if ! ${MVN_CMD} --version; then
 echo "Error: mvn is not found"
 exit 1
 fi
 export MVN_CMD
 rm -rf output/
-${MVN_CMD} clean package
+${MVN_CMD} clean package -Dscala.version=$2 -Dflink.version=$1
 
 mkdir -p output/
 cp target/doris-flink-*.jar ./output/
diff --git a/pom.xml b/pom.xml
index 11f2b20..21a5a6f 100644
--- a/pom.xml
+++ b/pom.xml
@@ -67,8 +67,8 @@ under the License.
 
 
 
-${env.SCALA_VERSION}
-${env.FLINK_VERSION}
+${env.scala.version}
+${env.flink.version}
 0.13.0
 5.0.0
 3.8.1
@@ -115,28 +115,22 @@ under the License.
 
 
 
-fink-version
+flink.version
+
+1.11.6
+
 
 true
-
-!env.FLINK_VERSION
-
 
-
-1.11.6
-
 
 
-scala-version
+scala.version
+
+2.12
+
 
 true
-
-!env.SCALA_VERSION
-
 
-
-2.12
-
 
 
 

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 29/32: [chore][fix][doc](fe-plugin)(mysqldump) fix build auditlog plugin error (#7804)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 73b3d5cf8076f80147c7b28ca14ccca2f3aa8251
Author: Zhengguo Yang 
AuthorDate: Wed Jan 26 09:11:23 2022 +0800

[chore][fix][doc](fe-plugin)(mysqldump) fix build auditlog plugin error 
(#7804)

1. fix problems when build fe_plugins
2. format
3. add docs about dump data using mysql dump
---
 pom.xml | 15 +--
 1 file changed, 1 insertion(+), 14 deletions(-)

diff --git a/pom.xml b/pom.xml
index 21a5a6f..10b750a 100644
--- a/pom.xml
+++ b/pom.xml
@@ -1,5 +1,4 @@
 
-
 
-
-http://www.w3.org/2001/XMLSchema-instance";
- xmlns="http://maven.apache.org/POM/4.0.0";
+http://www.w3.org/2001/XMLSchema-instance"; 
xmlns="http://maven.apache.org/POM/4.0.0";
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
 4.0.0
 
@@ -50,7 +47,6 @@ under the License.
 GitHub
 https://github.com/apache/incubator-doris/issues
 
-
 
 
 Dev Mailing List
@@ -58,7 +54,6 @@ under the License.
 dev-subscr...@doris.apache.org
 dev-unsubscr...@doris.apache.org
 
-
 
 Commits Mailing List
 commits@doris.apache.org
@@ -78,7 +73,6 @@ under the License.
 ${env.DORIS_THIRDPARTY}
 github
 
-
 
 
 thirdparty
@@ -99,14 +93,12 @@ under the License.
 env.CUSTOM_MAVEN_REPO
 
 
-
 
 
 custom-nexus
 ${env.CUSTOM_MAVEN_REPO}
 
 
-
 
 
 custom-nexus
@@ -132,7 +124,6 @@ under the License.
 true
 
 
-
 
 
 general-env
@@ -141,7 +132,6 @@ under the License.
 !env.CUSTOM_MAVEN_REPO
 
 
-
 
 
 central
@@ -151,7 +141,6 @@ under the License.
 
 
 
-
 
 
 org.apache.flink
@@ -269,7 +258,6 @@ under the License.
 test
 
 
-
 
 
 
@@ -441,5 +429,4 @@ under the License.
 
 
 
-
 

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 14/32: support use char like \x01 in flink-doris-sink column & line delimiter (#6937)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit bbfb03c41f1a230728fb3827571ccd0c727c34fe
Author: wunan1210 
AuthorDate: Fri Oct 29 13:56:52 2021 +0800

support use char like \x01 in flink-doris-sink column & line delimiter 
(#6937)

* support use char like \x01 in flink-doris-sink column & line delimiter

* extend imports

* add docs
---
 .../doris/flink/rest/models/RespContent.java   |  4 ++
 .../flink/table/DorisDynamicOutputFormat.java  | 44 +++---
 .../apache/doris/flink/table/DorisStreamLoad.java  |  3 +-
 3 files changed, 44 insertions(+), 7 deletions(-)

diff --git a/src/main/java/org/apache/doris/flink/rest/models/RespContent.java 
b/src/main/java/org/apache/doris/flink/rest/models/RespContent.java
index b86b3dd..07a356c 100644
--- a/src/main/java/org/apache/doris/flink/rest/models/RespContent.java
+++ b/src/main/java/org/apache/doris/flink/rest/models/RespContent.java
@@ -93,4 +93,8 @@ public class RespContent {
 }
 
 }
+
+public String getErrorURL() {
+return ErrorURL;
+}
 }
diff --git 
a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java 
b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
index 0fd154a..f4f49bd 100644
--- a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
+++ b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
@@ -38,11 +38,14 @@ import java.util.Arrays;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
+import java.util.Properties;
 import java.util.StringJoiner;
 import java.util.concurrent.Executors;
 import java.util.concurrent.ScheduledExecutorService;
 import java.util.concurrent.ScheduledFuture;
 import java.util.concurrent.TimeUnit;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
 
 import static org.apache.flink.table.data.RowData.createFieldGetter;
 
@@ -62,9 +65,11 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 private static final String FORMAT_KEY = "format";
 private static final String FORMAT_JSON_VALUE = "json";
 private static final String NULL_VALUE = "\\N";
+private static final String ESCAPE_DELIMITERS_KEY = "escape_delimiters";
+private static final String ESCAPE_DELIMITERS_DEFAULT = "false";
 
-private final String fieldDelimiter;
-private final String lineDelimiter;
+private String fieldDelimiter;
+private String lineDelimiter;
 private final String[] fieldNames;
 private final boolean jsonFormat;
 private DorisOptions options;
@@ -88,10 +93,26 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 this.options = option;
 this.readOptions = readOptions;
 this.executionOptions = executionOptions;
-this.fieldDelimiter = 
executionOptions.getStreamLoadProp().getProperty(FIELD_DELIMITER_KEY,
-FIELD_DELIMITER_DEFAULT);
-this.lineDelimiter = 
executionOptions.getStreamLoadProp().getProperty(LINE_DELIMITER_KEY,
-LINE_DELIMITER_DEFAULT);
+
+Properties streamLoadProp=executionOptions.getStreamLoadProp();
+
+boolean ifEscape = 
Boolean.parseBoolean(streamLoadProp.getProperty(ESCAPE_DELIMITERS_KEY, 
ESCAPE_DELIMITERS_DEFAULT));
+if (ifEscape) {
+this.fieldDelimiter = 
escapeString(streamLoadProp.getProperty(FIELD_DELIMITER_KEY,
+FIELD_DELIMITER_DEFAULT));
+this.lineDelimiter = 
escapeString(streamLoadProp.getProperty(LINE_DELIMITER_KEY,
+LINE_DELIMITER_DEFAULT));
+
+if (streamLoadProp.contains(ESCAPE_DELIMITERS_KEY)) {
+streamLoadProp.remove(ESCAPE_DELIMITERS_KEY);
+}
+} else {
+this.fieldDelimiter = 
streamLoadProp.getProperty(FIELD_DELIMITER_KEY,
+FIELD_DELIMITER_DEFAULT);
+this.lineDelimiter = streamLoadProp.getProperty(LINE_DELIMITER_KEY,
+LINE_DELIMITER_DEFAULT);
+}
+
 this.fieldNames = fieldNames;
 this.jsonFormat = 
FORMAT_JSON_VALUE.equals(executionOptions.getStreamLoadProp().getProperty(FORMAT_KEY));
 this.fieldGetters = new RowData.FieldGetter[logicalTypes.length];
@@ -100,6 +121,17 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 }
 }
 
+private String escapeString( String s) {
+Pattern p = Pattern.compile("x(\\d{2})");
+Matcher m = p.matcher(s);
+
+StringBuffer buf = new StringBuffer();
+while (m.find()) {
+m.appendReplacement(buf, String.format("%s", (char) Integer.parseInt(m.group(1))));
+}
+m.appendTail(buf);
+return buf.toString();
+}
 
 @Override
 p

[incubator-doris-flink-connector] 08/32: [FlinkConnector] Make flink datastream source parameterized (#6473)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 0fa122007be1aacd0e9af0bc505761d8e8473555
Author: wunan1210 
AuthorDate: Sun Aug 22 22:03:32 2021 +0800

[FlinkConnector] Make flink datastream source parameterized (#6473)

make flink datastream source parameterized as List instead of Object.
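
A minimal usage sketch of the now list-typed source, based on the signatures shown in the diff below; the class and method names here are hypothetical, and `streamOptions` is assumed to be configured elsewhere.

```java
import java.util.List;

import org.apache.doris.flink.cfg.DorisStreamOptions;
import org.apache.doris.flink.datastream.DorisSourceFunction;
import org.apache.doris.flink.deserialization.SimpleListDeserializationSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class DorisListSourceSketch {
    // streamOptions is assumed to be configured elsewhere (fenodes, table identifier, user, password).
    public static void run(StreamExecutionEnvironment env, DorisStreamOptions streamOptions) throws Exception {
        // Each record now arrives as a List<?> of column values instead of a plain Object.
        DataStream<List<?>> source = env.addSource(
                new DorisSourceFunction(streamOptions, new SimpleListDeserializationSchema()));
        source.print();
        env.execute("doris-list-source-sketch");
    }
}
```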
---
 .../doris/flink/datastream/DorisSourceFunction.java  | 16 
 .../deserialization/SimpleListDeserializationSchema.java |  8 +---
 .../apache/doris/flink/datastream/ScalaValueReader.scala |  2 +-
 3 files changed, 14 insertions(+), 12 deletions(-)

diff --git 
a/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java 
b/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java
index 82ab224..08ec5b0 100644
--- a/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java
+++ b/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java
@@ -36,17 +36,17 @@ import java.util.List;
  * DorisSource
  **/
 
-public class DorisSourceFunction extends RichSourceFunction implements ResultTypeQueryable {
+public class DorisSourceFunction extends RichSourceFunction<List<?>> implements ResultTypeQueryable<List<?>> {
 
 private static final Logger logger = 
LoggerFactory.getLogger(DorisSourceFunction.class);
 
-private DorisDeserializationSchema deserializer;
-private DorisOptions options;
-private DorisReadOptions readOptions;
+private final DorisDeserializationSchema<List<?>> deserializer;
+private final DorisOptions options;
+private final DorisReadOptions readOptions;
 private List dorisPartitions;
 private ScalaValueReader scalaValueReader;
 
-public DorisSourceFunction(DorisStreamOptions streamOptions, DorisDeserializationSchema deserializer) {
+public DorisSourceFunction(DorisStreamOptions streamOptions, DorisDeserializationSchema<List<?>> deserializer) {
 this.deserializer = deserializer;
 this.options = streamOptions.getOptions();
 this.readOptions = streamOptions.getReadOptions();
@@ -59,11 +59,11 @@ public class DorisSourceFunction extends 
RichSourceFunction implements Res
 }
 
 @Override
-public void run(SourceContext sourceContext) throws Exception {
+public void run(SourceContext<List<?>> sourceContext) {
 for (PartitionDefinition partitions : dorisPartitions) {
 scalaValueReader = new ScalaValueReader(partitions, options, 
readOptions);
 while (scalaValueReader.hasNext()) {
-Object next = scalaValueReader.next();
+List next = scalaValueReader.next();
 sourceContext.collect(next);
 }
 }
@@ -76,7 +76,7 @@ public class DorisSourceFunction extends 
RichSourceFunction implements Res
 
 
 @Override
-public TypeInformation getProducedType() {
+public TypeInformation<List<?>> getProducedType() {
 return this.deserializer.getProducedType();
 }
 }
diff --git 
a/src/main/java/org/apache/doris/flink/deserialization/SimpleListDeserializationSchema.java
 
b/src/main/java/org/apache/doris/flink/deserialization/SimpleListDeserializationSchema.java
index 7fcf2f6..d9ec6e5 100644
--- 
a/src/main/java/org/apache/doris/flink/deserialization/SimpleListDeserializationSchema.java
+++ 
b/src/main/java/org/apache/doris/flink/deserialization/SimpleListDeserializationSchema.java
@@ -17,15 +17,17 @@
 package org.apache.doris.flink.deserialization;
 
 
+import org.apache.flink.api.common.typeinfo.TypeHint;
 import org.apache.flink.api.common.typeinfo.TypeInformation;
 
 import java.util.List;
 
 
-public class SimpleListDeserializationSchema implements DorisDeserializationSchema {
+public class SimpleListDeserializationSchema implements DorisDeserializationSchema<List<?>> {
 
 @Override
-public TypeInformation getProducedType() {
-return TypeInformation.of(List.class);
+public TypeInformation<List<?>> getProducedType() {
+return TypeInformation.of(new TypeHint<List<?>>() {
+});
 }
 }
diff --git 
a/src/main/scala/org/apache/doris/flink/datastream/ScalaValueReader.scala 
b/src/main/scala/org/apache/doris/flink/datastream/ScalaValueReader.scala
index bdf9487..e69a86f 100644
--- a/src/main/scala/org/apache/doris/flink/datastream/ScalaValueReader.scala
+++ b/src/main/scala/org/apache/doris/flink/datastream/ScalaValueReader.scala
@@ -206,7 +206,7 @@ class ScalaValueReader(partition: PartitionDefinition, 
options: DorisOptions, re
* get next value.
* @return next value
*/
-  def next: AnyRef = {
+  def next: java.util.List[_] = {
 if (!hasNext) {
   logger.error(SHOULD_NOT_HAPPEN_MESSAGE)
   throw new ShouldNeverHappenException


[incubator-doris-flink-connector] 10/32: [Fix] Flink connector support json import and use httpclient to stream load (#6740)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 4ef1b10f8c426883530a6307376ae94c5f6cf3f5
Author: wudi <676366...@qq.com>
AuthorDate: Tue Sep 28 04:37:03 2021 -0500

[Fix] Flink connector support json import and use httpclient to stream load (#6740)

* [Bug]: fix NullPointerException thrown when data is null

* [Bug]: distinguish between null and empty string

* [Feature]: flink-connector supports stream load parameters

* [Fix]: code style

* [Fix]: support json format import and use httpclient to stream load

* [Fix]: remove System out

* [Fix]: upgrade httpclient version

* [Doc]: add json format import doc

Co-authored-by: wudi 
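
A hedged sketch of how the JSON import path added here might be enabled from user code. The "format"/"json" property key and value appear in the diff below; "strip_outer_array" and the builder calls mirror options seen elsewhere in this thread and are assumptions, not part of this commit.

```java
import java.util.Properties;

import org.apache.doris.flink.cfg.DorisExecutionOptions;

public class JsonStreamLoadSketch {
    public static DorisExecutionOptions jsonExecutionOptions() {
        Properties props = new Properties();
        props.setProperty("format", "json");            // FORMAT_KEY / FORMAT_JSON_VALUE in the diff
        props.setProperty("strip_outer_array", "true"); // assumed companion stream load option
        return DorisExecutionOptions.builder()
                .setStreamLoadProp(props)
                .build();
    }
}
```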
---
 pom.xml|  15 ++
 .../flink/table/DorisDynamicOutputFormat.java  |  55 ++--
 .../flink/table/DorisDynamicTableFactory.java  |  20 ++-
 .../doris/flink/table/DorisDynamicTableSink.java   |   9 +-
 .../apache/doris/flink/table/DorisStreamLoad.java  | 156 +
 .../org/apache/doris/flink/DorisSinkExample.java   |   4 +-
 6 files changed, 150 insertions(+), 109 deletions(-)

diff --git a/pom.xml b/pom.xml
index bd43778..a0d10f4 100644
--- a/pom.xml
+++ b/pom.xml
@@ -118,6 +118,21 @@
 org.apache.thrift
 libthrift
 ${libthrift.version}
+
+
+httpclient
+org.apache.httpcomponents
+
+
+httpcore
+org.apache.httpcomponents
+
+
+
+
+org.apache.httpcomponents
+httpclient
+4.5.13
 
 
 org.apache.arrow
diff --git 
a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java 
b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
index 6ee8834..73c68b6 100644
--- a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
+++ b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
@@ -16,6 +16,7 @@
 // under the License.
 package org.apache.doris.flink.table;
 
+import com.fasterxml.jackson.databind.ObjectMapper;
 import org.apache.doris.flink.cfg.DorisExecutionOptions;
 import org.apache.doris.flink.cfg.DorisOptions;
 import org.apache.doris.flink.cfg.DorisReadOptions;
@@ -33,8 +34,10 @@ import org.slf4j.LoggerFactory;
 
 import java.io.IOException;
 import java.util.ArrayList;
-import java.util.Arrays;
+import java.util.HashMap;
 import java.util.List;
+import java.util.Map;
+import java.util.Arrays;
 import java.util.StringJoiner;
 import java.util.concurrent.Executors;
 import java.util.concurrent.ScheduledExecutorService;
@@ -50,35 +53,45 @@ import static 
org.apache.flink.table.data.RowData.createFieldGetter;
 public class DorisDynamicOutputFormat extends RichOutputFormat {
 
 private static final Logger LOG = 
LoggerFactory.getLogger(DorisDynamicOutputFormat.class);
+private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
+
 private static final String FIELD_DELIMITER_KEY = "column_separator";
 private static final String FIELD_DELIMITER_DEFAULT = "\t";
 private static final String LINE_DELIMITER_KEY = "line_delimiter";
 private static final String LINE_DELIMITER_DEFAULT = "\n";
+private static final String FORMAT_KEY = "format";
+private static final String FORMAT_JSON_VALUE = "json";
 private static final String NULL_VALUE = "\\N";
+
 private final String fieldDelimiter;
 private final String lineDelimiter;
-
+private final String[] fieldNames;
+private final boolean jsonFormat;
 private DorisOptions options;
 private DorisReadOptions readOptions;
 private DorisExecutionOptions executionOptions;
 private DorisStreamLoad dorisStreamLoad;
+private final RowData.FieldGetter[] fieldGetters;
 
-
-private final List batch = new ArrayList<>();
+private final List batch = new ArrayList<>();
 private transient volatile boolean closed = false;
 
 private transient ScheduledExecutorService scheduler;
 private transient ScheduledFuture scheduledFuture;
 private transient volatile Exception flushException;
 
-private final RowData.FieldGetter[] fieldGetters;
-
-public DorisDynamicOutputFormat(DorisOptions option, DorisReadOptions 
readOptions, DorisExecutionOptions executionOptions, LogicalType[] 
logicalTypes) {
+public DorisDynamicOutputFormat(DorisOptions option,
+DorisReadOptions readOptions,
+DorisExecutionOptions executionOptions,
+LogicalType[] logicalTypes,
+String[] fieldNames) {
 

[incubator-doris-flink-connector] 11/32: [Dependency] Upgrade thirdparty libs (#6766)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit f6a2e258dcb31e12fd054b10b4bf4116c0004072
Author: Zhengguo Yang 
AuthorDate: Fri Oct 15 13:03:04 2021 +0800

[Dependency] Upgrade thirdparty libs (#6766)

Upgrade the following dependencies:

libevent -> 2.1.12
OpenSSL 1.0.2k -> 1.1.1l
thrift 0.9.3 -> 0.13.0
protobuf 3.5.1 -> 3.14.0
gflags 2.2.0 -> 2.2.2
glog 0.3.3 -> 0.4.0
googletest 1.8.0 -> 1.10.0
snappy 1.1.7 -> 1.1.8
gperftools 2.7 -> 2.9.1
lz4 1.7.5 -> 1.9.3
curl 7.54.1 -> 7.79.0
re2 2017-05-01 -> 2021-02-02
zstd 1.3.7 -> 1.5.0
brotli 1.0.7 -> 1.0.9
flatbuffers 1.10.0 -> 2.0.0
apache-arrow 0.15.1 -> 5.0.0
CRoaring 0.2.60 -> 0.3.4
orc 1.5.8 -> 1.6.6
libdivide 4.0.0 -> 5.0
brpc 0.97 -> 1.0.0-rc02
librdkafka 1.7.0 -> 1.8.0

after this pr compile doris should use build-env:1.4.0
---
 pom.xml| 11 +++--
 .../apache/doris/flink/backend/BackendClient.java  | 28 +++---
 .../doris/flink/datastream/ScalaValueReader.scala  | 20 
 .../doris/flink/serialization/TestRowBatch.java|  6 ++---
 4 files changed, 36 insertions(+), 29 deletions(-)

diff --git a/pom.xml b/pom.xml
index a0d10f4..ffe6784 100644
--- a/pom.xml
+++ b/pom.xml
@@ -11,8 +11,8 @@
 
 2.12
 1.11.2
-0.9.3
-0.15.1
+0.13.0
+5.0.0
 3.8.1
 3.3.0
 3.2.1
@@ -140,6 +140,12 @@
 ${arrow.version}
 
 
+org.apache.arrow
+arrow-memory-netty
+${arrow.version}
+runtime
+
+
 org.slf4j
 slf4j-api
 1.7.25
@@ -196,6 +202,7 @@
 0.1.11
 
 
${doris.thirdparty}/installed/bin/thrift
+java:fullcamel
 
 
 
diff --git a/src/main/java/org/apache/doris/flink/backend/BackendClient.java 
b/src/main/java/org/apache/doris/flink/backend/BackendClient.java
index 40bb5c9..9b8d955 100644
--- a/src/main/java/org/apache/doris/flink/backend/BackendClient.java
+++ b/src/main/java/org/apache/doris/flink/backend/BackendClient.java
@@ -126,14 +126,14 @@ public class BackendClient {
 for (int attempt = 0; attempt < retries; ++attempt) {
 logger.debug("Attempt {} to openScanner {}.", attempt, routing);
 try {
-TScanOpenResult result = client.open_scanner(openParams);
+TScanOpenResult result = client.openScanner(openParams);
 if (result == null) {
 logger.warn("Open scanner result from {} is null.", 
routing);
 continue;
 }
-if 
(!TStatusCode.OK.equals(result.getStatus().getStatus_code())) {
+if 
(!TStatusCode.OK.equals(result.getStatus().getStatusCode())) {
 logger.warn("The status of open scanner result from {} is 
'{}', error message is: {}.",
-routing, result.getStatus().getStatus_code(), 
result.getStatus().getError_msgs());
+routing, result.getStatus().getStatusCode(), 
result.getStatus().getErrorMsgs());
 continue;
 }
 return result;
@@ -163,14 +163,14 @@ public class BackendClient {
 for (int attempt = 0; attempt < retries; ++attempt) {
 logger.debug("Attempt {} to getNext {}.", attempt, routing);
 try {
-result = client.get_next(nextBatchParams);
+result = client.getNext(nextBatchParams);
 if (result == null) {
 logger.warn("GetNext result from {} is null.", routing);
 continue;
 }
-if 
(!TStatusCode.OK.equals(result.getStatus().getStatus_code())) {
+if 
(!TStatusCode.OK.equals(result.getStatus().getStatusCode())) {
 logger.warn("The status of get next result from {} is 
'{}', error message is: {}.",
-routing, result.getStatus().getStatus_code(), 
result.getStatus().getError_msgs());
+routing, result.getStatus().getStatusCode(), 
result.getStatus().getErrorMsgs());
 continue;
 }
 return result;
@@ -179,11 +179,11 @@ public class BackendClient {
 ex = e;
 }
 }
-if (result != null && (TStatusCode.OK != 
(result.getStatus().getStatus_code( {
-logger.error(ErrorMessages.DORIS_INTERNAL_FAIL_MESSAGE, routing, 
result.getStatus().getStatus_code(),
-result.getStatus().getErro

[incubator-doris-flink-connector] 13/32: [Flink]Simplify the use of flink connector (#6892)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 340ac7e72d11ad03c66581d9580eac0a3e08ea71
Author: xiaokangguo <40515802+xiaokang...@users.noreply.github.com>
AuthorDate: Sat Oct 23 18:10:47 2021 +0800

[Flink]Simplify the use of flink connector  (#6892)

1. Simplify the use of the Flink connector, like other stream sinks, via GenericDorisSinkFunction.
2. Add use cases for the Flink connector.

## Use case
```
env.fromElements("{\"longitude\": \"116.405419\", \"city\": \"北京\", 
\"latitude\": \"39.916927\"}")
 .addSink(
  DorisSink.sink(
 DorisOptions.builder()
   .setFenodes("FE_IP:8030")
   .setTableIdentifier("db.table")
   .setUsername("root")
   .setPassword("").build()
));
```
---
 .../doris/flink/cfg/DorisExecutionOptions.java |  20 +-
 .../apache/doris/flink/cfg/DorisReadOptions.java   |   5 +
 .../java/org/apache/doris/flink/cfg/DorisSink.java |  95 +
 .../doris/flink/cfg/GenericDorisSinkFunction.java  |  53 +
 .../flink/table/DorisDynamicOutputFormat.java  |  77 +---
 .../doris/flink/DorisOutPutFormatExample.java  |  84 
 .../apache/doris/flink/DorisStreamSinkExample.java | 219 +
 7 files changed, 517 insertions(+), 36 deletions(-)

diff --git 
a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
index 3d035ab..587ab07 100644
--- a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
@@ -20,7 +20,6 @@ package org.apache.doris.flink.cfg;
 import org.apache.flink.util.Preconditions;
 
 import java.io.Serializable;
-import java.time.Duration;
 import java.util.Properties;
 
 /**
@@ -29,6 +28,10 @@ import java.util.Properties;
 public class DorisExecutionOptions implements Serializable {
 private static final long serialVersionUID = 1L;
 
+public static final Integer DEFAULT_BATCH_SIZE = 1000;
+public static final Integer DEFAULT_MAX_RETRY_TIMES = 3;
+private static final Long DEFAULT_INTERVAL_MILLIS = 1L;
+
 private final Integer batchSize;
 private final Integer maxRetries;
 private final Long batchIntervalMs;
@@ -66,14 +69,21 @@ public class DorisExecutionOptions implements Serializable {
 return new Builder();
 }
 
+public static DorisExecutionOptions defaults() {
+Properties pro = new Properties();
+pro.setProperty("format", "json");
+pro.setProperty("strip_outer_array", "true");
+return new Builder().setStreamLoadProp(pro).build();
+}
+
 /**
  * Builder of {@link DorisExecutionOptions}.
  */
 public static class Builder {
-private Integer batchSize;
-private Integer maxRetries;
-private Long batchIntervalMs;
-private Properties streamLoadProp;
+private Integer batchSize = DEFAULT_BATCH_SIZE;
+private Integer maxRetries = DEFAULT_MAX_RETRY_TIMES;
+private Long batchIntervalMs = DEFAULT_INTERVAL_MILLIS;
+private Properties streamLoadProp = new Properties();
 
 public Builder setBatchSize(Integer batchSize) {
 this.batchSize = batchSize;
diff --git a/src/main/java/org/apache/doris/flink/cfg/DorisReadOptions.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisReadOptions.java
index 53cefaa..0beb18c 100644
--- a/src/main/java/org/apache/doris/flink/cfg/DorisReadOptions.java
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisReadOptions.java
@@ -103,6 +103,10 @@ public class DorisReadOptions implements Serializable {
 return new Builder();
 }
 
+public static DorisReadOptions defaults(){
+return DorisReadOptions.builder().build();
+}
+
 /**
  * Builder of {@link DorisReadOptions}.
  */
@@ -179,6 +183,7 @@ public class DorisReadOptions implements Serializable {
 public DorisReadOptions build() {
 return new DorisReadOptions(readFields, filterQuery, 
requestTabletSize, requestConnectTimeoutMs, requestReadTimeoutMs, 
requestQueryTimeoutS, requestRetries, requestBatchSize, execMemLimit, 
deserializeQueueSize, deserializeArrowAsync);
 }
+
 }
 
 
diff --git a/src/main/java/org/apache/doris/flink/cfg/DorisSink.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisSink.java
new file mode 100644
index 000..f11c587
--- /dev/null
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisSink.java
@@ -0,0 +1,95 @@
+package org.apache.doris.flink.cfg;
+
+import org.apache.doris.flink.table.DorisDynamicOutputFormat;
+import org.apache.flink.streaming.api.functions.sink.SinkFunction;
+import org.apache.flink.table.types

[incubator-doris-flink-connector] 15/32: [HTTP][API] Add backends info API for spark/flink connector (#6984)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 6f1474e7f7bc57f49320dd4468f6d08aba612919
Author: Mingyu Chen 
AuthorDate: Fri Nov 5 09:43:06 2021 +0800

[HTTP][API] Add backends info API for spark/flink connector (#6984)

Doris should provide an HTTP API that returns the backend list for connectors to use when submitting stream loads,
without privilege checking, so that common users can use it.
---
 .../org/apache/doris/flink/rest/RestService.java   | 64 --
 .../apache/doris/flink/rest/models/Backend.java|  1 +
 .../apache/doris/flink/rest/models/BackendRow.java |  1 +
 .../rest/models/{Backend.java => BackendV2.java}   | 47 +---
 4 files changed, 102 insertions(+), 11 deletions(-)

diff --git a/src/main/java/org/apache/doris/flink/rest/RestService.java 
b/src/main/java/org/apache/doris/flink/rest/RestService.java
index 184afd3..1e6310c 100644
--- a/src/main/java/org/apache/doris/flink/rest/RestService.java
+++ b/src/main/java/org/apache/doris/flink/rest/RestService.java
@@ -32,6 +32,7 @@ import org.apache.doris.flink.exception.DorisException;
 import org.apache.doris.flink.exception.ShouldNeverHappenException;
 import org.apache.doris.flink.rest.models.Backend;
 import org.apache.doris.flink.rest.models.BackendRow;
+import org.apache.doris.flink.rest.models.BackendV2;
 import org.apache.doris.flink.rest.models.QueryPlan;
 import org.apache.doris.flink.rest.models.Schema;
 import org.apache.doris.flink.rest.models.Tablet;
@@ -83,7 +84,9 @@ public class RestService implements Serializable {
 private static final String API_PREFIX = "/api";
 private static final String SCHEMA = "_schema";
 private static final String QUERY_PLAN = "_query_plan";
+@Deprecated
 private static final String BACKENDS = "/rest/v1/system?path=//backends";
+private static final String BACKENDS_V2 = "/api/backends?is_aliva=true";
 private static final String FE_LOGIN = "/rest/v1/login";
 
 /**
@@ -250,25 +253,29 @@ public class RestService implements Serializable {
  */
 @VisibleForTesting
 public static String randomBackend(DorisOptions options, DorisReadOptions 
readOptions, Logger logger) throws DorisException, IOException {
-List backends = getBackends(options, readOptions, logger);
+List backends = getBackendsV2(options, 
readOptions, logger);
 logger.trace("Parse beNodes '{}'.", backends);
 if (backends == null || backends.isEmpty()) {
 logger.error(ILLEGAL_ARGUMENT_MESSAGE, "beNodes", backends);
 throw new IllegalArgumentException("beNodes", 
String.valueOf(backends));
 }
 Collections.shuffle(backends);
-BackendRow backend = backends.get(0);
-return backend.getIP() + ":" + backend.getHttpPort();
+BackendV2.BackendRowV2 backend = backends.get(0);
+return backend.getIp() + ":" + backend.getHttpPort();
 }
 
 /**
- * get  Doris BE nodes to request.
+ * get Doris BE nodes to request.
  *
  * @param options configuration of request
  * @param logger  slf4j logger
  * @return the chosen one Doris BE node
  * @throws IllegalArgumentException BE nodes is illegal
+ *
+ * This method is deprecated. Because it needs ADMIN_PRIV to get backends, 
which is not suitable for common users.
+ * Use getBackendsV2 instead
  */
+@Deprecated
 @VisibleForTesting
 static List getBackends(DorisOptions options, DorisReadOptions 
readOptions, Logger logger) throws DorisException, IOException {
 String feNodes = options.getFenodes();
@@ -281,6 +288,7 @@ public class RestService implements Serializable {
 return backends;
 }
 
+@Deprecated
 static List parseBackend(String response, Logger logger) 
throws DorisException, IOException {
 ObjectMapper mapper = new ObjectMapper();
 Backend backend;
@@ -310,6 +318,54 @@ public class RestService implements Serializable {
 }
 
 /**
+ * get Doris BE nodes to request.
+ *
+ * @param options configuration of request
+ * @param logger  slf4j logger
+ * @return the chosen one Doris BE node
+ * @throws IllegalArgumentException BE nodes is illegal
+ */
+@VisibleForTesting
+static List getBackendsV2(DorisOptions options, 
DorisReadOptions readOptions, Logger logger) throws DorisException, IOException 
{
+String feNodes = options.getFenodes();
+String feNode = randomEndpoint(feNodes, logger);
+String beUrl = "http://"; + feNode + BACKENDS_V2;
+HttpGet httpGet = new HttpGet(beUrl);
+String response = send(options, readOptions, httpGet, logger);
+logger.info("Backend Info:{}", response);
+List backends = parseBackendV2(response, 
logger);
+return backends;
+}
+
+sta

[incubator-doris-flink-connector] 23/32: [improvement](spark-connector)(flink-connector) Modify the max num of batch written by Spark/Flink connector each time. (#7485)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 6cf330973a715aadf353cf27f19635e0546cbc78
Author: jiafeng.zhang 
AuthorDate: Sun Dec 26 11:13:47 2021 +0800

[improvement](spark-connector)(flink-connector) Modify the max num of batch 
written by Spark/Flink connector each time. (#7485)

Increase the default batch size and flush interval
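
Because the defaults change here, a hedged sketch of pinning the batching behavior explicitly in user code; `builder()` and `setBatchSize` appear in earlier commits in this thread, and the concrete number is illustrative only.

```java
import org.apache.doris.flink.cfg.DorisExecutionOptions;

public class ExplicitBatchingSketch {
    public static DorisExecutionOptions explicitBatching() {
        // Pin the batch size instead of relying on DEFAULT_BATCH_SIZE, so a connector
        // upgrade does not silently change flush behavior; 10_000 is illustrative.
        return DorisExecutionOptions.builder()
                .setBatchSize(10_000)
                .build();
    }
}
```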
---
 src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git 
a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
index 587ab07..47bb517 100644
--- a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
@@ -28,8 +28,8 @@ import java.util.Properties;
 public class DorisExecutionOptions implements Serializable {
 private static final long serialVersionUID = 1L;
 
-public static final Integer DEFAULT_BATCH_SIZE = 1000;
-public static final Integer DEFAULT_MAX_RETRY_TIMES = 3;
+public static final Integer DEFAULT_BATCH_SIZE = 1;
+public static final Integer DEFAULT_MAX_RETRY_TIMES = 1;
 private static final Long DEFAULT_INTERVAL_MILLIS = 1L;
 
 private final Integer batchSize;




[incubator-doris-flink-connector] 31/32: [init] init commit

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit cb1a419a1d26f13c7963be5a3a6ee783bbb5fe7a
Author: morningman 
AuthorDate: Fri Feb 11 22:59:52 2022 +0800

[init] init commit

Move flink-doris-connector from incubator-doris@df2c756
---
 build.sh => flink-doris-connector/build.sh| 0
 pom.xml => flink-doris-connector/pom.xml  | 0
 .../src}/main/java/org/apache/doris/flink/backend/BackendClient.java  | 0
 .../src}/main/java/org/apache/doris/flink/cfg/ConfigurationOptions.java   | 0
 .../src}/main/java/org/apache/doris/flink/cfg/DorisConnectionOptions.java | 0
 .../src}/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java  | 0
 .../src}/main/java/org/apache/doris/flink/cfg/DorisOptions.java   | 0
 .../src}/main/java/org/apache/doris/flink/cfg/DorisReadOptions.java   | 0
 .../src}/main/java/org/apache/doris/flink/cfg/DorisSink.java  | 0
 .../src}/main/java/org/apache/doris/flink/cfg/DorisStreamOptions.java | 0
 .../main/java/org/apache/doris/flink/cfg/GenericDorisSinkFunction.java| 0
 .../main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java  | 0
 .../apache/doris/flink/deserialization/DorisDeserializationSchema.java| 0
 .../doris/flink/deserialization/SimpleListDeserializationSchema.java  | 0
 .../java/org/apache/doris/flink/exception/ConnectedFailedException.java   | 0
 .../src}/main/java/org/apache/doris/flink/exception/DorisException.java   | 0
 .../java/org/apache/doris/flink/exception/DorisInternalException.java | 0
 .../java/org/apache/doris/flink/exception/IllegalArgumentException.java   | 0
 .../java/org/apache/doris/flink/exception/ShouldNeverHappenException.java | 0
 .../main/java/org/apache/doris/flink/exception/StreamLoadException.java   | 0
 .../src}/main/java/org/apache/doris/flink/rest/PartitionDefinition.java   | 0
 .../src}/main/java/org/apache/doris/flink/rest/RestService.java   | 0
 .../src}/main/java/org/apache/doris/flink/rest/SchemaUtils.java   | 0
 .../src}/main/java/org/apache/doris/flink/rest/models/Backend.java| 0
 .../src}/main/java/org/apache/doris/flink/rest/models/BackendRow.java | 0
 .../src}/main/java/org/apache/doris/flink/rest/models/BackendV2.java  | 0
 .../src}/main/java/org/apache/doris/flink/rest/models/Field.java  | 0
 .../src}/main/java/org/apache/doris/flink/rest/models/QueryPlan.java  | 0
 .../src}/main/java/org/apache/doris/flink/rest/models/RespContent.java| 0
 .../src}/main/java/org/apache/doris/flink/rest/models/Schema.java | 0
 .../src}/main/java/org/apache/doris/flink/rest/models/Tablet.java | 0
 .../src}/main/java/org/apache/doris/flink/serialization/Routing.java  | 0
 .../src}/main/java/org/apache/doris/flink/serialization/RowBatch.java | 0
 .../main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java  | 0
 .../main/java/org/apache/doris/flink/table/DorisDynamicTableFactory.java  | 0
 .../main/java/org/apache/doris/flink/table/DorisDynamicTableSink.java | 0
 .../main/java/org/apache/doris/flink/table/DorisDynamicTableSource.java   | 0
 .../main/java/org/apache/doris/flink/table/DorisRowDataInputFormat.java   | 0
 .../src}/main/java/org/apache/doris/flink/table/DorisStreamLoad.java  | 0
 .../src}/main/java/org/apache/doris/flink/table/DorisTableInputSplit.java | 0
 .../src}/main/java/org/apache/doris/flink/util/ErrorMessages.java | 0
 .../src}/main/java/org/apache/doris/flink/util/IOUtils.java   | 0
 .../resources/META-INF/services/org.apache.flink.table.factories.Factory  | 0
 {src => flink-doris-connector/src}/main/resources/log4j.properties| 0
 .../main/scala/org/apache/doris/flink/datastream/ScalaValueReader.scala   | 0
 .../src}/main/thrift/doris/DorisExternalService.thrift| 0
 {src => flink-doris-connector/src}/main/thrift/doris/Status.thrift| 0
 {src => flink-doris-connector/src}/main/thrift/doris/Types.thrift | 0
 .../src}/test/java/org/apache/doris/flink/DorisOutPutFormatExample.java   | 0
 .../src}/test/java/org/apache/doris/flink/DorisSinkExample.java   | 0
 .../src}/test/java/org/apache/doris/flink/DorisSourceDataStream.java  | 0
 .../src}/test/java/org/apache/doris/flink/DorisSourceExample.java | 0
 .../src}/test/java/org/apache/doris/flink/DorisSourceSinkExample.java | 0
 .../src}/test/java/org/apache/doris/flink/DorisStreamSinkExample.java | 0
 .../src}/test/java/org/apache/doris/flink/serialization/TestRowBatch.java | 0
 55 files changed, 0 insertions(+), 0 deletions(-)

diff --git a/build.sh b/flink-doris-connector/build.sh
similarity index 100%
rename from build.sh
rename to flink-doris-connector/build.sh
diff --git a/pom.xml b/flink-doris-connector/pom.xml
simil

[incubator-doris-flink-connector] 30/32: [fix](httpv2) make http v2 and v1 interface compatible (#7848)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit ea65872e1bf12e9a68c55bd909ec812c9b9696c8
Author: jiafeng.zhang 
AuthorDate: Mon Jan 31 22:12:34 2022 +0800

[fix](httpv2) make http v2 and v1 interface compatible (#7848)

The HTTP v2 TableSchemaAction now also returns aggregation_type;
the corresponding Flink/Spark connector code is updated accordingly.
---
 src/main/java/org/apache/doris/flink/rest/SchemaUtils.java   |  2 +-
 src/main/java/org/apache/doris/flink/rest/models/Field.java  | 12 +++-
 src/main/java/org/apache/doris/flink/rest/models/Schema.java |  4 ++--
 .../org/apache/doris/flink/serialization/TestRowBatch.java   |  4 ++--
 4 files changed, 16 insertions(+), 6 deletions(-)

diff --git a/src/main/java/org/apache/doris/flink/rest/SchemaUtils.java 
b/src/main/java/org/apache/doris/flink/rest/SchemaUtils.java
index 13bde01..5c64556 100644
--- a/src/main/java/org/apache/doris/flink/rest/SchemaUtils.java
+++ b/src/main/java/org/apache/doris/flink/rest/SchemaUtils.java
@@ -33,7 +33,7 @@ public class SchemaUtils {
  */
 public static Schema convertToSchema(List 
tscanColumnDescs) {
 Schema schema = new Schema(tscanColumnDescs.size());
-tscanColumnDescs.stream().forEach(desc -> schema.put(new 
Field(desc.getName(), desc.getType().name(), "", 0, 0)));
+tscanColumnDescs.stream().forEach(desc -> schema.put(new 
Field(desc.getName(), desc.getType().name(), "", 0, 0, "")));
 return schema;
 }
 }
diff --git a/src/main/java/org/apache/doris/flink/rest/models/Field.java 
b/src/main/java/org/apache/doris/flink/rest/models/Field.java
index 9a58180..04341bf 100644
--- a/src/main/java/org/apache/doris/flink/rest/models/Field.java
+++ b/src/main/java/org/apache/doris/flink/rest/models/Field.java
@@ -25,16 +25,26 @@ public class Field {
 private String comment;
 private int precision;
 private int scale;
+private String aggregation_type;
 
 public Field() {
 }
 
-public Field(String name, String type, String comment, int precision, int 
scale) {
+public Field(String name, String type, String comment, int precision, int 
scale, String aggregation_type) {
 this.name = name;
 this.type = type;
 this.comment = comment;
 this.precision = precision;
 this.scale = scale;
+this.aggregation_type = aggregation_type;
+}
+
+public String getAggregation_type() {
+return aggregation_type;
+}
+
+public void setAggregation_type(String aggregation_type) {
+this.aggregation_type = aggregation_type;
 }
 
 public String getName() {
diff --git a/src/main/java/org/apache/doris/flink/rest/models/Schema.java 
b/src/main/java/org/apache/doris/flink/rest/models/Schema.java
index e274352..264e736 100644
--- a/src/main/java/org/apache/doris/flink/rest/models/Schema.java
+++ b/src/main/java/org/apache/doris/flink/rest/models/Schema.java
@@ -58,8 +58,8 @@ public class Schema {
 this.properties = properties;
 }
 
-public void put(String name, String type, String comment, int scale, int 
precision) {
-properties.add(new Field(name, type, comment, scale, precision));
+public void put(String name, String type, String comment, int scale, int 
precision, String aggregation_type) {
+properties.add(new Field(name, type, comment, scale, precision, 
aggregation_type));
 }
 
 public void put(Field f) {
diff --git 
a/src/test/java/org/apache/doris/flink/serialization/TestRowBatch.java 
b/src/test/java/org/apache/doris/flink/serialization/TestRowBatch.java
index ac19066..0f45aaa 100644
--- a/src/test/java/org/apache/doris/flink/serialization/TestRowBatch.java
+++ b/src/test/java/org/apache/doris/flink/serialization/TestRowBatch.java
@@ -232,8 +232,8 @@ public class TestRowBatch {
 + 
"\"name\":\"k4\",\"comment\":\"\"},{\"type\":\"FLOAT\",\"name\":\"k9\",\"comment\":\"\"},"
 + 
"{\"type\":\"DOUBLE\",\"name\":\"k8\",\"comment\":\"\"},{\"type\":\"DATE\",\"name\":\"k10\","
 + 
"\"comment\":\"\"},{\"type\":\"DATETIME\",\"name\":\"k11\",\"comment\":\"\"},"
-+ "{\"name\":\"k5\",\"scale\":\"9\",\"comment\":\"\","
-+ 
"\"type\":\"DECIMAL\",\"precision\":\"2\"},{\"type\":\"CHAR\",\"name\":\"k6\",\"comment\":\"\"}],"
++ "{\"name\":\"k5\",\"scale\":\"0\",\"comment\":\"\","
++ 
"\"type\":\"DECIMAL\",\"precision\":\"9\",\"aggregation_type\":\"\"},{\"type\":\"CHAR\",\"name\":\"k6\",\"comment\":\"\",\"aggregation_type\":\"REPLACE_IF_NOT_NULL\"}],"
 + "\"status\":200}";
 
 Schema schema = RestService.parseSchema(schemaStr, logger);


[incubator-doris-flink-connector] 18/32: [License] Add License header for missing files (#7130)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 87ccd3e10914b13bd6abf56c28bbeb7dcbc19bab
Author: Mingyu Chen 
AuthorDate: Tue Nov 16 18:37:54 2021 +0800

[License] Add License header for missing files (#7130)

1. Add License header for missing files
2. Modify the spark pom.xml to correct the location of `thrift`
---
 pom.xml  | 20 
 .../java/org/apache/doris/flink/cfg/DorisSink.java   | 17 +
 .../doris/flink/cfg/GenericDorisSinkFunction.java| 19 ++-
 .../apache/doris/flink/DorisStreamSinkExample.java   | 19 ++-
 4 files changed, 73 insertions(+), 2 deletions(-)

diff --git a/pom.xml b/pom.xml
index c5c3905..3cdd0db 100644
--- a/pom.xml
+++ b/pom.xml
@@ -1,4 +1,24 @@
 
+
+
+
 http://maven.apache.org/POM/4.0.0";
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance";
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
diff --git a/src/main/java/org/apache/doris/flink/cfg/DorisSink.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisSink.java
index f11c587..2c3db4c 100644
--- a/src/main/java/org/apache/doris/flink/cfg/DorisSink.java
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisSink.java
@@ -1,3 +1,20 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
 package org.apache.doris.flink.cfg;
 
 import org.apache.doris.flink.table.DorisDynamicOutputFormat;
diff --git 
a/src/main/java/org/apache/doris/flink/cfg/GenericDorisSinkFunction.java 
b/src/main/java/org/apache/doris/flink/cfg/GenericDorisSinkFunction.java
index 92dd300..6be6aa4 100644
--- a/src/main/java/org/apache/doris/flink/cfg/GenericDorisSinkFunction.java
+++ b/src/main/java/org/apache/doris/flink/cfg/GenericDorisSinkFunction.java
@@ -1,3 +1,20 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
 package org.apache.doris.flink.cfg;
 
 import org.apache.doris.flink.table.DorisDynamicOutputFormat;
@@ -50,4 +67,4 @@ public class GenericDorisSinkFunction extends 
RichSinkFunction
 super.close();
 }
 
-}
\ No newline at end of file
+}
diff --git a/src/test/java/org/apache/doris/flink/DorisStreamSinkExample.java 
b/src/test/java/org/apache/doris/flink/DorisStreamSinkExample.java
index d37fd0d..cf35db6 100644
--- a/src/test/java/org/apache/doris/flink/DorisStreamSinkExample.java
+++ b/src/test/java/org/apache/doris/flink/DorisStreamSinkExample.java
@@ -1,3 +1,20 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
 packa

[incubator-doris-flink-connector] branch master created (now d25928d)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a change to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git.


  at d25928d  [init] do some init work

This branch includes the following new commits:

 new d499ac5  [Feature] Flink Doris Connector (#5372) (#5375)
 new 3618f40  [Bug] Modify spark, flink doris connector to send request to 
FE, fix the problem of POST method, it should be the same as the method when 
sending the request (#5788)
 new d07c904  [Log] Fix a mistake in DorisDynamicOutputFormat.java (#5963)
 new 31f359c  [FlinkConnector] Support time interval for flink connector 
(#5934)
 new aa4013d  [Bug][Flink] Fix when data null , flink-connector throw 
NullPointerException (#6165)
 new ba29ce0  [Feature]:Flink-connector supports streamload parameters 
(#6243)
 new cea6391  [Doc] flink/spark connector: add sources/javadoc plugins 
(#6435)
 new 0fa1220  [FlinkConnector] Make flink datastream source parameterized 
(#6473)
 new 001b52c  [Flink] Fix bug of flink doris connector (#6655)
 new 4ef1b10  [Fix] Flink connector support json import and use httpclient to stream load (#6740)
 new f6a2e25  [Dependency] Upgrade thirdparty libs (#6766)
 new c6bb1b4  [Flink][Bug] Fix potential NPE when cancel 
DorisSourceFunction (#6838)
 new 340ac7e  [Flink]Simplify the use of flink connector  (#6892)
 new bbfb03c  support use char like \x01 in flink-doris-sink column & line 
delimiter (#6937)
 new 6f1474e  [HTTP][API] Add backends info API for spark/flink connector 
(#6984)
 new 86d3e56  [Build]Compile and output the jar file, add Spark, Flink 
version and Scala version (#7051)
 new 8b0b9c6  [Feature] Support Flink and Spark connector support String 
type (#7075)
 new 87ccd3e  [License] Add License header for missing files (#7130)
 new 72a6afc  [chore][community](github) Remove travis and add github 
action (#7380)
 new 49cc46e  [improvement](flink-connector) DataSourceFunction read doris 
supports parallel (#7232)
 new 03120e5  [fix](flink-connector) Connector should visit the surviving 
BE nodes (#7435)
 new 6c0627a  [improvement](flink-connector) flush data without multi 
httpclients (#7329) (#7450)
 new 6cf3309  [improvement](spark-connector)(flink-connector) Modify the 
max num of batch written by Spark/Flink connector each time. (#7485)
 new f66a602  [refactor] Standardize the writing of pom files, prepare for 
deployment to maven (#7477)
 new 58a7271  [refactor] update parent pom  version and optimize build 
scripts (#7548)
 new 1e144ad  [chore][docs] add deploy spark/flink connectors to maven 
release repo docs (#7616)
 new 06f58f2  Flink / Spark connector compilation problem (#7725)
 new 021e281  [Feature][flink-connector] support flink  delete option 
(#7457)
 new 73b3d5c  [chore][fix][doc](fe-plugin)(mysqldump) fix build auditlog 
plugin error (#7804)
 new ea65872  [fix](httpv2) make http v2 and v1 interface compatible (#7848)
 new cb1a419  [init] init commit
 new d25928d  [init] do some init work

The 32 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.





[incubator-doris-flink-connector] 32/32: [init] do some init work

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit d25928da7c7c66ba140d49cd478724ec7f57da55
Author: morningman 
AuthorDate: Fri Feb 11 23:07:49 2022 +0800

[init] do some init work
---
 .github/ISSUE_TEMPLATE/bug_report.yml   | 109 +++
 .github/ISSUE_TEMPLATE/config.yml   |  23 
 .github/ISSUE_TEMPLATE/enhancement.yml  |  76 +++
 .github/ISSUE_TEMPLATE/feature-request.yml  |  78 +++
 .github/PULL_REQUEST_TEMPLATE.md|  19 +++
 .github/workflows/approve-label-trigger.yml |  28 
 .github/workflows/approve-label.yml |  67 +
 .github/workflows/build-extension.yml   |  60 +
 .github/workflows/license-eyes.yml  |  35 +
 .gitignore  |   4 +
 .licenserc.yaml |  36 +
 CODE_OF_CONDUCT.md  |  95 +
 CONTRIBUTING.md |  24 
 CONTRIBUTING_CN.md  |  24 
 DISCLAIMER  |  12 ++
 LICENSE-dependencies.txt|  27 
 LICENSE.txt | 202 
 NOTICE.txt  |   5 +
 README.md   |  49 +++
 custom_env.sh.tpl   |   3 +
 env.sh  |  91 +
 flink-doris-connector/build.sh  |  20 +--
 flink-doris-connector/pom.xml   |  20 ++-
 23 files changed, 1087 insertions(+), 20 deletions(-)

diff --git a/.github/ISSUE_TEMPLATE/bug_report.yml 
b/.github/ISSUE_TEMPLATE/bug_report.yml
new file mode 100644
index 000..acabcb1
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/bug_report.yml
@@ -0,0 +1,109 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+name: Doris Bug report
+title: "[Bug] "
+description: Problems and issues with code of Apache Doris
+labels: [ "kind/bug" ]
+body:
+  - type: markdown
+attributes:
+  value: |
+Thank you very much for submitting feedback to Doris to help Doris 
develop better.
+
+If it is an idea or help wanted, please go to:
+
+1. [Dev Mail List](mailto:d...@doris.apache.org): This will be your 
FASTEST way to get help![How to 
subscribe](mailto:dev-subscr...@doris.apache.org)
+2. [Github 
Discussion](https://github.com/apache/incubator-doris/discussions)
+
+  - type: checkboxes
+attributes:
+  label: Search before asking
+  description: >
+Please make sure to search in the 
[issues](https://github.com/apache/incubator-doris/issues?q=is%3Aissue) first 
to see
+whether the same issue was reported already.
+  options:
+- label: >
+I had searched in the 
[issues](https://github.com/apache/incubator-doris/issues?q=is%3Aissue) and 
found no similar
+issues.
+  required: true
+
+  - type: textarea
+attributes:
+  label: Version
+  description: What is the current version 
+  placeholder: >
+Please provide the version you are using.
+If it is the trunk version, please input commit id.
+validations:
+  required: true
+
+  - type: textarea
+attributes:
+  label: What's Wrong?
+  description: Describe the bug.
+  placeholder: >
+Describe the specific problem, the more detailed the better.
+validations:
+  required: true
+
+  - type: textarea
+attributes:
+  label: What You Expected?
+validations:
+  required: true
+
+  - type: textarea
+attributes:
+  label: How to Reproduce?
+  placeholder: >
+Please try to give reproducing steps to facilitate quick location of 
the problem.
+
+- What actions were performed
+- Table building statement
+- Import statement
+- Cluster information: number of nodes, configuration, etc.
+
+If it is hard to reproduce, please also explain the general scene.
+
+  - type: textarea
+attributes:
+  label: Anything Else?
+
+  -

[incubator-doris-flink-connector] 12/32: [Flink][Bug] Fix potential NPE when cancel DorisSourceFunction (#6838)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit c6bb1b44a952c1effbb6dcb6dcd92884b7565b26
Author: Yun Tang 
AuthorDate: Sat Oct 23 16:45:24 2021 +0800

[Flink][Bug] Fix potential NPE when cancel DorisSourceFunction (#6838)

Fix potential NPE of `scalaValueReader` when cancelling DorisSourceFunction.
---
 .../doris/flink/datastream/DorisSourceFunction.java   | 19 +--
 .../doris/flink/datastream/ScalaValueReader.scala |  3 +--
 2 files changed, 14 insertions(+), 8 deletions(-)

diff --git 
a/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java 
b/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java
index 08ec5b0..85f5f6b 100644
--- a/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java
+++ b/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java
@@ -43,8 +43,8 @@ public class DorisSourceFunction extends 
RichSourceFunction> implements
private final DorisDeserializationSchema<List<?>> deserializer;
 private final DorisOptions options;
 private final DorisReadOptions readOptions;
+private transient volatile boolean isRunning;
 private List dorisPartitions;
-private ScalaValueReader scalaValueReader;
 
 public DorisSourceFunction(DorisStreamOptions streamOptions, 
DorisDeserializationSchema<List<?>> deserializer) {
 this.deserializer = deserializer;
@@ -55,25 +55,32 @@ public class DorisSourceFunction extends 
RichSourceFunction> implements
 @Override
 public void open(Configuration parameters) throws Exception {
 super.open(parameters);
+this.isRunning = true;
 this.dorisPartitions = RestService.findPartitions(options, 
readOptions, logger);
 }
 
 @Override
public void run(SourceContext<List<?>> sourceContext) {
 for (PartitionDefinition partitions : dorisPartitions) {
-scalaValueReader = new ScalaValueReader(partitions, options, 
readOptions);
-while (scalaValueReader.hasNext()) {
-List next = scalaValueReader.next();
-sourceContext.collect(next);
+try (ScalaValueReader scalaValueReader = new 
ScalaValueReader(partitions, options, readOptions)) {
+while (isRunning && scalaValueReader.hasNext()) {
+List next = scalaValueReader.next();
+sourceContext.collect(next);
+}
 }
 }
 }
 
 @Override
 public void cancel() {
-scalaValueReader.close();
+isRunning = false;
 }
 
+@Override
+public void close() throws Exception {
+super.close();
+isRunning = false;
+}
 
 @Override
public TypeInformation<List<?>> getProducedType() {
diff --git 
a/src/main/scala/org/apache/doris/flink/datastream/ScalaValueReader.scala 
b/src/main/scala/org/apache/doris/flink/datastream/ScalaValueReader.scala
index 093390d..06df2ef 100644
--- a/src/main/scala/org/apache/doris/flink/datastream/ScalaValueReader.scala
+++ b/src/main/scala/org/apache/doris/flink/datastream/ScalaValueReader.scala
@@ -19,7 +19,6 @@ package org.apache.doris.flink.datastream
 
 import java.util.concurrent._
 import java.util.concurrent.atomic.AtomicBoolean
-
 import org.apache.doris.flink.backend.BackendClient
 import org.apache.doris.flink.cfg.ConfigurationOptions._
 import org.apache.doris.flink.cfg.{DorisOptions, DorisReadOptions}
@@ -41,7 +40,7 @@ import scala.util.control.Breaks
  * @param partition Doris RDD partition
  * @param options request configuration
  */
-class ScalaValueReader(partition: PartitionDefinition, options: DorisOptions, 
readOptions: DorisReadOptions) {
+class ScalaValueReader(partition: PartitionDefinition, options: DorisOptions, 
readOptions: DorisReadOptions) extends AutoCloseable {
   protected val logger = Logger.getLogger(classOf[ScalaValueReader])
 
   protected val client = new BackendClient(new 
Routing(partition.getBeAddress), readOptions)




[incubator-doris-flink-connector] 26/32: [chore][docs] add deploy spark/flink connectors to maven release repo docs (#7616)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 1e144ad6a703695a7c02559b7e465c0119886034
Author: Zhengguo Yang 
AuthorDate: Thu Jan 6 23:23:33 2022 +0800

[chore][docs] add deploy spark/flink connectors to maven release repo docs 
(#7616)
---
 pom.xml | 7 ---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/pom.xml b/pom.xml
index a819733..11f2b20 100644
--- a/pom.xml
+++ b/pom.xml
@@ -41,9 +41,9 @@ under the License.
 
 
 
-
scm:git:g...@github.com:apache/incubator-doris.git
-
scm:git:g...@github.com:apache/incubator-doris.git
-scm:git:g...@github.com:apache/incubator-doris.git
+
scm:git:https://g...@github.com/apache/incubator-doris.git
+
scm:git:https://g...@github.com/apache/incubator-doris.git
+scm:git:https://g...@github.com/apache/incubator-doris.git
 HEAD
 
 
@@ -76,6 +76,7 @@ under the License.
 3.2.1
 UTF-8
 ${env.DORIS_THIRDPARTY}
+github
 
 
 




[incubator-doris-flink-connector] 20/32: [improvement](flink-connector) DataSourceFunction read doris supports parallel (#7232)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 49cc46e2992edb0179b5ff14e47939752939162a
Author: wudi <676366...@qq.com>
AuthorDate: Wed Dec 15 16:21:29 2021 +0800

[improvement](flink-connector) DataSourceFunction read doris supports 
parallel (#7232)

The previous DataSourceFunction inherited from RichSourceFunction.
As a result, no matter how high the Flink parallelism was set, the source always ran with a parallelism of 1.
It is now changed to RichParallelSourceFunction.

When Flink runs with multiple parallel subtasks, the Doris partitions are now assigned across them.
For example, with dorisPartitions.size = 10 and flink.parallelism = 4,
the partitions are split as follows (a usage sketch follows the list):
task0: dorisPartitions[0],[4],[8]
task1: dorisPartitions[1],[5],[9]
task2: dorisPartitions[2],[6]
task3: dorisPartitions[3],[7]
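
A hedged usage sketch of what this enables on the user side, reusing the hypothetical `env` and `streamOptions` (and imports) from the earlier source sketch; `setParallelism` is standard Flink API and the value 4 matches the example above.

```java
// Imports as in the earlier source sketch; env and streamOptions are assumed to exist.
public static void runParallel(StreamExecutionEnvironment env, DorisStreamOptions streamOptions) throws Exception {
    env.addSource(new DorisSourceFunction(streamOptions, new SimpleListDeserializationSchema()))
            .setParallelism(4) // 10 partitions over 4 subtasks -> a 3/3/2/2 split, as in the example above
            .print();
    env.execute("doris-parallel-source-sketch");
}
```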
---
 .../flink/datastream/DorisSourceFunction.java  | 32 +++---
 .../org/apache/doris/flink/rest/RestService.java   |  4 +--
 .../doris/flink/table/DorisDynamicTableSource.java |  2 +-
 .../org/apache/doris/flink/DorisSourceExample.java |  2 +-
 4 files changed, 32 insertions(+), 8 deletions(-)

diff --git 
a/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java 
b/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java
index 85f5f6b..edde953 100644
--- a/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java
+++ b/src/main/java/org/apache/doris/flink/datastream/DorisSourceFunction.java
@@ -20,12 +20,14 @@ import org.apache.doris.flink.cfg.DorisOptions;
 import org.apache.doris.flink.cfg.DorisReadOptions;
 import org.apache.doris.flink.cfg.DorisStreamOptions;
 import org.apache.doris.flink.deserialization.DorisDeserializationSchema;
+import org.apache.doris.flink.exception.DorisException;
 import org.apache.doris.flink.rest.PartitionDefinition;
 import org.apache.doris.flink.rest.RestService;
 import org.apache.flink.api.common.typeinfo.TypeInformation;
 import org.apache.flink.api.java.typeutils.ResultTypeQueryable;
 import org.apache.flink.configuration.Configuration;
-import org.apache.flink.streaming.api.functions.source.RichSourceFunction;
+import org.apache.flink.shaded.guava18.com.google.common.collect.Lists;
+import 
org.apache.flink.streaming.api.functions.source.RichParallelSourceFunction;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
@@ -36,7 +38,7 @@ import java.util.List;
  * DorisSource
  **/
 
-public class DorisSourceFunction extends RichSourceFunction> 
implements ResultTypeQueryable> {
+public class DorisSourceFunction extends RichParallelSourceFunction> 
implements ResultTypeQueryable> {
 
 private static final Logger logger = 
LoggerFactory.getLogger(DorisSourceFunction.class);
 
@@ -45,23 +47,45 @@ public class DorisSourceFunction extends 
RichSourceFunction> implements
 private final DorisReadOptions readOptions;
 private transient volatile boolean isRunning;
 private List dorisPartitions;
+private List taskDorisPartitions = 
Lists.newArrayList();
 
 public DorisSourceFunction(DorisStreamOptions streamOptions, 
DorisDeserializationSchema> deserializer) {
 this.deserializer = deserializer;
 this.options = streamOptions.getOptions();
 this.readOptions = streamOptions.getReadOptions();
+try {
+this.dorisPartitions = RestService.findPartitions(options, 
readOptions, logger);
+logger.info("Doris partitions size {}", dorisPartitions.size());
+} catch (DorisException e) {
+throw new RuntimeException("Failed fetch doris partitions");
+}
 }
 
 @Override
 public void open(Configuration parameters) throws Exception {
 super.open(parameters);
 this.isRunning = true;
-this.dorisPartitions = RestService.findPartitions(options, 
readOptions, logger);
+assignTaskPartitions();
+}
+
+/**
+ * Assign patitions to each task.
+ */
+private void assignTaskPartitions() {
+int taskIndex = getRuntimeContext().getIndexOfThisSubtask();
+int totalTasks = getRuntimeContext().getNumberOfParallelSubtasks();
+
+for (int i = 0; i < dorisPartitions.size(); i++) {
+if (i % totalTasks == taskIndex) {
+taskDorisPartitions.add(dorisPartitions.get(i));
+}
+}
+logger.info("subtask {} process {} partitions ", taskIndex, 
taskDorisPartitions.size());
 }
 
 @Override
 public void run(SourceContext> sourceContext) {
-for (PartitionDefinition partitions : dorisPartitions) {
+for (PartitionDefinition partitions : taskDorisPartitions) {
 try (ScalaValueReader scalaValueReader = new 
ScalaValueReader(partitions, options, readOptions)) {
 while (isR

[incubator-doris-flink-connector] 19/32: [chore][community](github) Remove travis and add github action (#7380)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 72a6afcb5e6a8612f8d5f7eea55291e85f606222
Author: Mingyu Chen 
AuthorDate: Wed Dec 15 13:27:37 2021 +0800

[chore][community](github) Remove travis and add github action (#7380)

1. Remove travis
2. Add github action to build extension:
1. docs
2. fs_broker
3. flink/spark/connector
---
 build.sh | 2 ++
 pom.xml  | 3 +--
 2 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/build.sh b/build.sh
index 3be10a0..e66a654 100644
--- a/build.sh
+++ b/build.sh
@@ -30,6 +30,8 @@ ROOT=`cd "$ROOT"; pwd`
 
 export DORIS_HOME=${ROOT}/../../
 
+. ${DORIS_HOME}/env.sh
+
 # include custom environment variables
 if [[ -f ${DORIS_HOME}/custom_env.sh ]]; then
 . ${DORIS_HOME}/custom_env.sh
diff --git a/pom.xml b/pom.xml
index 3cdd0db..967d51d 100644
--- a/pom.xml
+++ b/pom.xml
@@ -37,8 +37,7 @@ under the License.
 3.3.0
 3.2.1
 UTF-8
-${basedir}/../../
-${basedir}/../../thirdparty
+${env.DORIS_THIRDPARTY}
 
 
 

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 02/32: [Bug] Modify spark, flink doris connector to send request to FE, fix the problem of POST method, it should be the same as the method when sending the request (

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 3618f40e6346e44972b71991e72d2e55f89cc331
Author: jiafeng.zhang 
AuthorDate: Wed May 19 09:28:21 2021 +0800

[Bug] Modify spark, flink doris connector to send request to FE, fix the 
problem of POST method, it should be the same as the method when sending the 
request (#5788)

Modify the Spark and Flink Doris connectors' requests to FE: fix the POST method
handling so that the connection uses the same HTTP method as the original request.
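
The core of the fix, sketched in isolation (a hedged illustration reusing the HttpClient type imported in the diff; the helper name is made up):

```java
// Illustrative only: open the FE connection with the same HTTP method as the
// original request (GET, POST, ...) instead of hardcoding "POST".
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

import org.apache.http.client.methods.HttpRequestBase;

public final class MethodForwardingSketch {
    static HttpURLConnection open(HttpRequestBase request) throws IOException {
        URL url = new URL(request.getURI().toString());
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setInstanceFollowRedirects(false);
        conn.setRequestMethod(request.getMethod());   // forward the request's own method
        return conn;
    }
}
```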
---
 .../org/apache/doris/flink/rest/RestService.java   | 25 +++---
 1 file changed, 12 insertions(+), 13 deletions(-)

diff --git a/src/main/java/org/apache/doris/flink/rest/RestService.java 
b/src/main/java/org/apache/doris/flink/rest/RestService.java
index 469f1aa..cd5b6d5 100644
--- a/src/main/java/org/apache/doris/flink/rest/RestService.java
+++ b/src/main/java/org/apache/doris/flink/rest/RestService.java
@@ -20,6 +20,7 @@ package org.apache.doris.flink.rest;
 import com.fasterxml.jackson.core.JsonParseException;
 import com.fasterxml.jackson.databind.JsonMappingException;
 import com.fasterxml.jackson.databind.ObjectMapper;
+
 import org.apache.commons.io.IOUtils;
 import org.apache.doris.flink.cfg.DorisOptions;
 import org.apache.doris.flink.cfg.DorisReadOptions;
@@ -42,7 +43,6 @@ import org.apache.http.client.methods.HttpPost;
 import org.apache.http.client.methods.HttpRequestBase;
 import org.apache.http.entity.StringEntity;
 
-
 import org.slf4j.Logger;
 
 import java.io.BufferedReader;
@@ -65,8 +65,6 @@ import java.util.Map;
 import java.util.Set;
 import java.util.stream.Collectors;
 
-
-
 import static 
org.apache.doris.flink.cfg.ConfigurationOptions.DORIS_TABLET_SIZE;
 import static 
org.apache.doris.flink.cfg.ConfigurationOptions.DORIS_TABLET_SIZE_DEFAULT;
 import static 
org.apache.doris.flink.cfg.ConfigurationOptions.DORIS_TABLET_SIZE_MIN;
@@ -110,10 +108,7 @@ public class RestService implements Serializable {
 .build();
 
 request.setConfig(requestConfig);
-
-
 logger.info("Send request to Doris FE '{}' with user '{}'.", 
request.getURI(), options.getUsername());
-
 IOException ex = null;
 int statusCode = -1;
 
@@ -121,17 +116,22 @@ public class RestService implements Serializable {
 logger.debug("Attempt {} to request {}.", attempt, 
request.getURI());
 try {
 String response;
-if(request instanceof HttpGet){
+if (request instanceof HttpGet){
 response = getConnectionGet(request.getURI().toString(), 
options.getUsername(), options.getPassword(),logger);
-}else{
-response = getConnection(request,  options.getUsername(), 
options.getPassword(),logger);
+} else {
+response = getConnectionPost(request,  
options.getUsername(), options.getPassword(),logger);
+}
+if (response == null) {
+logger.warn("Failed to get response from Doris FE {}, http 
code is {}",
+request.getURI(), statusCode);
+continue;
 }
 logger.trace("Success get response from Doris FE: {}, response 
is: {}.",
 request.getURI(), response);
 //Handle the problem of inconsistent data format returned by 
http v1 and v2
 ObjectMapper mapper = new ObjectMapper();
 Map map = mapper.readValue(response, Map.class);
-if(map.containsKey("code") && map.containsKey("msg")) {
+if (map.containsKey("code") && map.containsKey("msg")) {
 Object data = map.get("data");
 return mapper.writeValueAsString(data);
 } else {
@@ -147,14 +147,13 @@ public class RestService implements Serializable {
 throw new ConnectedFailedException(request.getURI().toString(), 
statusCode, ex);
 }
 
-private static String getConnection(HttpRequestBase request,String user, 
String passwd,Logger logger) throws IOException {
+private static String getConnectionPost(HttpRequestBase request,String 
user, String passwd,Logger logger) throws IOException {
 URL url = new URL(request.getURI().toString());
 HttpURLConnection conn = (HttpURLConnection) url.openConnection();
 conn.setInstanceFollowRedirects(false);
-conn.setRequestMethod("POST");
+conn.setRequestMethod(request.getMethod());
 String authEncoding = 
Base64.getEncoder().encodeToString(String.format("%s:%s", user, 
passwd).getBytes(StandardCharsets.UTF_8));
 conn.setRequestProperty("Authorization", "Basic " + authEncoding);
-
 InputStream content = ((HttpPost)request).getEntity().

[incubator-doris-flink-connector] 25/32: [refactor] update parent pom version and optimize build scripts (#7548)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 58a72719cc210ac0193c2111d3243258dc7c0343
Author: Zhengguo Yang 
AuthorDate: Wed Jan 5 10:45:11 2022 +0800

[refactor] update parent pom  version and optimize build scripts (#7548)
---
 build.sh | 38 +++---
 pom.xml  | 38 +++---
 2 files changed, 62 insertions(+), 14 deletions(-)

diff --git a/build.sh b/build.sh
index e66a654..d363dae 100644
--- a/build.sh
+++ b/build.sh
@@ -25,23 +25,49 @@
 
 set -eo pipefail
 
-ROOT=`dirname "$0"`
-ROOT=`cd "$ROOT"; pwd`
+usage() {
+  echo "
+  Usage:
+$0 flink_version scala_version
+  e.g.:
+$0 1.11.6 2.12
+$0 1.12.7 2.12
+$0 1.13.5 2.12
+  "
+  exit 1
+}
+
+if [ $# -ne 2 ]; then
+usage
+fi
+
+ROOT=$(dirname "$0")
+ROOT=$(
+cd "$ROOT"
+pwd
+)
 
 export DORIS_HOME=${ROOT}/../../
 
-. ${DORIS_HOME}/env.sh
+. "${DORIS_HOME}"/env.sh
 
 # include custom environment variables
 if [[ -f ${DORIS_HOME}/custom_env.sh ]]; then
-. ${DORIS_HOME}/custom_env.sh
+. "${DORIS_HOME}"/custom_env.sh
 fi
 
 # check maven
 MVN_CMD=mvn
-if [[ ! -z ${CUSTOM_MVN} ]]; then
+if [[ -n ${CUSTOM_MVN} ]]; then
 MVN_CMD=${CUSTOM_MVN}
 fi
+
+if [ -z "$1" ]; then
+export FLINK_VERSION="$1"
+fi
+if [ -z "$2" ]; then
+export SCALA_VERSION="$2"
+fi
 if ! ${MVN_CMD} --version; then
 echo "Error: mvn is not found"
 exit 1
@@ -50,7 +76,6 @@ export MVN_CMD
 rm -rf output/
 ${MVN_CMD} clean package
 
-
 mkdir -p output/
 cp target/doris-flink-*.jar ./output/
 
@@ -59,4 +84,3 @@ echo "Successfully build Flink-Doris-Connector"
 echo "*"
 
 exit 0
-
diff --git a/pom.xml b/pom.xml
index 3baae4b..a819733 100644
--- a/pom.xml
+++ b/pom.xml
@@ -19,18 +19,18 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-http://maven.apache.org/POM/4.0.0";
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance";
+http://www.w3.org/2001/XMLSchema-instance";
+ xmlns="http://maven.apache.org/POM/4.0.0";
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
 4.0.0
 
 org.apache
 apache
-18
+23
 
 org.apache.doris
 doris-flink-connector
-flink-${flink.version}-${scala.version}-SNAPSHOT
+${flink.version}-${scala.version}-1.0.0-SNAPSHOT
 Doris Flink Connector
 https://doris.apache.org/
 
@@ -67,8 +67,8 @@ under the License.
 
 
 
-2.12
-1.11.2
+${env.SCALA_VERSION}
+${env.FLINK_VERSION}
 0.13.0
 5.0.0
 3.8.1
@@ -113,6 +113,30 @@ under the License.
 
 
 
+
+fink-version
+
+true
+
+!env.FLINK_VERSION
+
+
+
+1.11.6
+
+
+
+scala-version
+
+true
+
+!env.SCALA_VERSION
+
+
+
+2.12
+
+
 
 
 
@@ -391,9 +415,9 @@ under the License.
 maven-javadoc-plugin
 ${maven-javadoc-plugin.version}
 
+true
 8
 false
-true
 
 
 

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 24/32: [refactor] Standardize the writing of pom files, prepare for deployment to maven (#7477)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit f66a602c6044f6c299ad46a5a9841287fd426339
Author: Zhengguo Yang 
AuthorDate: Thu Dec 30 10:16:37 2021 +0800

[refactor] Standardize the writing of pom files, prepare for deployment to 
maven (#7477)
---
 pom.xml | 49 +
 1 file changed, 45 insertions(+), 4 deletions(-)

diff --git a/pom.xml b/pom.xml
index 967d51d..3baae4b 100644
--- a/pom.xml
+++ b/pom.xml
@@ -23,11 +23,49 @@ under the License.
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance";
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
 4.0.0
+
+org.apache
+apache
+18
+
+org.apache.doris
+doris-flink-connector
+flink-${flink.version}-${scala.version}-SNAPSHOT
+Doris Flink Connector
+https://doris.apache.org/
+
+
+Apache 2.0 License
+https://www.apache.org/licenses/LICENSE-2.0.html
+repo
+
+
+
+
scm:git:g...@github.com:apache/incubator-doris.git
+
scm:git:g...@github.com:apache/incubator-doris.git
+scm:git:g...@github.com:apache/incubator-doris.git
+HEAD
+
+
+GitHub
+https://github.com/apache/incubator-doris/issues
+
 
-org.apache
-doris-flink
-1.0.0-flink-${flink.version}_${scala.version}
+
+
+Dev Mailing List
+d...@doris.apache.org
+dev-subscr...@doris.apache.org
+dev-unsubscr...@doris.apache.org
+
 
+
+Commits Mailing List
+commits@doris.apache.org
+commits-subscr...@doris.apache.org
+commits-unsubscr...@doris.apache.org
+
+
 
 2.12
 1.11.2
@@ -306,13 +344,15 @@ under the License.
 
 
 
+
 
 org.apache.maven.plugins
 maven-compiler-plugin

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 09/32: [Flink] Fix bug of flink doris connector (#6655)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 001b52c082944da2870c83bc61fa8c45c325cf0c
Author: xhmz 
AuthorDate: Fri Sep 24 21:38:35 2021 +0800

[Flink] Fix bug of flink doris connector (#6655)

Flink-Doris-Connector does not support Flink 1.13. Refactor the Doris sink format
so it no longer casts rows to GenericRowData and uses RowData::FieldGetter instead.
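
A minimal sketch of the RowData.FieldGetter pattern the sink switches to (a hedged illustration using Flink's table runtime types; the two-column schema is made up):

```java
// Illustrative only: build one FieldGetter per column from the logical types,
// then read fields null-safely without casting every row to GenericRowData.
import org.apache.flink.table.data.GenericRowData;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.data.StringData;
import org.apache.flink.table.types.logical.IntType;
import org.apache.flink.table.types.logical.LogicalType;
import org.apache.flink.table.types.logical.VarCharType;

public class FieldGetterSketch {
    public static void main(String[] args) {
        LogicalType[] types = {new IntType(), new VarCharType(VarCharType.MAX_LENGTH)};
        RowData.FieldGetter[] getters = new RowData.FieldGetter[types.length];
        for (int i = 0; i < types.length; i++) {
            getters[i] = RowData.createFieldGetter(types[i], i);
        }

        RowData row = GenericRowData.of(1, StringData.fromString("doris"));
        for (RowData.FieldGetter getter : getters) {
            Object field = getter.getFieldOrNull(row);   // returns null for NULL fields
            System.out.println(field);
        }
    }
}
```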
---
 .../flink/table/DorisDynamicOutputFormat.java  |  45 --
 .../flink/table/DorisDynamicTableFactory.java  | 165 ++---
 .../doris/flink/table/DorisDynamicTableSink.java   |  28 ++--
 3 files changed, 127 insertions(+), 111 deletions(-)

diff --git 
a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java 
b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
index 77b53ba..6ee8834 100644
--- a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
+++ b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
@@ -25,22 +25,24 @@ import org.apache.doris.flink.rest.RestService;
 import org.apache.flink.api.common.io.RichOutputFormat;
 import org.apache.flink.configuration.Configuration;
 import org.apache.flink.runtime.util.ExecutorThreadFactory;
-import org.apache.flink.table.data.GenericRowData;
 import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.LogicalType;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
 import java.io.IOException;
-import java.sql.SQLException;
 import java.util.ArrayList;
+import java.util.Arrays;
 import java.util.List;
-import java.util.Properties;
 import java.util.StringJoiner;
 import java.util.concurrent.Executors;
 import java.util.concurrent.ScheduledExecutorService;
 import java.util.concurrent.ScheduledFuture;
 import java.util.concurrent.TimeUnit;
 
+import static org.apache.flink.table.data.RowData.createFieldGetter;
+
 
 /**
  * DorisDynamicOutputFormat
@@ -69,12 +71,18 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 private transient ScheduledFuture scheduledFuture;
 private transient volatile Exception flushException;
 
-public DorisDynamicOutputFormat(DorisOptions option, DorisReadOptions 
readOptions, DorisExecutionOptions executionOptions) {
+private final RowData.FieldGetter[] fieldGetters;
+
+public DorisDynamicOutputFormat(DorisOptions option, DorisReadOptions 
readOptions, DorisExecutionOptions executionOptions, LogicalType[] 
logicalTypes) {
 this.options = option;
 this.readOptions = readOptions;
 this.executionOptions = executionOptions;
 this.fieldDelimiter = 
executionOptions.getStreamLoadProp().getProperty(FIELD_DELIMITER_KEY, 
FIELD_DELIMITER_DEFAULT);
 this.lineDelimiter = 
executionOptions.getStreamLoadProp().getProperty(LINE_DELIMITER_KEY, 
LINE_DELIMITER_DEFAULT);
+this.fieldGetters = new RowData.FieldGetter[logicalTypes.length];
+for (int i = 0; i < logicalTypes.length; i++) {
+fieldGetters[i] = createFieldGetter(logicalTypes[i], i);
+}
 }
 
 @Override
@@ -84,12 +92,12 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 @Override
 public void open(int taskNumber, int numTasks) throws IOException {
 dorisStreamLoad = new DorisStreamLoad(
-getBackend(),
-options.getTableIdentifier().split("\\.")[0],
-options.getTableIdentifier().split("\\.")[1],
-options.getUsername(),
-options.getPassword(),
-executionOptions.getStreamLoadProp());
+getBackend(),
+options.getTableIdentifier().split("\\.")[0],
+options.getTableIdentifier().split("\\.")[1],
+options.getUsername(),
+options.getPassword(),
+executionOptions.getStreamLoadProp());
 LOG.info("Streamload BE:{}", dorisStreamLoad.getLoadUrlStr());
 
 if (executionOptions.getBatchIntervalMs() != 0 && 
executionOptions.getBatchSize() != 1) {
@@ -126,9 +134,8 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 
 private void addBatch(RowData row) {
 StringJoiner value = new StringJoiner(this.fieldDelimiter);
-GenericRowData rowData = (GenericRowData) row;
-for (int i = 0; i < row.getArity(); ++i) {
-Object field = rowData.getField(i);
+for (int i = 0; i < row.getArity() && i < fieldGetters.length; ++i) {
+Object field = fieldGetters[i].getFieldOrNull(row);
 if (field != null) {
 value.add(field.toString());
 } else {
@@ -213,6 +220,7 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat {
 private DorisOptions.Builder optionsBuilder;
 priv

[incubator-doris-flink-connector] 04/32: [FlinkConnector] Support time interval for flink connector (#5934)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 31f359c7971f0794b4ac7671164c11bc6231d8d5
Author: wudi <676366...@qq.com>
AuthorDate: Wed Jun 30 09:27:12 2021 +0800

[FlinkConnector] Support time interval for flink connector (#5934)
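
A hedged usage sketch of the new batch-interval option, based only on the builder methods visible in the diff below; the concrete values are illustrative, not defaults.

```java
// Illustrative only: configure the sink to flush on a row-count threshold and
// also on a 5-second timer, retrying failed stream loads up to 3 times.
import org.apache.doris.flink.cfg.DorisExecutionOptions;

public class ExecutionOptionsSketch {
    public static DorisExecutionOptions example() {
        return DorisExecutionOptions.builder()
                .setBatchSize(1000)         // flush after this many buffered rows
                .setMaxRetries(3)           // retry failed stream loads
                .setBatchIntervalMs(5000L)  // also flush every 5 seconds
                .build();
    }
}
```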
---
 .../doris/flink/cfg/DorisExecutionOptions.java | 20 ++-
 .../flink/table/DorisDynamicOutputFormat.java  | 64 ++
 .../flink/table/DorisDynamicTableFactory.java  | 10 
 .../apache/doris/flink/table/DorisStreamLoad.java  |  2 +-
 4 files changed, 82 insertions(+), 14 deletions(-)

diff --git 
a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
index ee8b09e..330cbc9 100644
--- a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
@@ -17,7 +17,10 @@
 package org.apache.doris.flink.cfg;
 
 
+import org.apache.flink.util.Preconditions;
+
 import java.io.Serializable;
+import java.time.Duration;
 
 /**
  * JDBC sink batch options.
@@ -27,10 +30,13 @@ public class DorisExecutionOptions  implements Serializable 
{
 
 private final Integer batchSize;
 private final Integer maxRetries;
+private final Long batchIntervalMs;
 
-public DorisExecutionOptions(Integer batchSize, Integer maxRetries) {
+public DorisExecutionOptions(Integer batchSize, Integer maxRetries,Long 
batchIntervalMs) {
+Preconditions.checkArgument(maxRetries >= 0);
 this.batchSize = batchSize;
 this.maxRetries = maxRetries;
+this.batchIntervalMs = batchIntervalMs;
 }
 
 public Integer getBatchSize() {
@@ -41,6 +47,10 @@ public class DorisExecutionOptions  implements Serializable {
 return maxRetries;
 }
 
+public Long getBatchIntervalMs() {
+return batchIntervalMs;
+}
+
 public static Builder builder() {
 return new Builder();
 }
@@ -51,6 +61,7 @@ public class DorisExecutionOptions  implements Serializable {
 public static class Builder {
 private Integer batchSize;
 private Integer maxRetries;
+private Long batchIntervalMs;
 
 public Builder setBatchSize(Integer batchSize) {
 this.batchSize = batchSize;
@@ -62,8 +73,13 @@ public class DorisExecutionOptions  implements Serializable {
 return this;
 }
 
+public Builder setBatchIntervalMs(Long batchIntervalMs) {
+this.batchIntervalMs = batchIntervalMs;
+return this;
+}
+
 public DorisExecutionOptions build() {
-return new DorisExecutionOptions(batchSize,maxRetries);
+return new 
DorisExecutionOptions(batchSize,maxRetries,batchIntervalMs);
 }
 }
 
diff --git 
a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java 
b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
index 44880b5..4b2f5fe 100644
--- a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
+++ b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
@@ -24,15 +24,21 @@ import org.apache.doris.flink.exception.StreamLoadException;
 import org.apache.doris.flink.rest.RestService;
 import org.apache.flink.api.common.io.RichOutputFormat;
 import org.apache.flink.configuration.Configuration;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
 import org.apache.flink.table.data.GenericRowData;
 import org.apache.flink.table.data.RowData;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
 import java.io.IOException;
+import java.sql.SQLException;
 import java.util.ArrayList;
 import java.util.List;
 import java.util.StringJoiner;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
 
 
 /**
@@ -51,6 +57,10 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat  {
 private final List batch = new ArrayList<>();
 private transient volatile boolean closed = false;
 
+private transient ScheduledExecutorService scheduler;
+private transient ScheduledFuture scheduledFuture;
+private transient volatile Exception flushException;
+
 public DorisDynamicOutputFormat(DorisOptions option,DorisReadOptions 
readOptions,DorisExecutionOptions executionOptions) {
 this.options = option;
 this.readOptions = readOptions;
@@ -71,10 +81,33 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat  {
 options.getUsername(),
 options.getPassword());
 LOG.info("Streamload BE:{}",dorisStreamLoad.getLoadUrlStr());
+
+if (executionOptions.getBatchIntervalMs() != 0 && 
executionOptions.ge

[incubator-doris-flink-connector] 28/32: [Feature][flink-connector] support flink delete option (#7457)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit 021e281e0e283e7c0c862ceaf878cc6b7e83de5b
Author: wudi <676366...@qq.com>
AuthorDate: Sun Jan 23 20:24:41 2022 +0800

[Feature][flink-connector] support flink  delete option (#7457)

* Flink Connector supports delete option on Unique models
Co-authored-by: wudi 
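
A hedged usage sketch of the new option, based on the builder methods shown in the diff below; the stream-load properties mirror the defaults() added there, everything else is illustrative:

```java
// Illustrative only: enable the delete option added in this commit for a
// Unique Key model table. The properties follow the defaults() in the diff.
import java.util.Properties;

import org.apache.doris.flink.cfg.DorisExecutionOptions;

public class DeleteOptionSketch {
    public static DorisExecutionOptions withDeleteEnabled() {
        Properties props = new Properties();
        props.setProperty("format", "json");
        props.setProperty("strip_outer_array", "true");
        return DorisExecutionOptions.builder()
                .setStreamLoadProp(props)
                .setEnableDelete(true)   // honor the delete option on Unique Key tables
                .build();
    }
}
```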
---
 .../doris/flink/cfg/DorisExecutionOptions.java |  38 +++--
 .../apache/doris/flink/cfg/DorisStreamOptions.java |  22 +--
 .../org/apache/doris/flink/rest/models/Schema.java |   9 ++
 .../flink/table/DorisDynamicOutputFormat.java  | 101 +---
 .../flink/table/DorisDynamicTableFactory.java  | 176 ++---
 .../doris/flink/table/DorisDynamicTableSink.java   |  24 +--
 6 files changed, 218 insertions(+), 152 deletions(-)

diff --git 
a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
index 47bb517..ad1ab07 100644
--- a/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisExecutionOptions.java
@@ -26,8 +26,8 @@ import java.util.Properties;
  * JDBC sink batch options.
  */
 public class DorisExecutionOptions implements Serializable {
-private static final long serialVersionUID = 1L;
 
+private static final long serialVersionUID = 1L;
 public static final Integer DEFAULT_BATCH_SIZE = 1;
 public static final Integer DEFAULT_MAX_RETRY_TIMES = 1;
 private static final Long DEFAULT_INTERVAL_MILLIS = 1L;
@@ -41,12 +41,27 @@ public class DorisExecutionOptions implements Serializable {
  */
 private final Properties streamLoadProp;
 
-public DorisExecutionOptions(Integer batchSize, Integer maxRetries, Long 
batchIntervalMs, Properties streamLoadProp) {
+private final Boolean enableDelete;
+
+
+public DorisExecutionOptions(Integer batchSize, Integer maxRetries, Long 
batchIntervalMs, Properties streamLoadProp, Boolean enableDelete) {
 Preconditions.checkArgument(maxRetries >= 0);
 this.batchSize = batchSize;
 this.maxRetries = maxRetries;
 this.batchIntervalMs = batchIntervalMs;
 this.streamLoadProp = streamLoadProp;
+this.enableDelete = enableDelete;
+}
+
+public static Builder builder() {
+return new Builder();
+}
+
+public static DorisExecutionOptions defaults() {
+Properties pro = new Properties();
+pro.setProperty("format", "json");
+pro.setProperty("strip_outer_array", "true");
+return new Builder().setStreamLoadProp(pro).build();
 }
 
 public Integer getBatchSize() {
@@ -65,15 +80,8 @@ public class DorisExecutionOptions implements Serializable {
 return streamLoadProp;
 }
 
-public static Builder builder() {
-return new Builder();
-}
-
-public static DorisExecutionOptions defaults() {
-Properties pro = new Properties();
-pro.setProperty("format", "json");
-pro.setProperty("strip_outer_array", "true");
-return new Builder().setStreamLoadProp(pro).build();
+public Boolean getEnableDelete() {
+return enableDelete;
 }
 
 /**
@@ -84,6 +92,7 @@ public class DorisExecutionOptions implements Serializable {
 private Integer maxRetries = DEFAULT_MAX_RETRY_TIMES;
 private Long batchIntervalMs = DEFAULT_INTERVAL_MILLIS;
 private Properties streamLoadProp = new Properties();
+private Boolean enableDelete = false;
 
 public Builder setBatchSize(Integer batchSize) {
 this.batchSize = batchSize;
@@ -105,8 +114,13 @@ public class DorisExecutionOptions implements Serializable 
{
 return this;
 }
 
+public Builder setEnableDelete(Boolean enableDelete) {
+this.enableDelete = enableDelete;
+return this;
+}
+
 public DorisExecutionOptions build() {
-return new DorisExecutionOptions(batchSize, maxRetries, 
batchIntervalMs, streamLoadProp);
+return new DorisExecutionOptions(batchSize, maxRetries, 
batchIntervalMs, streamLoadProp, enableDelete);
 }
 }
 
diff --git a/src/main/java/org/apache/doris/flink/cfg/DorisStreamOptions.java 
b/src/main/java/org/apache/doris/flink/cfg/DorisStreamOptions.java
index 016b29a..c5c2c16 100644
--- a/src/main/java/org/apache/doris/flink/cfg/DorisStreamOptions.java
+++ b/src/main/java/org/apache/doris/flink/cfg/DorisStreamOptions.java
@@ -32,8 +32,8 @@ public class DorisStreamOptions implements Serializable {
 private DorisReadOptions readOptions;
 
 public DorisStreamOptions(Properties prop) {
- this.prop = prop;
- init();
+this.prop = prop;
+init();
 }
 
 /**
@@ -47,17 +47,17 @@ public class DorisSt

[incubator-doris-flink-connector] 05/32: [Bug][Flink] Fix when data null , flink-connector throw NullPointerException (#6165)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit aa4013d03ca8ce014c14b2fc9da3d92ae6a769ba
Author: wudi <676366...@qq.com>
AuthorDate: Thu Jul 8 09:55:50 2021 +0800

[Bug][Flink] Fix when data is null, flink-connector throws NullPointerException (#6165)
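
The idea of the fix, sketched in isolation (illustrative code, not the connector's class): null fields are serialized as the literal \N so Doris treats them as NULL.

```java
// Illustrative only: serialize a row for stream load, writing null fields as \N.
import java.util.StringJoiner;

public class NullValueSketch {
    private static final String NULL_VALUE = "\\N";

    static String toRow(Object... fields) {
        StringJoiner value = new StringJoiner("\t");   // field delimiter used by the sink
        for (Object field : fields) {
            value.add(field == null ? NULL_VALUE : field.toString());
        }
        return value.toString();
    }

    public static void main(String[] args) {
        System.out.println(toRow(1, null, "doris"));   // prints: 1	\N	doris
    }
}
```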
---
 .../org/apache/doris/flink/table/DorisDynamicOutputFormat.java| 8 +++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git 
a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java 
b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
index 4b2f5fe..33f5c85 100644
--- a/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
+++ b/src/main/java/org/apache/doris/flink/table/DorisDynamicOutputFormat.java
@@ -54,6 +54,7 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat  {
 private DorisStreamLoad dorisStreamLoad;
 private final String fieldDelimiter = "\t";
 private final String lineDelimiter = "\n";
+private final String NULL_VALUE = "\\N";
 private final List batch = new ArrayList<>();
 private transient volatile boolean closed = false;
 
@@ -118,7 +119,12 @@ public class DorisDynamicOutputFormat extends 
RichOutputFormat  {
 StringJoiner value = new StringJoiner(this.fieldDelimiter);
 GenericRowData rowData = (GenericRowData) row;
 for(int i = 0; i < row.getArity(); ++i) {
-value.add(rowData.getField(i).toString());
+Object field = rowData.getField(i);
+if(field != null){
+value.add(field.toString());
+}else{
+value.add(this.NULL_VALUE);
+}
 }
 batch.add(value.toString());
 }

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] 07/32: [Doc] flink/spark connector: add sources/javadoc plugins (#6435)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git

commit cea6391410eddf33ffa00d5b51ddd331a619b518
Author: wunan1210 
AuthorDate: Mon Aug 16 22:41:24 2021 +0800

[Doc] flink/spark connector: add sources/javadoc plugins (#6435)

Add Maven plugins to spark-doris-connector and flink-doris-connector to generate
javadoc and sources jars, so the artifacts are easier to distribute and debug.
---
 build.sh |  2 ++
 pom.xml  | 39 ++-
 2 files changed, 40 insertions(+), 1 deletion(-)

diff --git a/build.sh b/build.sh
index d3e7ca3..70f4e96 100644
--- a/build.sh
+++ b/build.sh
@@ -51,6 +51,8 @@ ${MVN_CMD} clean package
 
 mkdir -p output/
 cp target/doris-flink-1.0-SNAPSHOT.jar ./output/
+cp target/doris-flink-1.0-SNAPSHOT-javadoc.jar ./output/
+cp target/doris-flink-1.0-SNAPSHOT-sources.jar ./output/
 
 echo "*"
 echo "Successfully build Flink-Doris-Connector"
diff --git a/pom.xml b/pom.xml
index fad5bf9..bd43778 100644
--- a/pom.xml
+++ b/pom.xml
@@ -13,6 +13,9 @@
 1.11.2
 0.9.3
 0.15.1
+3.8.1
+3.3.0
+3.2.1
 UTF-8
 ${basedir}/../../
 ${basedir}/../../thirdparty
@@ -295,12 +298,46 @@
 
 org.apache.maven.plugins
 maven-compiler-plugin
-3.8.1
+${maven-compiler-plugin.version}
 
 8
 8
 
 
+
+org.apache.maven.plugins
+maven-javadoc-plugin
+${maven-javadoc-plugin.version}
+
+8
+false
+true
+
+
+
+attach-javadocs
+
+jar
+
+
+
+
+
+org.apache.maven.plugins
+maven-source-plugin
+${maven-source-plugin.version}
+
+true
+
+
+
+compile
+
+jar
+
+
+
+
 
 
 

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris-flink-connector] branch master updated: [init] add .asf.yaml

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository 
https://gitbox.apache.org/repos/asf/incubator-doris-flink-connector.git


The following commit(s) were added to refs/heads/master by this push:
 new fc0b9bb  [init] add .asf.yaml
fc0b9bb is described below

commit fc0b9bb144f2e6be0a7e57f96c4422a9586cf99f
Author: morningman 
AuthorDate: Fri Feb 11 23:37:33 2022 +0800

[init] add .asf.yaml
---
 .asf.yaml | 39 +++
 1 file changed, 39 insertions(+)

diff --git a/.asf.yaml b/.asf.yaml
new file mode 100644
index 000..071d8b9
--- /dev/null
+++ b/.asf.yaml
@@ -0,0 +1,39 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+github:
+  description: Flink Connector for Apache Doris(incubating)
+  homepage: https://doris.apache.org/
+  labels:
+- data-warehousing
+- mpp
+- olap
+- dbms
+- apache
+- doris
+- flink
+  enabled_merge_buttons:
+squash:  true
+merge:   false
+rebase:  false
+  protected_branches:
+master:
+  required_pull_request_reviews:
+dismiss_stale_reviews: true
+required_approving_review_count: 1
+  notifications:
+pullrequests_status:  commits@doris.apache.org

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] github-actions[bot] commented on pull request #7961: [fix](demo) scala.Function1 used in java about compiling error:apply$mcVJ$sp(long)

2022-02-11 Thread GitBox


github-actions[bot] commented on pull request #7961:
URL: https://github.com/apache/incubator-doris/pull/7961#issuecomment-1036947018






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] morningman commented on a change in pull request #8000: [fix](compatibility) Fix compatibility issue of PRowBatch and some tablet sink bugs

2022-02-11 Thread GitBox


morningman commented on a change in pull request #8000:
URL: https://github.com/apache/incubator-doris/pull/8000#discussion_r805103042



##
File path: be/src/exec/tablet_sink.cpp
##
@@ -571,7 +577,7 @@ void IndexChannel::add_row(Tuple* tuple, int64_t tablet_id) 
{
 // if this node channel is already failed, this add_row will be skipped
 auto st = channel->add_row(tuple, tablet_id);
 if (!st.ok()) {
-mark_as_failed(channel, st.get_error_msg(), tablet_id);
+mark_as_failed(channel.get(), st.get_error_msg(), tablet_id);

Review comment:
   OK, I will change it




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] morningman removed a comment on pull request #8021: [Bug] Fix segmentation fault at unalign address cast to int128

2022-02-11 Thread GitBox


morningman removed a comment on pull request #8021:
URL: https://github.com/apache/incubator-doris/pull/8021#issuecomment-1036948889


   Please add the stack info or error msg in issue #8020, for easy searching.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] morningman commented on pull request #8021: [Bug] Fix segmentation fault at unalign address cast to int128

2022-02-11 Thread GitBox


morningman commented on pull request #8021:
URL: https://github.com/apache/incubator-doris/pull/8021#issuecomment-1036948840






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] github-actions[bot] commented on pull request #8000: [fix](compatibility) Fix compatibility issue of PRowBatch and some tablet sink bugs

2022-02-11 Thread GitBox


github-actions[bot] commented on pull request #8000:
URL: https://github.com/apache/incubator-doris/pull/8000#issuecomment-1036950335






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] caiconghui opened a new issue #8028: [Feature] Support data distributed by rand() to escape data skew

2022-02-11 Thread GitBox


caiconghui opened a new issue #8028:
URL: https://github.com/apache/incubator-doris/issues/8028


   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/incubator-doris/issues?q=is%3Aissue) and 
found no similar issues.
   
   
   ### Description
   
   In some scenarios, users cannot find a suitable hash key to avoid data skew, 
so we need to provide an additional data distribution method to avoid data skew
   
   ### Use case
   
   CREATE TABLE table1
   (
   siteid INT DEFAULT '10',
   citycode SMALLINT,
   username VARCHAR(32) DEFAULT '',
   pv BIGINT SUM DEFAULT '0'
   )
   AGGREGATE KEY(siteid, citycode, username)
   DISTRIBUTED BY rand() BUCKETS 10
   PROPERTIES("replication_num" = "1");
   
   ### Related issues
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] HappenLee merged pull request #8008: [Refactor] remove plugin folder in be since it is useless

2022-02-11 Thread GitBox


HappenLee merged pull request #8008:
URL: https://github.com/apache/incubator-doris/pull/8008


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris] branch master updated: [Refactor] remove plugin folder in be since it is useless and it need fPIC tag to build and we will remove all fPIC tag in the future (#8008)

2022-02-11 Thread lihaopeng
This is an automated email from the ASF dual-hosted git repository.

lihaopeng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-doris.git


The following commit(s) were added to refs/heads/master by this push:
 new 6b9cb49  [Refactor] remove plugin folder in be since it is useless and 
it need fPIC tag to build and we will remove all fPIC tag in the future (#8008)
6b9cb49 is described below

commit 6b9cb49779940f3daf68ceadbdced1d1efa159cb
Author: yiguolei 
AuthorDate: Sat Feb 12 12:28:14 2022 +0800

[Refactor] remove plugin folder in be since it is useless and it need fPIC 
tag to build and we will remove all fPIC tag in the future (#8008)
---
 be/CMakeLists.txt|  22 -
 be/src/common/config.h   |   2 -
 be/src/plugin/CMakeLists.txt |  28 --
 be/src/plugin/plugin.h   |  82 
 be/src/plugin/plugin_loader.cpp  | 197 ---
 be/src/plugin/plugin_loader.h| 108 -
 be/src/plugin/plugin_mgr.cpp | 165 
 be/src/plugin/plugin_mgr.h   |  64 -
 be/src/plugin/plugin_zip.cpp | 132 --
 be/src/plugin/plugin_zip.h   |  43 -
 be/src/runtime/exec_env.h|   4 -
 be/src/runtime/exec_env_init.cpp |   2 -
 12 files changed, 849 deletions(-)

diff --git a/be/CMakeLists.txt b/be/CMakeLists.txt
index 2d50c67..377ddd6 100644
--- a/be/CMakeLists.txt
+++ b/be/CMakeLists.txt
@@ -586,7 +586,6 @@ set(DORIS_LINK_LIBS
 Webserver
 Geo
 Vec
-Plugin
 ${WL_END_GROUP}
 )
 if (${MAKE_TEST} STREQUAL "ON")
@@ -775,7 +774,6 @@ if (BUILD_META_TOOL AND BUILD_META_TOOL STREQUAL "ON")
 endif()
 
 add_subdirectory(${SRC_DIR}/util)
-add_subdirectory(${SRC_DIR}/plugin)
 add_subdirectory(${SRC_DIR}/vec)
 
 # Utility CMake function to make specifying tests and benchmarks less verbose
@@ -797,24 +795,6 @@ FUNCTION(ADD_BE_TEST TEST_NAME)
 ADD_TEST(${TEST_FILE_NAME} "${BUILD_OUTPUT_ROOT_DIRECTORY}/${TEST_NAME}")
 ENDFUNCTION()
 
-FUNCTION(ADD_BE_PLUGIN PLUGIN_NAME)
-set(BUILD_OUTPUT_ROOT_DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}/")
-
-get_filename_component(DIR_NAME ${CMAKE_CURRENT_SOURCE_DIR} NAME)
-get_filename_component(PLUGIN_DIR_NAME ${PLUGIN_NAME} PATH)
-get_filename_component(PLUGIN_FILE_NAME ${PLUGIN_NAME} NAME)
-
-ADD_LIBRARY(${PLUGIN_FILE_NAME} SHARED ${PLUGIN_NAME}.cpp)
-
-TARGET_LINK_LIBRARIES(${PLUGIN_FILE_NAME} ${DORIS_LINK_LIBS})
-SET_TARGET_PROPERTIES(${PLUGIN_FILE_NAME} PROPERTIES COMPILE_FLAGS 
"-fno-access-control")
-
-if (NOT "${PLUGIN_DIR_NAME}" STREQUAL "")
-SET_TARGET_PROPERTIES(${PLUGIN_FILE_NAME} PROPERTIES 
RUNTIME_OUTPUT_DIRECTORY "${BUILD_OUTPUT_ROOT_DIRECTORY}/${PLUGIN_DIR_NAME}")
-endif ()
-
-ENDFUNCTION()
-
 if (${MAKE_TEST} STREQUAL "ON")
 add_subdirectory(${TEST_DIR}/test_util)
 add_subdirectory(${TEST_DIR}/agent)
@@ -835,8 +815,6 @@ if (${MAKE_TEST} STREQUAL "ON")
 add_subdirectory(${TEST_DIR}/vec/function)
 add_subdirectory(${TEST_DIR}/vec/runtime)
 add_subdirectory(${TEST_DIR}/vec/aggregate_functions)
-add_subdirectory(${TEST_DIR}/plugin)
-add_subdirectory(${TEST_DIR}/plugin/example)
 add_subdirectory(${TEST_DIR}/tools)
 endif ()
 
diff --git a/be/src/common/config.h b/be/src/common/config.h
index c75d434..cc34489 100644
--- a/be/src/common/config.h
+++ b/be/src/common/config.h
@@ -544,8 +544,6 @@ CONF_mInt64(max_runnings_transactions_per_txn_map, "100");
 // this is a an enhancement for better performance to manage tablet
 CONF_Int32(tablet_map_shard_size, "1");
 
-CONF_String(plugin_path, "${DORIS_HOME}/plugin");
-
 // txn_map_lock shard size, the value is 2^n, n=0,1,2,3,4
 // this is a an enhancement for better performance to manage txn
 CONF_Int32(txn_map_shard_size, "128");
diff --git a/be/src/plugin/CMakeLists.txt b/be/src/plugin/CMakeLists.txt
deleted file mode 100644
index ecc5dac..000
--- a/be/src/plugin/CMakeLists.txt
+++ /dev/null
@@ -1,28 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# where to put generated libraries
-set(LIBRARY_OUTPUT_PATH "${BUILD_DIR}/src/plugin")
-
-# where to put generated binar

[GitHub] [incubator-doris] yangzhg merged pull request #8022: [refactor] remove some unused IR code

2022-02-11 Thread GitBox


yangzhg merged pull request #8022:
URL: https://github.com/apache/incubator-doris/pull/8022


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris] branch master updated: [fix] (grouping set) fix Unexpected exception: bitIndex < 0: -1 (#7989)

2022-02-11 Thread yangzhg
This is an automated email from the ASF dual-hosted git repository.

yangzhg pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-doris.git


The following commit(s) were added to refs/heads/master by this push:
 new ee26cd2  [fix] (grouping set) fix Unexpected exception: bitIndex < 0: 
-1 (#7989)
ee26cd2 is described below

commit ee26cd2d078c782ac2d7a086d81bbe3c2e3ab1f1
Author: Zhengguo Yang 
AuthorDate: Sat Feb 12 15:18:08 2022 +0800

[fix] (grouping set) fix Unexpected exception: bitIndex < 0: -1 (#7989)
---
 .../org/apache/doris/analysis/GroupingInfo.java| 10 ++---
 .../java/org/apache/doris/analysis/SelectStmt.java |  3 +--
 .../apache/doris/analysis/GroupByClauseTest.java   | 18 +++
 .../org/apache/doris/planner/QueryPlanTest.java| 26 ++
 4 files changed, 43 insertions(+), 14 deletions(-)

diff --git 
a/fe/fe-core/src/main/java/org/apache/doris/analysis/GroupingInfo.java 
b/fe/fe-core/src/main/java/org/apache/doris/analysis/GroupingInfo.java
index e1e7961..41f544d 100644
--- a/fe/fe-core/src/main/java/org/apache/doris/analysis/GroupingInfo.java
+++ b/fe/fe-core/src/main/java/org/apache/doris/analysis/GroupingInfo.java
@@ -33,7 +33,7 @@ import java.util.stream.Collectors;
 
 public class GroupingInfo {
 public static final String COL_GROUPING_ID = "GROUPING_ID";
-
+private GroupByClause groupByClause;
 private VirtualSlotRef groupingIDSlot;
 private TupleDescriptor virtualTuple;
 private Set groupingSlots;
@@ -41,8 +41,9 @@ public class GroupingInfo {
 private GroupByClause.GroupingType groupingType;
 private BitSet bitSetAll;
 
-public GroupingInfo(Analyzer analyzer, GroupByClause.GroupingType 
groupingType) throws AnalysisException {
-this.groupingType = groupingType;
+public GroupingInfo(Analyzer analyzer, GroupByClause groupByClause) throws 
AnalysisException {
+this.groupByClause = groupByClause;
+this.groupingType = groupByClause.getGroupingType();
 groupingSlots = new LinkedHashSet<>();
 virtualTuple = 
analyzer.getDescTbl().createTupleDescriptor("VIRTUAL_TUPLE");
 groupingIDSlot = new VirtualSlotRef(COL_GROUPING_ID, Type.BIGINT, 
virtualTuple, new ArrayList<>());
@@ -190,6 +191,9 @@ public class GroupingInfo {
 if (colIndex != -1 && 
!(ref.getViewStmt().getResultExprs().get(colIndex) instanceof SlotRef)) {
 throw new AnalysisException("grouping functions only 
support column in current version.");
 }
+} else if (!groupByClause.getGroupingExprs().contains(child)) {
+throw new AnalysisException("select list expression not 
produced by aggregation output" +
+" (missing from GROUP BY clause?): " + ((SlotRef) 
child).getColumnName());
 }
 }
 // if is substituted skip
diff --git a/fe/fe-core/src/main/java/org/apache/doris/analysis/SelectStmt.java 
b/fe/fe-core/src/main/java/org/apache/doris/analysis/SelectStmt.java
index 2b1edb2..502a7fd 100644
--- a/fe/fe-core/src/main/java/org/apache/doris/analysis/SelectStmt.java
+++ b/fe/fe-core/src/main/java/org/apache/doris/analysis/SelectStmt.java
@@ -447,7 +447,7 @@ public class SelectStmt extends QueryStmt {
 }
 }
 }
-groupingInfo = new GroupingInfo(analyzer, 
groupByClause.getGroupingType());
+groupingInfo = new GroupingInfo(analyzer, groupByClause);
 groupingInfo.substituteGroupingFn(resultExprs, analyzer);
 } else {
 for (Expr expr : resultExprs) {
@@ -1881,4 +1881,3 @@ public class SelectStmt extends QueryStmt {
 return this.id.equals(((SelectStmt) obj).id);
 }
 }
-
diff --git 
a/fe/fe-core/src/test/java/org/apache/doris/analysis/GroupByClauseTest.java 
b/fe/fe-core/src/test/java/org/apache/doris/analysis/GroupByClauseTest.java
index e105863..67f2dc5 100644
--- a/fe/fe-core/src/test/java/org/apache/doris/analysis/GroupByClauseTest.java
+++ b/fe/fe-core/src/test/java/org/apache/doris/analysis/GroupByClauseTest.java
@@ -83,10 +83,10 @@ public class GroupByClauseTest {
 GroupByClause.GroupingType.GROUPING_SETS);
 GroupingInfo groupingInfo = null;
 try {
-groupingInfo = new GroupingInfo(analyzer, 
GroupByClause.GroupingType.GROUPING_SETS);
 groupByClause.genGroupingExprs();
-groupingInfo.buildRepeat(groupByClause.getGroupingExprs(), 
groupByClause.getGroupingSetList());
 groupByClause.analyze(analyzer);
+groupingInfo = new GroupingInfo(analyzer, groupByClause);
+groupingInfo.buildRepeat(groupByClause.getGroupingExprs(), 
groupByClause.getGroupingSetList());
 } catch (AnalysisException exception) {
 exception.printStackTrace();
 Assert.assertTrue(false);
@@ -124,10 +124,10

[GitHub] [incubator-doris] yangzhg merged pull request #7989: [fix] (grouping set) fix Unexpected exception: bitIndex < 0: -1

2022-02-11 Thread GitBox


yangzhg merged pull request #7989:
URL: https://github.com/apache/incubator-doris/pull/7989


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] yangzhg closed issue #7971: [Bug] (grouping set) Unexpected exception: bitIndex < 0: -1

2022-02-11 Thread GitBox


yangzhg closed issue #7971:
URL: https://github.com/apache/incubator-doris/issues/7971


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] morningman merged pull request #7961: [fix](demo) scala.Function1 used in java about compiling error:apply$mcVJ$sp(long)

2022-02-11 Thread GitBox


morningman merged pull request #7961:
URL: https://github.com/apache/incubator-doris/pull/7961


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[GitHub] [incubator-doris] morningman closed issue #7959: [Bug] In spark demo module, scala.Function1 used in java about compiling error:apply$mcVJ$sp(…

2022-02-11 Thread GitBox


morningman closed issue #7959:
URL: https://github.com/apache/incubator-doris/issues/7959


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org



[incubator-doris] branch master updated: [fix](demo) scala.Function1 used in java about compiling error:apply$mcVJ$sp(long)

2022-02-11 Thread morningman
This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-doris.git


The following commit(s) were added to refs/heads/master by this push:
 new 6934534  [fix](demo) scala.Function1 used in java about compiling 
error:apply$mcVJ$sp(long)
6934534 is described below

commit 6934534155411fdc54d9690d41cdacc1bd0d293e
Author: Hengdong Gong 
AuthorDate: Sat Feb 12 15:59:41 2022 +0800

[fix](demo) scala.Function1 used in java about compiling 
error:apply$mcVJ$sp(long)

Fix MyForeachPartitionFunction.java:xxx:xxx
java: org.apache.doris.demo.spark.demo.hdfs.MyForeachPartitionFunction is 
not abstract and does not override abstract method apply$mcVJ$sp(long) in 
scala.Function1
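
The fix works because scala.runtime.AbstractFunction1 already supplies the specialized apply$mc...$sp bridge methods, so a Java subclass only needs to override apply(). A minimal, hypothetical example (not the demo's class):

```java
// Hypothetical example: a Java function usable where Scala expects a
// Function1[String, Integer]; only apply() has to be overridden.
import java.io.Serializable;

import scala.runtime.AbstractFunction1;

public class StringLengthFunction extends AbstractFunction1<String, Integer> implements Serializable {
    @Override
    public Integer apply(String s) {
        return s.length();
    }
}
```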
---
 .../doris/demo/spark/demo/hdfs/MyForeachPartitionFunction.java  | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git 
a/samples/doris-demo/spark-demo/src/main/java/org/apache/doris/demo/spark/demo/hdfs/MyForeachPartitionFunction.java
 
b/samples/doris-demo/spark-demo/src/main/java/org/apache/doris/demo/spark/demo/hdfs/MyForeachPartitionFunction.java
index dfdfae8..8947a32 100644
--- 
a/samples/doris-demo/spark-demo/src/main/java/org/apache/doris/demo/spark/demo/hdfs/MyForeachPartitionFunction.java
+++ 
b/samples/doris-demo/spark-demo/src/main/java/org/apache/doris/demo/spark/demo/hdfs/MyForeachPartitionFunction.java
@@ -22,7 +22,7 @@ import com.alibaba.fastjson.JSONArray;
 import lombok.extern.slf4j.Slf4j;
 import org.apache.doris.demo.spark.util.DorisStreamLoad;
 import org.apache.doris.demo.spark.vo.TestVo;
-import scala.Function1;
+import scala.runtime.AbstractFunction1;
 import scala.collection.AbstractIterator;
 
 import java.io.Serializable;
@@ -32,7 +32,7 @@ import java.util.Map;
 
 
 @Slf4j
-public class MyForeachPartitionFunction implements Function1, Serializable {
+public class MyForeachPartitionFunction extends AbstractFunction1 implements 
Serializable {
 DorisStreamLoad dorisStreamLoad;
 
 public MyForeachPartitionFunction(Map parameters) {
@@ -68,4 +68,4 @@ public class MyForeachPartitionFunction implements Function1, 
Serializable {
 public Function1 compose(Function1 g) {
 return null;
 }
-}
\ No newline at end of file
+}

-
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org