Hi All,
I have 100 million rows stored in Doris. I am testing rollup performance as
follows:
The first test went as expected, and performance improved a lot after creating the rollup.
--
# without rollup --- costs 4.73 sec
SELECT key1, key2, SUM(
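(The query above is cut off in the archive. For context, a rollup test of this shape would typically look like the following sketch; the table name `example_tbl` and value column `value1` are assumptions, only `key1`/`key2` appear in the original message.)

```sql
-- Hypothetical names: example_tbl and value1 are placeholders; only
-- key1/key2 come from the original message.
-- Create a rollup that pre-aggregates by (key1, key2):
ALTER TABLE example_tbl ADD ROLLUP r1 (key1, key2, value1);

-- The same aggregate query; Doris can now answer it from the rollup
-- instead of scanning the base table:
SELECT key1, key2, SUM(value1)
FROM example_tbl
GROUP BY key1, key2;
```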
Hi Mingyu,
Got it. Thanks a lot for your information!
Sent from Mail for Windows 10
From: 陈明雨
Sent: September 4, 2019, 8:58
To: dev@doris.apache.org
Cc: tan zhongyi
Subject: Re:Re: Release plan of Doris
Hi Katte:
The Doris project is currently developing the following features:
1. Decouple Storage and Compute
Hi All,
I would like to know the release plan or roadmap for Doris. Can anyone
provide one? Thanks a lot!
Sent from Mail for Windows 10
From: tan zhongyi
Sent: August 31, 2019, 7:48
To: dev@doris.apache.org
Subject: Re: Doris upcoming release plan
Hi, katte,
Since we are in a global open source community,
and there are
Hello! Is there a roadmap or release plan for upcoming Doris versions? Thanks!
Sent from Mail for Windows 10
Hello, when importing data I get the error "too many filtered rows". Is it because there are too many records?
The imported text file has 61 records, and the import reports the following:
root@master:/mywork/data/T4# curl --location-trusted -u fm:Fm123456 -H
"label:tb04_20190828_1" -H "column_separator:," -T T4_1.txt
http://slave1:8040/api/futurmaster/tb04/_stream_load
{
"TxnId": 2002,
"Label
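(The response above is truncated in the archive. If only a small fraction of the 61 rows is malformed, Stream Load can be told to tolerate them. A minimal sketch reusing the command from the thread; `max_filter_ratio` is the standard Stream Load header for this, and the 0.1 value, i.e. tolerating up to 10% filtered rows, is just an illustration. A new label is needed for a retry.)

```shell
# Same load as above, but allow up to 10% of rows to be filtered
# before the job fails. Label changed for the retry attempt.
curl --location-trusted -u fm:Fm123456 \
    -H "label:tb04_20190828_2" \
    -H "column_separator:," \
    -H "max_filter_ratio:0.1" \
    -T T4_1.txt \
    http://slave1:8040/api/futurmaster/tb04/_stream_load
```

The `ErrorURL` field in the JSON response points at the rows that were filtered and the reason, which is usually the fastest way to see why rows are rejected.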
Hello!
Hi, I created a job to read data from Kafka, but the log keeps reporting a read timeout. Please help, thank you!
2019-08-22 14:37:07,351 WARN 30
[RoutineLoadTaskScheduler.submitBatchTasksIfNotEmpty():201] task send error.
backend[10002]
org.apache.thrift.transport.TTransportException:
java.net.SocketTimeoutException: Read timed out
at
org.apache.t
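(The stack trace above is truncated in the archive. The timeout itself indicates the FE could not complete a Thrift call to backend 10002, so checking that that BE is alive and reachable from the FE is the usual first step. For reference, a routine load job of the kind described is defined roughly as below; every name here, database, table, job, broker, and topic, is a hypothetical placeholder, since the original job definition is not in the thread.)

```sql
-- All names below (example_db, example_tbl, kafka_job, broker1,
-- example_topic) are placeholders; the original job is not shown
-- in the thread.
CREATE ROUTINE LOAD example_db.kafka_job ON example_tbl
COLUMNS TERMINATED BY ","
PROPERTIES
(
    "desired_concurrent_number" = "1"
)
FROM KAFKA
(
    "kafka_broker_list" = "broker1:9092",
    "kafka_topic" = "example_topic"
);
```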