On Tue, Apr 16, 2019 at 10:07, Lifei Chen wrote:
Hi, all:

I have an `Order` table as follows:

rowtime   item    price
=======   =====   =====
09:00     item1   10
09:01     item2   15
09:03     item1   20

I want to calculate the moving average price over the past 5 minutes and emit
the result for every record. How can I do this using Flink SQL?
R
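One way to get a per-record moving average is a bounded OVER window in Flink SQL. Below is a minimal, self-contained sketch assuming the Flink 1.7 Table API; the sample rows and class name are made up to mirror the table above, the event-time attribute is named `order_time` rather than `rowtime`, and partitioning by `item` is an assumption that can be dropped if a single global average is wanted.

import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.timestamps.AscendingTimestampExtractor;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class MovingAverageSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
        StreamTableEnvironment tEnv = TableEnvironment.getTableEnvironment(env);

        // Made-up sample data: (event timestamp in millis, item, price),
        // mirroring the Order rows in the question (09:00, 09:01, 09:03).
        DataStream<Tuple3<Long, String, Integer>> orders = env
            .fromElements(
                Tuple3.of(0L, "item1", 10),
                Tuple3.of(60_000L, "item2", 15),
                Tuple3.of(180_000L, "item1", 20))
            .assignTimestampsAndWatermarks(
                new AscendingTimestampExtractor<Tuple3<Long, String, Integer>>() {
                    @Override
                    public long extractAscendingTimestamp(Tuple3<Long, String, Integer> e) {
                        return e.f0;
                    }
                });

        // `order_time.rowtime` appends an event-time attribute taken from the stream's timestamps.
        tEnv.registerDataStream("Orders", orders, "ts, item, price, order_time.rowtime");

        // Bounded OVER window: for every incoming row, the average price of the
        // same item over the preceding 5 minutes, including the current row.
        Table result = tEnv.sqlQuery(
            "SELECT order_time, item, price, "
                + "  AVG(CAST(price AS DOUBLE)) OVER ("
                + "    PARTITION BY item ORDER BY order_time "
                + "    RANGE BETWEEN INTERVAL '5' MINUTE PRECEDING AND CURRENT ROW) AS avg_price "
                + "FROM Orders");

        tEnv.toAppendStream(result, Row.class).print();
        env.execute("moving average over the past 5 minutes");
    }
}

The OVER window emits one output row for every input row, which matches the "emit the result for every record" requirement; a grouped HOP or TUMBLE window would instead emit once per window interval.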
There is a Go CLI for automating the deployment and updating of Flink jobs;
you can integrate a Jenkins pipeline with it, maybe it helps.
https://github.com/ing-bank/flink-deployer
Navneeth Krishnan wrote on Tue, Apr 9, 2019 at 10:34 AM:
> Hi All,
>
> We have some streaming jobs in production and today we manually de
> /opt/flink/conf/flink-conf.yaml
> exec /docker-entrypoint.sh "${FLINK_ROLE}"
> fi
>
> exec /docker-entrypoint.sh "${@}"
>
>
> and /opt/flink/conf/flink-conf.template.yaml has the environment variable
> substitution like so:
>
> fs.s3a.endpoint: ${FLINK_S3_ENDPO
Hi guys,

I am using Flink 1.7.2 deployed on Kubernetes, and I want to change the Flink
configuration, for example to customize `taskmanager.heap.size`.

Does Flink support using environment variables to override configurations in
`conf/flink-conf.yaml`?
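As far as I know, Flink 1.7 does not expand environment variables inside `conf/flink-conf.yaml` itself. The usual workaround is what the entrypoint snippet quoted above hints at: keep a template and render the real `flink-conf.yaml` from the container's environment before the JobManager or TaskManager starts. Here is a rough sketch of that substitution step; the file paths and the `${VAR}` placeholder convention are assumptions, and a few lines of sed or envsubst in the entrypoint achieve the same thing.

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Renders /opt/flink/conf/flink-conf.yaml from a template by replacing
// ${SOME_ENV_VAR} placeholders with values from the process environment.
public class RenderFlinkConf {

    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([A-Za-z0-9_]+)\\}");

    public static void main(String[] args) throws Exception {
        String template = new String(
            Files.readAllBytes(Paths.get("/opt/flink/conf/flink-conf.template.yaml")),
            StandardCharsets.UTF_8);

        Map<String, String> env = System.getenv();
        Matcher matcher = PLACEHOLDER.matcher(template);
        StringBuffer rendered = new StringBuffer();
        while (matcher.find()) {
            // Unset variables are replaced with an empty string; adjust as needed.
            String value = env.getOrDefault(matcher.group(1), "");
            matcher.appendReplacement(rendered, Matcher.quoteReplacement(value));
        }
        matcher.appendTail(rendered);

        Files.write(
            Paths.get("/opt/flink/conf/flink-conf.yaml"),
            rendered.toString().getBytes(StandardCharsets.UTF_8));
    }
}

With something like this in place, a template line such as `taskmanager.heap.size: ${TASKMANAGER_HEAP_SIZE}` (the variable name is just an example) can be set per deployment through the pod's environment.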
/release-1.7/flink-libraries/flink-table/src/test/java/org/apache/flink/table/runtime/stream/sql/JavaSqlITCase.java
Best,
Lifei
Lifei Chen wrote on Thu, Feb 28, 2019 at 9:57 AM:
I am using Flink v1.7.1 now and cannot find the library you suggested;
is it deprecated?
Lifei Chen wrote on Thu, Feb 28, 2019 at 9:50 AM:
Thanks, I will try it!
Congxian Qiu wrote on Wed, Feb 27, 2019 at 9:17 PM:
> Hi, Lifei
>
> Maybe org.apache.flink.table.runtime.stream.sql.JavaSqlITCase can be
> helpful.
>
> Best,
> Congxian
>
>
> Lifei Chen wrote on Wed, Feb 27, 2019 at 4:20 PM:
Hi, all:

I finished a Flink streaming job with Flink SQL, which reads data from Kafka
and writes back to Elasticsearch.

I have no idea how to add a unit test for the SQL I wrote. Any suggestions?
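Since JavaSqlITCase was suggested earlier in the thread, here is a minimal sketch in the same spirit, assuming Flink 1.7's Table API and JUnit 4; the table name, fields, and query below are placeholders for whatever the real job uses. The idea is to register a small in-memory table in place of the Kafka source, run the SQL under test, collect the rows with a simple sink instead of Elasticsearch, and assert on them.

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.SinkFunction;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
import org.junit.Assert;
import org.junit.Test;

public class OrderSqlTest {

    // Results collected by the sink; works because the local test job runs in this JVM.
    private static final List<String> RESULTS = new ArrayList<>();

    // Stand-in for the Elasticsearch sink: just remember every emitted row.
    private static final class CollectSink implements SinkFunction<Row> {
        @Override
        public void invoke(Row value) {
            RESULTS.add(value.toString());
        }
    }

    @Test
    public void testQuery() throws Exception {
        RESULTS.clear();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);
        StreamTableEnvironment tEnv = TableEnvironment.getTableEnvironment(env);

        // Stand-in for the Kafka source: a small in-memory stream.
        DataStream<Tuple2<String, Integer>> input =
            env.fromElements(Tuple2.of("item1", 10), Tuple2.of("item2", 15));
        tEnv.registerDataStream("Orders", input, "item, price");

        // The SQL under test (placeholder query).
        Table result = tEnv.sqlQuery("SELECT item, price * 2 AS doubled_price FROM Orders");

        tEnv.toAppendStream(result, Row.class).addSink(new CollectSink());

        env.execute();

        // Row#toString in Flink 1.7 joins the fields with commas.
        Assert.assertEquals(Arrays.asList("item1,20", "item2,30"), RESULTS);
    }
}

Keeping the SQL string in one shared place (a constant or a resource file) lets the production job and the test run exactly the same query.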