Could you share your code?
Best,
Guowei
On Mon, Nov 23, 2020 at 12:05 PM tkg_cangkul wrote:
> Hi,
>
> I'm using Java to do this,
> and I've successfully registered the tables.
>
> I've successfully selected from each table:
>
> Table result1 = tEnv.sqlQuery("select status_code from table_kafka");
Sure. I have created one: https://issues.apache.org/jira/browse/FLINK-20286
On Mon, 23 Nov 2020 at 12:19, eef hhj wrote:
> Hi Jark,
>
> Thank you for your helpful and quick response, I will try the Hive
> connector solution. It would be great if you could point out the link for
> tracking the filesyste
Hi Jark,
Thank you for your helpful and quick response, I will try the Hive
connector solution. It would be great if you could point out the link for
tracking the filesystem streaming data source.
On Mon, Nov 23, 2020 at 10:33 AM Jark Wu wrote:
> Hi Kai,
>
> Streaming filesystem source is not supporte
Hi,
I'm using Java to do this,
and I've successfully registered the tables.
I've successfully selected from each table:
Table result1 = tEnv.sqlQuery("select status_code from table_kafka");
Table result2 = tEnv.sqlQuery("select status_code from table_mysql_reff");
but when I try a join query I get some
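For reference, a join between the two registered tables might be written like the following sketch; since the original message is cut off before the join query, the join key (`status_code` on both sides) and the aliases are assumptions, not taken from the sender's actual code:

```sql
-- Hypothetical join of the Kafka table against the MySQL reference table.
-- Joining on status_code is an assumption; adjust to the real key columns.
SELECT k.status_code, r.status_code AS reff_status_code
FROM table_kafka AS k
JOIN table_mysql_reff AS r
  ON k.status_code = r.status_code
```

In Java this would be passed as one statement, e.g. `tEnv.sqlQuery("...")`, rather than joining the two intermediate `Table` results separately.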
Hi Kai,
A streaming filesystem source is not supported yet in the Table API/SQL.
It is on the roadmap, but there are some problems that need to be fixed first.
As a workaround, you can use the Hive connector to read files continuously from
filesystems [1].
Best,
Jark
[1]:
https://ci.apache.org/projects/flink/f
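For what it's worth, the Hive connector's continuous reading is switched on through table options, which can be supplied per query with a dynamic table hint. A sketch, assuming a registered Hive catalog (the catalog, database, table name, and monitor interval below are placeholders):

```sql
-- Read a Hive table as an unbounded stream by enabling the
-- streaming-source options via a dynamic table hint.
-- hive_catalog.mydb.my_files_table is a hypothetical table name.
SELECT *
FROM hive_catalog.mydb.my_files_table
/*+ OPTIONS(
      'streaming-source.enable' = 'true',
      'streaming-source.monitor-interval' = '1 min'
) */
```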
Hi Kai,
I took a look at the implementation of the filesystem connector. It decides
which files to read at startup, and that set does not change while the job is
running. If you need this functionality, you may need to implement a custom
connector.
Best,
Xingbo
eef hhj wrote on Sat, Nov 21, 2020 at 2:38 PM:
> Hi,
>
> I'm faci
Hi
One way would look like the following:
1. Create the probe table from Kafka as shown below. You can find more
detailed information in the docs [1].
CREATE TABLE myTopic (
  id BIGINT,
  item_id BIGINT,
  category_id BIGINT,
  behavior STRING,
  ts TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'us
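Since the DDL above is cut off inside the WITH clause, here is a hedged sketch of what a complete version might look like; the topic name, bootstrap servers, and format below are illustrative placeholders, not values from the original message:

```sql
CREATE TABLE myTopic (
  id BIGINT,
  item_id BIGINT,
  category_id BIGINT,
  behavior STRING,
  ts TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'user_behavior',                          -- placeholder topic
  'properties.bootstrap.servers' = 'localhost:9092',  -- placeholder brokers
  'format' = 'json'                                   -- placeholder format
)
```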
I think what we need in the native Kubernetes config is the ability to mount
custom ConfigMaps, Secrets, and Volumes.
I see that in the upcoming release, Secrets can be mounted:
https://github.com/apache/flink/pull/14005 <- could the maintainers also look
into this PR so we can mount other custom K8s resourc