This is, in fact, the expected behavior. Let me explain why:
In order for Flink to provide exactly-once guarantees, the input sources
must be able to rewind and then replay any events since the last checkpoint.
In the scenario you shared, the last checkpoint was checkpoint 2, which
occurred before
Hi podunk,
no, this is currently not possible:
> Currently, the CSV schema is derived from table schema. [1]
So the Table schema is used to define how Jackson CSV parses the lines and
hence needs to be complete.
[1]
https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/forma
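For illustration, a sketch of what a "complete" schema means for the CSV format (the table name, columns, and path are hypothetical; the connector options follow the linked docs):

```sql
-- The CSV format derives the Jackson CSV parser layout from the table
-- schema, so every column in the file must be declared, wanted or not.
CREATE TABLE csv_source (
  name  STRING,
  score INT,
  note  STRING  -- unwanted columns still need a declaration
) WITH (
  'connector' = 'filesystem',
  'path' = 'file:///tmp/input.csv',
  'format' = 'csv'
);
```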
Hi Min Tu,
try a clean install to make sure the build starts from scratch. Refresh the
Maven modules in IntelliJ after the build. If that doesn't work, try
invalidating the IntelliJ caches and/or reimporting the project (remove the
.idea folder).
Best,
Alexander Fedulov
On Sun, Jul 10, 2022 at 12:59 AM Hemang
Hi podunk,
please share the exceptions that you find in the log/ folder of your Flink
distribution.
The TaskManager startup issues should be captured in the *-taskexecutor-*
files.
Best,
Alexander Fedulov
On Mon, Jul 11, 2022 at 5:42 PM Xuyang wrote:
> Hi, can you provide the error log so that we can
No. But I do not want Parquet. I need CSV.
Which code should create the file?
EnvironmentSettings settings = EnvironmentSettings
.newInstance()
//.inStreamingMode()
.inBatchMode()
.build();
final TableEnvir
No, that is not what I meant.
I asked: 'Does the Table API connector, CSV, have some option to ignore some columns in the source file?'
Sent: Monday, July 11, 2022 at 5:28 PM
From: "Xuyang"
To: pod...@gmx.com
Cc: user@flink.apache.org
Subject: Re:Re: Does Table API connector, csv, has some option to ignore so
Hi, can you provide the error log so that we can locate the problem?
On 2022-07-11 03:36:00, pod...@gmx.com wrote:
I run Flink on Windows, and in version 1.15.1 the Task Managers fail to start.
It works without problems in 1.14.5.
Sent: Friday, July 08, 2022 at 12:18 AM
From: "David Anderson"
To: "dev"
Hi, if you think this is a good feature, what about putting it into Jira [1] and
starting a discussion on the dev mailing list? Someone familiar with the relevant
modules will discuss it with you. [1] https://issues.apache.org/jira/projects/FLINK/issues
At 2022-07-07 17:43:03, "James Sandys-Lumsdaine" wrote:
>Hello,
>
Hi, did you mean `insert into table1 select col1, col2, col3 ... from table2`?
If this doesn't meet your requirement, what about using a UDF to customize what
you want at runtime?
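The first suggestion can be sketched in Flink SQL (the table and column names follow the reply above and are purely illustrative):

```sql
-- Only the selected columns flow into table1; any other columns of
-- table2 are simply never read by the insert.
INSERT INTO table1
SELECT col1, col2, col3
FROM table2;
```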
--
Best!
Xuyang
On 2022-07-11 16:10:00, pod...@gmx.com wrote:
I want to control what I insert in table not
Hi, did you add the Parquet dependency [1]?
[1] https://mvnrepository.com/artifact/org.apache.flink/flink-sql-parquet
On 2022-07-11 18:33:38, pod...@gmx.com wrote:
This example?:
CREATE TABLE kafka_table (user_id STRING, order_amount DOUBLE, log_ts TIMESTAMP(3), WATERMARK FOR log_ts AS log_ts - INTERVAL '5' SEC
Hemanga, the issue is that the number of keys is unknown at compile
time.
I ended up using yidan's suggestion and serialized all keys into a string.
Thanks for the suggestion.
Thomas
On Sun, Jul 10, 2022 at 7:05 PM yidan zhao wrote:
> You can use string, and serialize all keys to a string.
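The string-serialization idea above can be sketched in plain Java (the field values and the delimiter choice are assumptions; picking a delimiter that cannot occur in the data avoids collisions between distinct key tuples):

```java
import java.util.List;

public class KeySerializer {
    // A delimiter that never appears in the key fields, so distinct
    // key tuples always map to distinct strings.
    private static final String DELIMITER = "\u0001";

    // Join all key fields into a single string key.
    public static String serializeKey(List<String> keyFields) {
        return String.join(DELIMITER, keyFields);
    }

    // Split the string key back into its fields; the -1 limit keeps
    // trailing empty fields.
    public static String[] deserializeKey(String key) {
        return key.split(DELIMITER, -1);
    }

    public static void main(String[] args) {
        String key = serializeKey(List.of("user42", "2022-07-10"));
        String[] parts = deserializeKey(key);
        System.out.println(parts.length);  // 2
        System.out.println(parts[0]);      // user42
    }
}
```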
This example?:
CREATE TABLE kafka_table (
  user_id STRING,
  order_amount DOUBLE,
  log_ts TIMESTAMP(3),
  WATERMARK FOR log_ts AS log_ts - INTERVAL '5' SECOND
) WITH (...);
CREATE TABLE fs_table (
  user_id STRING,
  order_amount DOUBLE,
  dt STRING,
  `hour` STRING
) PARTITIONED BY (dt, `h
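The DDL above is cut off in the archive; in the docs' full example it is paired with an INSERT that derives the partition columns from the event timestamp, roughly:

```sql
-- Sketch following the filesystem connector "full example" in the
-- Flink docs: dt and `hour` are computed from log_ts on write.
INSERT INTO fs_table
SELECT
  user_id,
  order_amount,
  DATE_FORMAT(log_ts, 'yyyy-MM-dd'),
  DATE_FORMAT(log_ts, 'HH')
FROM kafka_table;
```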
Thanks a lot Alexander and Tzu-Li for your answers, this helps a lot!!
Cheers,
Robin
Le ven. 8 juil. 2022 à 17:40, Tzu-Li (Gordon) Tai a
écrit :
> Hi Robin,
>
> Apart from what Alexander suggested, I think you could also try the
> following first:
> Let the job use a "new" Kafka source, which y
You can use the FileSink and set the format to csv. An example of FileSink:
https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/filesystem/#full-example
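A minimal sketch of such a CSV sink in Flink SQL (the path and columns are hypothetical; the options follow the linked filesystem connector docs):

```sql
-- Filesystem sink writing CSV files; each INSERT appends rows
-- under the given path.
CREATE TABLE csv_sink (
  name  STRING,
  score INT
) WITH (
  'connector' = 'filesystem',
  'path' = 'file:///tmp/output',
  'format' = 'csv'
);

INSERT INTO csv_sink SELECT name, score FROM some_table;
```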
Best,
Lijie
wrote on Mon, Jul 11, 2022 at 16:16:
>
> If I create dynamic table with:
>
>
> CREATE TABLE some_table (name STRING, scor
If I create a dynamic table with:
CREATE TABLE some_table (name STRING, score INT)
WITH (
'format' = 'csv',
'...'
);
//do some other stuff here
Then how do I save the table result to a CSV file?
Best,
Mike
I want to control what I insert into the table, not what I get from the table.
Sent: Monday, July 11, 2022 at 3:37 AM
From: "Shengkai Fang"
To: pod...@gmx.com
Cc: "user"
Subject: Re: Does Table API connector, csv, has some option to ignore some columns
Hi.
In Flink SQL, you can select the column