The Flink community plans to delete the DataSet API in the future; its
requirements will be fulfilled by both the Table and DataStream APIs. It
would be helpful to let us know what kind of functionality is missing in
these two APIs.
If you have further information you want to share, please let us know.
Thanks for the suggestions Kurt. Actually, I think I could use the Table
API; it's just that most of our Flink code uses the DataSet API.
On Sun, Apr 11, 2021 at 13:44, Kurt Young wrote:
Thanks for the suggestions Flavio. Join without window & left outer join
already work in Table API & SQL.
And for reduceGroup, you can try either a user-defined aggregate function
or a table aggregate, which is available in the Table API now. I'm
wondering whether these can meet your requirements, or whether you still
find something missing.
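For reference, a minimal sketch of what such a table aggregate could look
like (modeled on the Top2 example from the Flink documentation; class and
field names are illustrative):

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.table.functions.TableAggregateFunction;
import org.apache.flink.util.Collector;

// Accumulator keeping the two largest values seen so far.
public class Top2Accumulator {
  public Integer first = Integer.MIN_VALUE;
  public Integer second = Integer.MIN_VALUE;
}

// Table aggregate emitting up to two rows per group: (value, rank).
public class Top2
    extends TableAggregateFunction<Tuple2<Integer, Integer>, Top2Accumulator> {

  @Override
  public Top2Accumulator createAccumulator() {
    return new Top2Accumulator();
  }

  public void accumulate(Top2Accumulator acc, Integer value) {
    if (value == null) {
      return; // ignore NULL input values
    }
    if (value > acc.first) {
      acc.second = acc.first;
      acc.first = value;
    } else if (value > acc.second) {
      acc.second = value;
    }
  }

  public void emitValue(Top2Accumulator acc, Collector<Tuple2<Integer, Integer>> out) {
    if (acc.first != Integer.MIN_VALUE) {
      out.collect(Tuple2.of(acc.first, 1));
    }
    if (acc.second != Integer.MIN_VALUE) {
      out.collect(Tuple2.of(acc.second, 2));
    }
  }
}

It would then be applied via flatAggregate, e.g.:
table.groupBy($("key"))
     .flatAggregate(call(Top2.class, $("value")).as("v", "rank"))
     .select($("key"), $("v"), $("rank"));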
That's absolutely useful. IMHO join should also work without
windows/triggers, and left/right outer joins should be easier, in order to
really migrate legacy code.
reduceGroup would also help, but it is less urgent.
I hope that my feedback as a Flink user is useful.
Best,
Flavio
On Fri, Apr 9, 2021, Kurt Young wrote:
Converting from table to DataStream in batch mode is indeed a problem now.
But I think this will
be improved soon.
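For the streaming case, the conversion already works along these lines (a
sketch; the table name 'csv' is just the one from this thread):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tableEnv = StreamTableEnvironment.create(
    env,
    EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

// Query the registered table and convert the result into a DataStream of Rows.
Table result = tableEnv.sqlQuery("SELECT id, name FROM csv");
DataStream<Row> stream = tableEnv.toAppendStream(result, Row.class);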
Best,
Kurt
On Fri, Apr 9, 2021 at 6:14 PM Flavio Pompermaier wrote:
In my real CSV I have LONG columns that can contain null values. In that
case I get a parse exception (and I would like to avoid reading them as
strings).
The ',bye' is just the way you can test that in my example (add that line
to the input CSV).
If I use 'csv.null-literal' = '' it seems to work, but is that the right
way to solve this problem?
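For reference, the whole declaration would then look something like this
(the path is a placeholder):

// 'csv.null-literal' = '' tells the CSV format to read empty fields as NULL.
tableEnv.executeSql(
    "CREATE TABLE csv (" +
    "  id BIGINT," +
    "  name STRING" +
    ") WITH (" +
    "  'connector' = 'filesystem'," +
    "  'path' = '/path/to/input.csv'," +
    "  'format' = 'csv'," +
    "  'csv.null-literal' = ''" +
    ")");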
`format.ignore-first-line` is unfortunately a regression compared to the
old one.
I've created a ticket [1] to track this, but according to the current
design it doesn't seem easy to do.
Regarding null values, I'm not sure I understand the issue you had. What
do you mean by using ',bye' to test null Long values?
And another thing: in my CSV I added ',bye' (to test null Long values) but
I get a parse error. If I add 'csv.null-literal' = '' it seems to work. Is
that the right way to solve this problem?
On Fri, Apr 9, 2021 at 10:13 AM Flavio Pompermaier wrote:
Thanks Kurt, now it works. However I can't find a way to skip the CSV
header. Before, there was "format.ignore-first-line", but now I can't find
another way to skip it.
I could set csv.ignore-parse-errors to true, but then I can't detect other
parsing errors; otherwise I need to manually transform the input file
first.
My DDL is:

CREATE TABLE csv (
  id BIGINT,
  name STRING
) WITH (
  'connector' = 'filesystem',
  'path' = '.',
  'format' = 'csv'
);
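For completeness, reading from it programmatically then looks something
like this (environment setup omitted):

import org.apache.flink.table.api.TableResult;

// Run a query against the table defined above and print the result.
TableResult result = tableEnv.executeSql("SELECT id, name FROM csv");
result.print();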
Best,
Kurt
On Fri, Apr 9, 2021 at 10:00 AM Kurt Young wrote:
Hi Flavio,
We would recommend you to use the new table source & sink interfaces,
which have different property keys compared to the old ones, e.g.
'connector' vs. 'connector.type'.
You can follow the 1.12 documentation [1] to define your CSV table;
everything should work just fine.
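To make the difference concrete, a rough side-by-side of the two property
styles (paths are placeholders; the old style may need additional format
keys depending on the setup):

// Old, legacy-factory style ('connector.type', 'format.type', ...):
tableEnv.executeSql(
    "CREATE TABLE csv_old (id BIGINT, name STRING) WITH (" +
    "  'connector.type' = 'filesystem'," +
    "  'connector.path' = '/path/to/input.csv'," +
    "  'format.type' = 'csv'" +
    ")");

// New style ('connector', 'format'), matched by the new source & sink interfaces:
tableEnv.executeSql(
    "CREATE TABLE csv_new (id BIGINT, name STRING) WITH (" +
    "  'connector' = 'filesystem'," +
    "  'path' = '/path/to/input.csv'," +
    "  'format' = 'csv'" +
    ")");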
Hi Till,
since I was using the same WITH-clause both for reading and writing, I
discovered that overwrite is actually supported in the sinks, while in the
sources an exception is thrown (I was thinking that those properties were
simply ignored).
However, the quote-character is not supported in the sink.
Hi Flavio,
I tried to execute the code snippet you have provided and I could not
reproduce the problem.
Concretely I am running this code:
final EnvironmentSettings envSettings = EnvironmentSettings.newInstance()
    .useBlinkPlanner()
    .inStreamingMode()
    .build();
final TableEnvironment tableEnv = TableEnvironment.create(envSettings);
Any help here? Moreover, if I use the DataStream API there's no left/right
outer join yet. Are those meant to be added in Flink 1.13 or 1.14?
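For the Table API side, a non-windowed left outer join is already
expressible, roughly like this (table and field names are made up):

import static org.apache.flink.table.api.Expressions.$;

import org.apache.flink.table.api.Table;

// Rows from 'orders' without a matching customer keep NULLs on the right side.
Table orders = tableEnv.from("orders");
Table customers = tableEnv.from("customers");
Table joined = orders
    .leftOuterJoin(customers, $("customer_id").isEqual($("id")))
    .select($("order_id"), $("customer_id"), $("name"));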
On Wed, Apr 7, 2021 at 12:27 PM Flavio Pompermaier wrote:
Hi to all,
I'm testing writing to a CSV using Flink 1.13 and I get the following error:

The matching candidates:
org.apache.flink.table.sinks.CsvBatchTableSinkFactory
Unsupported property keys:
format.quote-character

I create the table env using this:

final EnvironmentSettings envSettings = EnvironmentSettings.newInstance()
    .useBlinkPlanner()
    .inStreamingMode()
    .build();
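(For context, a reconstruction of the kind of sink declaration that
triggers this error; 'format.quote-character' is the key the factory
rejects, the remaining keys and the path are guessed:)

tableEnv.executeSql(
    "CREATE TABLE csv_out (id BIGINT, name STRING) WITH (" +
    "  'connector.type' = 'filesystem'," +
    "  'connector.path' = '/path/to/output.csv'," +
    "  'format.type' = 'csv'," +
    "  'format.quote-character' = '\"'" +  // the rejected key
    ")");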