In our nightly build, we run all modules against Java 11. [1]
The only reason we do not compile with Java 11 is that we specifically want to
test that our final release jars, which are compiled with Java 8, also work
with Java 11.
So there should be no reason to stick with Java 8; documentation is
I think the reason why Java 8 is listed as a prerequisite is that not all
Flink modules can compile/run with Java 11, if I am not mistaken. I think
this mostly affects connectors [1].
[1]
https://ci.apache.org/projects/flink/flink-docs-stable/release-notes/flink-1.10.html#java-11-support-flink-107
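For context, compiling for Java 8 while running the build itself on a newer JDK usually comes down to the compiler source/target settings in the pom. A minimal sketch using standard Maven properties (illustrative only, not copied from Flink's actual pom):

```xml
<!-- Sketch: produce Java 8 bytecode regardless of the JDK running the build -->
<properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
</properties>
```

The nightly job can then execute the same build (and the resulting jars) on a Java 11 runtime without changing the bytecode level.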
Till - thanks!
I was on that page and had a notification that it had been updated. I
scrolled down to see the exact command I needed.
This kind of output looks much better (I am guessing the other suites *must*
run)?
[INFO]
[INFO] Rea
Hi Adam,
What works for me to run a single test (or a set of tests) is to use
mvn verify -pl flink-runtime -Dtest='JobMaster*' -DfailIfNoTests=false -am
I will add it to the wiki.
Concerning FLINK-21672, I think it would be really great not to use
vendor-specific classes if possible. If you find a solution
Thanks Robert for this dev blog post. It's a good read.
Cheers,
Till
On Mon, Mar 23, 2020 at 10:24 PM Arvid Heise wrote:
> Thank you Robert! (also thanks for incorporating my feedback so swiftly)
>
> On Mon, Mar 23, 2020 at 8:54 PM Seth Wiesman wrote:
>
> > Very interesting! No questions but t
Thank you Robert! (also thanks for incorporating my feedback so swiftly)
On Mon, Mar 23, 2020 at 8:54 PM Seth Wiesman wrote:
> Very interesting! No questions but thank you for taking the initiative to
> put out the first dev blog.
>
> Seth
>
> > On Mar 23, 2020, at 5:14 AM, Robert Metzger wrote
Very interesting! No questions but thank you for taking the initiative to put
out the first dev blog.
Seth
> On Mar 23, 2020, at 5:14 AM, Robert Metzger wrote:
>
> Hi all,
>
> I have just published the first post to the dev blog:
> https://cwiki.apache.org/confluence/display/FLINK/2020/03/22
Hi Haifeng,
Thank you for your willingness to contribute to the Flink community! I've
given you contributor permissions!
We communicated offline and established that your JIRA id is Spafka. :) Just
confirm that your profile is:
https://issues.apache.org/jira/secure/ViewProfile.jspa?name=Spafka
Please le
Hi,
Thanks @Fabian and @Xingcan for the explanation.
@Xingcan Here I mean I have a data analytics server that has *data tables*.
So my initial requirement is to make a client connector for Flink to access
those *data tables*. Then I started with implementing the Flink InputFormat
interface and that was
Hi Pawan,
@Fabian was right; I had thought it was the streaming environment. Sorry for that.
What do you mean by `read the available records of my datasource`? How do
you implement the nextRecord() method in DASInputFormat?
Best,
Xingcan
On Wed, Mar 1, 2017 at 4:45 PM, Fabian Hueske wrote:
> Hi Pawa
Hi Pawan,
in the DataSet API DataSet.print() will trigger the execution (you do not
need to call ExecutionEnvironment.execute()).
The DataSet will be printed on the standard out of the process that submits
the program. This only works for small DataSets.
In general print() should only be used
Hi,
So how can I read the available records of my datasource? I saw in some
examples that the print() method will print the available data of that
datasource (like files).
Thanks,
Pawan
On Wed, Mar 1, 2017 at 11:30 AM, Xingcan Cui wrote:
> Hi Pawan,
>
> in Flink, most of the methods for DataSet
Hi Pawan,
in Flink, most of the methods for DataSet (including print()) will just add
operators to the plan but not really run it. If the DASInputFormat has no
error, you can run the plan by calling environment.execute().
Best,
Xingcan
On Wed, Mar 1, 2017 at 12:17 PM, Pawan Manishka Gunarathna <
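The lazy-execution model described above (DataSet methods only add operators to a plan; work happens when the environment executes it) can be illustrated with a toy sketch in plain Java. LazyPlan and its methods are invented for illustration and are not Flink's API:

```java
import java.util.ArrayList;
import java.util.List;

// Toy illustration of lazy plan building: nothing runs until execute().
class LazyPlan {
    private final List<String> operators = new ArrayList<>();

    // Like a DataSet transformation: just records the operator, does not run it.
    LazyPlan addOperator(String name) {
        operators.add(name);
        return this;
    }

    // Like triggering execution on the environment: only now does work happen.
    List<String> execute() {
        List<String> results = new ArrayList<>();
        for (String op : operators) {
            results.add("ran " + op);
        }
        return results;
    }
}

public class LazyPlanDemo {
    public static void main(String[] args) {
        LazyPlan plan = new LazyPlan()
                .addOperator("readSource")
                .addOperator("map")
                .addOperator("filter"); // still nothing has run
        System.out.println(plan.execute()); // [ran readSource, ran map, ran filter]
    }
}
```

The point is only the shape of the API: building the plan is cheap and side-effect free, and the actual work is deferred to the single execute() call.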
Hi,
Thanks a lot, Fabian and Flavio. That information is really helpful.
On Tue, Jan 24, 2017 at 3:36 PM, Flavio Pompermaier
wrote:
> If your column on which you want to perform the split is numeric you can
> use the NumericBetweenParametersProvider interface that automatically
> computes th
If the column on which you want to perform the split is numeric, you can
use the NumericBetweenParametersProvider interface, which automatically
computes the splits for you. This is an example of its usage (in windows of
1000 items at a time) taken from the test class *JDBCInputFormatTest*:
final in
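The split computation such a provider performs boils down to simple range arithmetic: partition [min, max] into fixed-size ranges, one per input split, whose bounds are later bound to the two `?` placeholders of a `WHERE key BETWEEN ? AND ?` query. A minimal re-sketch of the idea (my own illustration, not Flink's actual code; class and method names are made up):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the idea behind a numeric between-parameters provider:
// chop [min, max] into ranges of at most fetchSize values each.
public class BetweenSplits {
    static List<long[]> splits(long min, long max, long fetchSize) {
        List<long[]> result = new ArrayList<>();
        for (long start = min; start <= max; start += fetchSize) {
            long end = Math.min(start + fetchSize - 1, max);
            result.add(new long[] {start, end}); // one {lower, upper} pair per split
        }
        return result;
    }

    public static void main(String[] args) {
        for (long[] s : splits(0, 2500, 1000)) {
            System.out.println(s[0] + " .. " + s[1]);
        }
        // 0 .. 999
        // 1000 .. 1999
        // 2000 .. 2500
    }
}
```

Each pair becomes one input split, so the table is read in parallel in windows of fetchSize rows.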
Hi,
JdbcInputFormat implements the InputFormat interface and is handled exactly
like any other InputFormat.
In contrast to file-based input formats, users must explicitly specify the
input splits by providing an array of parameter values which are injected
into a parameterized query.
This is done
Hi,
Thanks for your help. Since our data source has a database-table
architecture, I thought of following the 'JDBCInputFormat' approach in Flink. It
would be great if you could provide some information on how
JDBCInputFormat execution happens.
Thanks,
Pawan
On Mon, Jan 23, 2017 at 4:18 PM, F
Hi Pawan,
I don't think this works. The InputSplits are generated by the JobManager,
i.e., by a single process, not in parallel.
After the parallel InputFormats have been started on the TaskManagers, they
request InputSplits and open() them. If there are no InputSplits, there is
no work to be done an
Hi,
Thanks Fabian and Chesnay for providing that information.
Pawan
On Wed, Jan 18, 2017 at 2:11 PM, Chesnay Schepler
wrote:
> Hello,
>
> The dependencies are fine.
>
> The short answer is I would recommend you to read up on Java generics.
>
> The long answer is that OT and T are just placeho
Hello,
The dependencies are fine.
The short answer is I would recommend you to read up on Java generics.
The long answer is that OT and T are just placeholders for types that
are supposed to be replaced.
You can either provide the type in your implementation:
(in this example, the ReadFromFi
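Note that the archive has stripped the angle brackets from the code snippets in this thread, so the generic parameters are lost. Here is a self-contained sketch of the placeholder idea using a made-up SimpleFormat interface (not the real Flink InputFormat, which parameterizes the record type and a split type):

```java
// Toy analog of a generic interface: OT and T are placeholders that an
// implementation must replace with concrete types.
interface SimpleFormat<OT, T> {
    OT nextRecord(T split);
}

// Fix the placeholders in the implementing class: OT = String, T = Integer.
class StringFormat implements SimpleFormat<String, Integer> {
    @Override
    public String nextRecord(Integer split) {
        return "record-" + split;
    }
}

public class GenericsDemo {
    public static void main(String[] args) {
        SimpleFormat<String, Integer> f = new StringFormat();
        System.out.println(f.nextRecord(7)); // record-7
    }
}
```

Declaring `implements SimpleFormat` without type arguments (a raw type) compiles with warnings but defeats the type checking the placeholders exist for, which is likely the problem the original snippet ran into.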
Hi Pawan,
If you want to read a file, you might want to extend the FileInputFormat
class. It already has a lot of file-related functionality implemented.
OT is the type of the records produced by the InputFormat. For example,
Tuple2 if the input format produces a tuple with two fields
of String and
Hi,
Yeah, I also wrote it in the way you have written:
public class ReadFromFile implements InputFormat{
}
Is that a problem with that declaration or dependencies ?
Thanks,
Pawan
On Tue, Jan 17, 2017 at 7:56 PM, Chesnay Schepler
wrote:
> Hello,
>
> Did you write something like this?
>
>p
Hello,
Did you write something like this?
public class MyInputFormat implements InputFormat {
}
Regards,
Chesnay
On 17.01.2017 04:18, Pawan Manishka Gunarathna wrote:
Hi,
I'm currently working on the Flink InputFormat interface implementation. I'm
writing a Java program to rea
Somebody moderated this email through to this list. That was an error.
Let's be more accurate here.
On Mon, Mar 16, 2015 at 6:31 AM, E-ZPass Agent wrote:
> Dear Dev,
>
> You have not paid for driving on a toll road.
> You are kindly asked to pay your debt as soon as possible.
>
> The copy of
LOL
On Monday, March 16, 2015, Stephan Ewen wrote:
> Thank you. We promise that we will never do this again.
>
> Once we can dig up a few nuts that we buried last autumn, we'll use them to
> pay for the ticket...
>
>
> On Mon, Mar 16, 2015 at 1:31 PM, E-ZPass Agent <
> ruben.dav...@h1.faust.net.
Thank you. We promise that we will never do this again.
Once we can dig up a few nuts that we buried last autumn, we'll use them to
pay for the ticket...
On Mon, Mar 16, 2015 at 1:31 PM, E-ZPass Agent wrote:
> Dear Dev,
>
> You have not paid for driving on a toll road.
> You are kindly asked t
seems legit
On Mon, Mar 16, 2015 at 1:31 PM, E-ZPass Agent wrote:
> Dear Dev,
>
> You have not paid for driving on a toll road.
> You are kindly asked to pay your debt as soon as possible.
>
> The copy of the invoice is attached to this email.
>
> Kind regards,
> Ruben Davies,
> E-ZPass Agent.
>