In our nightly build, we run all modules against Java 11. [1]
The only reason we do not compile with Java 11 is that we specifically
want to test that the jars we release, which are compiled with Java 8,
also work with Java 11.
So there should be no reason to stick with Java 8; the documentation is
likely outdated.

The exception is that Hive and other parts of the Hadoop ecosystem do not
fully support Java 11, so we keep the Java 8 requirement to keep things
simple (= we don't need to explain what runs with Java 11 and what does
not).

[1]
https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=16162&view=logs&j=f0ac5c25-1168-55a5-07ff-0e88223afed9
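
A rough local approximation of that setup, in case it helps (the JDK
paths below are only placeholders):

  export JAVA_HOME=/path/to/jdk8
  mvn clean install -DskipTests   # build the modules with Java 8
  export JAVA_HOME=/path/to/jdk11
  mvn verify                      # re-run the test suites on a Java 11 JVM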

On Fri, Apr 9, 2021 at 2:06 PM Till Rohrmann <trohrm...@apache.org> wrote:

> I think the reason why Java 8 is written as a prerequisite is that, if I
> am not mistaken, not all Flink modules can compile/run with Java 11. I
> think this mostly affects connectors [1].
>
> [1]
>
> https://ci.apache.org/projects/flink/flink-docs-stable/release-notes/flink-1.10.html#java-11-support-flink-10725
>
>
> Cheers,
> Till
>
> On Fri, Apr 9, 2021 at 11:42 AM Adam Roberts <adamrobertsah...@gmail.com>
> wrote:
>
> > Till - thanks!
> >
> > I was on that page and had a notification it had been updated. Scrolled
> > down to see the exact command I needed.
> >
> > This kind of output looks much better (I'm guessing the other suites
> > *must* run)?
> >
> > [INFO] ------------------------------------------------------------------------
> > [INFO] Reactor Summary for Flink : 1.13-SNAPSHOT:
> > [INFO]
> > [INFO] Flink : Tools : Force Shading ...................... SUCCESS [  2.021 s]
> > [INFO] Flink : ............................................ SUCCESS [ 17.440 s]
> > [INFO] Flink : Annotations ................................ SUCCESS [  2.627 s]
> > [INFO] Flink : Test utils : ............................... SUCCESS [  0.264 s]
> > [INFO] Flink : Test utils : Junit ......................... SUCCESS [  2.417 s]
> > [INFO] Flink : Metrics : .................................. SUCCESS [  0.272 s]
> > [INFO] Flink : Metrics : Core ............................. SUCCESS [  3.200 s]
> > [INFO] Flink : Core ....................................... SUCCESS [ 29.040 s]
> > [INFO] Flink : Java ....................................... SUCCESS [  7.156 s]
> > [INFO] Flink : Queryable state : .......................... SUCCESS [  0.300 s]
> > [INFO] Flink : Queryable state : Client Java .............. SUCCESS [  1.851 s]
> > [INFO] Flink : FileSystems : .............................. SUCCESS [  0.417 s]
> > [INFO] Flink : FileSystems : Hadoop FS .................... SUCCESS [  2.656 s]
> > [INFO] Flink : Runtime .................................... SUCCESS [01:33 min]
> > [INFO] ------------------------------------------------------------------------
> > [INFO] BUILD SUCCESS
> > [INFO] ------------------------------------------------------------------------
> > [INFO] Total time:  02:45 min
> > [INFO] Finished at: 2021-04-09T10:30:51+01:00
> > [INFO] ------------------------------------------------------------------------
> >
> > There's just the bit about Java 8 now... which I'm sure isn't true...
> >
> > Cheers,
> >
> > On Fri, 9 Apr 2021 at 10:05, Till Rohrmann <trohrm...@apache.org> wrote:
> >
> > > Hi Adam,
> > >
> > > what works for me to run a single test or a set of tests is to use
> > >
> > > mvn verify -pl flink-runtime -Dtest='JobMaster*' -DfailIfNoTests=false -am
> > >
> > > I will add it to the wiki.
> > >
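> > > If you need a set of tests rather than a single one, the same flag
> > > should also take several comma-separated patterns, e.g. (the patterns
> > > below are only examples):
> > >
> > > mvn verify -pl flink-runtime -Dtest='JobMaster*,Checkpoint*' -DfailIfNoTests=false -am
> > >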
> > > Concerning FLINK-21672, I think it would be really great to not use
> > > vendor-specific classes if possible. If you find a solution for it,
> > > then let's apply it.
> > >
> > > Cheers,
> > > Till
> > >
> > >
> > >
> > > On Thu, Apr 8, 2021 at 4:29 PM Adam Roberts <adamrobertsah...@gmail.com>
> > > wrote:
> > >
> > > > Hey everyone, I'm looking to get the full set of unit tests working
> > > > using AdoptOpenJDK 11 with the OpenJ9 VM. I'm basically seeing
> > > > problems with the runtime tests (always going OOM creating new
> > > > threads), and I'd also like to have a go at
> > > > https://issues.apache.org/jira/browse/FLINK-21672.
> > > >
> > > > That being said... how do I run just the one test, or a set of
> > > > tests in the one package?
> > > >
> > > > What are you doing to achieve this?
> > > >
> > > > For Apache Spark I remember using
> > > > mvn -fn -DwildcardSuites=org.apache.spark test
> > > > (the suite name), but with Flink that doesn't give me what I want
> > > > (lots more tests run, it's like the option is ignored - this was
> > > > several years ago now though).
> > > >
> > > > I've also tried using Maven's ! directive but to no avail. I've
> > > > been through and tried
> > > > https://maven.apache.org/surefire/maven-surefire-plugin/examples/single-test.html,
> > > > and I've also tried mvn -Dtest=org.apache.flink.runtime* -fn test
> > > >
> > > > I'm wondering if anyone has an awesome example and could potentially
> > > > add it to
> > > > https://cwiki.apache.org/confluence/display/FLINK/Setting+up+a+Flink+development+environment
> > > > as well please.
> > > >
> > > > While I'm here... I did notice as well that we mention Java 8 - I
> > > > assume this can be Java 8 *or* 11? Or should it just say 11?
> > > >
> > > > Any thoughts/suggestions would be awesome, thanks!
> > > >
> > >
> >
>
