Hi Remi, hi all,
I'd like to add some information from open source projects and explain why I
don't see the problem discussed here as a really serious one.
Background: We tested Apache Lucene and Apache Solr with Java 21. The
compilation with Gradle went fine, so there are actually no problems
with the new superclasses. We make extensive use of chains of stream()
calls with Stream.of() and similar APIs. Use of "var" is still rare, but we
now use it in newly introduced code around streams to reduce verbosity.
Still, we hit no problems. Why is that?
A good open source project should invoke the compiler with "--release".
Apache Lucene uses Java 17 on the main branch and Java 11 on the 9.x branch.
In both cases compilation worked because of the use of "--release". If we
were to change the compilation target to Java 21, we might need to adapt our code.
There are some problems with that:
* Not all projects use "--release"; some still use "--source --target".
  Part of the problem is that Maven and Gradle still don't make
  "--release" a first-class citizen: their default configurations only use
  "--source --target".
* Code still on Java 8 can't use "--release", as the Java 8 compiler does
  not support it. The Lucene 8.x branch, which is still open for bugfixes,
  has a trick: it detects the compiler version and, if it is Java 8, passes
  "--source 8 --target 8", while starting with the Java 9 compiler it passes
  "--release 8" (a small sketch of that version check follows below this
  list). On the other hand, code that still supports Java 8 is unlikely to
  be affected by the problem, as it cannot use "var"; chains like
  Stream.of().foo().bar() may be affected, though.
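Here is the promised sketch of that version check. This is not Lucene's
actual build logic (that lives in its Gradle/Ant setup); it is just a
minimal illustration of the idea using javax.tools, with a made-up source
file name:

    import javax.tools.JavaCompiler;
    import javax.tools.ToolProvider;
    import java.util.ArrayList;
    import java.util.List;

    public class ReleaseAwareCompile {
        public static void main(String[] args) {
            // "1.8" on a Java 8 runtime, "9", "10", ... on later ones
            String spec = System.getProperty("java.specification.version");
            List<String> options = new ArrayList<>();
            if (spec.startsWith("1.")) {
                // the JDK 8 compiler has no "--release" option;
                // its javac also only understands the single-dash spelling
                options.add("-source"); options.add("8");
                options.add("-target"); options.add("8");
            } else {
                // JDK 9+ compilers also check API usage against the requested release
                options.add("--release"); options.add("8");
            }
            options.add("src/Example.java"); // hypothetical source file
            JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
            int rc = javac.run(null, null, null, options.toArray(new String[0]));
            System.out.println(rc == 0 ? "compiled" : "failed with exit code " + rc);
        }
    }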
A more serious kind of source-incompatibility issue, one that I would always
report to the OpenJDK bug tracker: while testing Java 20 we were trapped by
a compiler change that caused a source incompatibility (which was
reverted, see https://bugs.openjdk.org/browse/JDK-8299416). In that case not
even passing "--release" fixed the issue, because the compiler itself changed
its semantics. This is in my opinion a breaking issue, because it prevents
code from compiling!
The sequenced collections changes should not be too big a problem for the
community if projects are set up correctly.
Uwe
P.S.: To be honest, I tried to pass "--release 21" when compiling Lucene
and it failed, but not because of sequenced collections. It was rather
some tests calling Runtime#runFinalization().
On 05.05.2023 13:14, fo...@univ-mlv.fr wrote:
Hi Joe,
in this particular case, there are several reasons to be more worried than
about other potential breaking changes that have appeared in the past
(see Tagir's message for an example).
Unlike other changes:
- this one touches the collection API, and those interfaces/types are
widely used,
- we know that the source compatibility issues occur mostly when 'var' or
the "new" inference algorithm (the one from Java 8) is involved, so it is
likely that most of the issues will be found in Java 11+ source code,
- this change may also affect all typed languages based on the JVM,
not only Java. Corpora of Groovy, Kotlin and Scala code also need
to be checked. In the case of Kotlin and Scala, 'var'-style inference is
the default behavior, but they have their own collections (or their own
type system around collections in the case of Kotlin), so knowing the real
impact of this change is hard there.
One problem with a corpus experiment is that the corpus may not
represent the current state of the Java ecosystem, or at least the part
that may be impacted.
Another problem with a corpus experiment is that most modern open source
projects use the "--release" flag, so you have to patch it out of the
build system to see the effect.
In my case, on my own repositories (public and private), I had only
one occurrence of the issue in the main source code, because many of
those repositories use neither 'var' nor the stream API. But in the
corpus of unit tests we give to students to check their
implementations, a little less than a third of those JUnit classes had
source compatibility issues, because those tests use 'var' and a
variety of collections heavily.
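For illustration, here is a contrived sketch of that pattern (the helper
name and the types below are made up, not taken from the actual course
tests):

    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.Collection;
    import java.util.Deque;
    import java.util.List;

    class VarInferenceShift {
        // the kind of helper an exercise checker might declare; note the
        // invariant element type
        static void assertGroups(List<Collection<String>> groups) { /* ... */ }

        public static void main(String[] args) {
            List<String> letters = new ArrayList<>();
            Deque<String> digits = new ArrayDeque<>();
            var groups = List.of(letters, digits);
            // JDK 20: groups is List<Collection<String>>, Collection being the
            //         common supertype of List and Deque
            // JDK 21: the common supertype is now SequencedCollection, so groups
            //         is List<SequencedCollection<String>> and this call fails
            assertGroups(groups);
        }
    }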
And the situation is a little worse than that, because between now
and the time people move to Java 21, a lot of code will be written
using Java 11 and 17 and may be found incompatible later.
A source incompatibility issue is not a big deal; as said in this
thread, most of the time explicitly spelling out the type argument instead
of inferring it makes the code compile again.
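A minimal sketch of that kind of fix, based on the supplier example from the
original report further down in this thread (the class name and layout here
are only for the sketch):

    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.SortedMap;
    import java.util.TreeMap;
    import java.util.function.Supplier;

    class ExplicitTypeFix {
        static void m(List<Supplier<? extends Map<String, String>>> factories) {
        }

        public static void main(String[] args) {
            Supplier<LinkedHashMap<String, String>> supplier1 = LinkedHashMap::new;
            Supplier<SortedMap<String, String>> supplier2 = TreeMap::new;

            // Breaks on JDK 21: 'var' captures the inferred element type, which
            // is now Supplier<? extends SequencedMap<String,String>>:
            // var factories = List.of(supplier1, supplier2);
            // m(factories);

            // Compiles again: the declared type gives inference its target, so
            // no SequencedMap is inferred.
            List<Supplier<? extends Map<String, String>>> factories =
                    List.of(supplier1, supplier2);
            m(factories);
        }
    }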
So the house is not burning, but we should raise awareness of this issue,
given that it may have a bigger impact than other source-incompatible
changes that occurred previously.
Rémi
------------------------------------------------------------------------
*From: *"joe darcy" <joe.da...@oracle.com>
*To: *"Ethan McCue" <et...@mccue.dev>, "Raffaello Giulietti"
<raffaello.giulie...@oracle.com>
*Cc: *"Remi Forax" <fo...@univ-mlv.fr>, "Stuart Marks"
<stuart.ma...@oracle.com>, "core-libs-dev"
<core-libs-...@openjdk.java.net>
*Sent: *Friday, May 5, 2023 4:38:16 AM
*Subject: *Re: The introduction of Sequenced collections is not a
source compatible change
A few comments on the general compatibility policy for the JDK.
Compatibility is looked after by the Compatibility and
Specification Review (CSR) process. Summarizing the approach:
The general compatibility policy for exported APIs implemented
in the JDK is:
* Don't break binary compatibility (as defined in the Java
Language Specification) without sufficient cause.
* Avoid introducing source incompatibilities.
* Manage behavioral compatibility changes.
https://wiki.openjdk.org/display/csr/Main
None of binary, source, and behavioral compatibility are absolutes,
and judgement is used to assess the costs and benefits of changes. For
example, strict source compatibility would preclude, say,
introducing new public types in the java.lang package since the
implicit import of types in java.lang could conflict with a
same-named type *-imported from another package.
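As a hypothetical illustration (the com.example.legacy package and its Record
class are made up), consider code written before java.lang gained a class
named Record for the records feature:

    // --- com/example/legacy/Record.java ---
    package com.example.legacy;
    public class Record { }

    // --- ReportPrinter.java ---
    import com.example.legacy.*;   // on-demand import of the legacy package

    class ReportPrinter {
        // Before java.lang.Record existed, the simple name Record could only
        // mean com.example.legacy.Record. Afterwards it matches two
        // on-demand-imported types, and the compiler rejects it as ambiguous.
        void print(Record r) {
            System.out.println(r);
        }
    }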
When a proposed change is estimated to be sufficiently disruptive,
we conduct a corpus experiment to evaluate the impact of the
change on many public Java libraries. Back in Project Coin in JDK
7, that basic approach was used to help quantify various language
design choices, and the infrastructure to run such experiments has
been built out in subsequent releases.
HTH,
-Joe
CSR Group Lead
On 5/4/2023 6:32 AM, Ethan McCue wrote:
I guess this is a good time to ask: ignoring the benefit part of
a cost-benefit analysis, what mechanisms do we have to measure
the number of codebases relying on type inference that this will
break?
IIRC Adoptium built/ran the unit tests of a bunch of public
repos, but it's also a bit shocking if the jtreg suite had
nothing for this.
On Thu, May 4, 2023, 9:27 AM Raffaello Giulietti
<raffaello.giulie...@oracle.com> wrote:
Without changing the semantics at all, you could also write

    final List<Collection<String>> list =
            Stream.<Collection<String>>of(nestedDequeue, nestedList).toList();

to "help" type inference.
On 2023-05-03 15:12, fo...@univ-mlv.fr wrote:
> Another example sent to me by a fellow French guy,
>
> final Deque<String> nestedDequeue = new ArrayDeque<>();
> nestedDequeue.addFirst("C");
> nestedDequeue.addFirst("B");
> nestedDequeue.addFirst("A");
>
> final List<String> nestedList = new ArrayList<>();
> nestedList.add("D");
> nestedList.add("E");
> nestedList.add("F");
>
> final List<Collection<String>> list = Stream.of(nestedDequeue, nestedList).toList();
>
> This one is cool because no 'var' is involved, and using collect(Collectors.toList())
> instead of toList() solves the inference problem.
>
> Rémi
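For what it's worth, a sketch of why the collector variant side-steps the
problem, as far as I understand the inference rules: Stream.toList() returns
List<T> with T already fixed by Stream.of(...), whereas Collectors.toList()
introduces a fresh type variable that is also constrained by the assignment
target, so it can still resolve to Collection<String>:

    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.Collection;
    import java.util.Deque;
    import java.util.List;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;

    class CollectorWorkaround {
        public static void main(String[] args) {
            final Deque<String> nestedDequeue = new ArrayDeque<>();
            final List<String> nestedList = new ArrayList<>();

            // Does not compile on JDK 21: Stream.of infers
            // SequencedCollection<String> and toList() simply returns a List
            // of that element type.
            // final List<Collection<String>> broken =
            //         Stream.of(nestedDequeue, nestedList).toList();

            // Compiles on JDK 21: Collectors.toList() has its own type variable,
            // which is resolved against the assignment target and ends up as
            // Collection<String>.
            final List<Collection<String>> fixed =
                    Stream.of(nestedDequeue, nestedList).collect(Collectors.toList());
            System.out.println(fixed);
        }
    }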
>
> ----- Original Message -----
>> From: "Stuart Marks" <stuart.ma...@oracle.com>
>> To: "Remi Forax" <fo...@univ-mlv.fr>
>> Cc: "core-libs-dev" <core-libs-...@openjdk.java.net>
>> Sent: Tuesday, May 2, 2023 2:44:28 AM
>> Subject: Re: The introduction of Sequenced collections is not a source
>> compatible change
>
>> Hi Rémi,
>>
>> Thanks for trying out the latest build!
>>
>> I'll make sure this gets mentioned in the release note for Sequenced
>> Collections.
>> We'll also raise this issue when we talk about this feature in the
>> Quality Outreach program.
>>
>> s'marks
>>
>> On 4/29/23 3:46 AM, Remi Forax wrote:
>>> I have several repositories that now fail to compile with the latest
>>> JDK 21, which introduces sequenced collections.
>>>
>>> The introduction of a common supertype to existing collections is *not*
>>> a source-compatible change because of type inference.
>>>
>>> Here is a simplified example:
>>>
>>> public static void m(List<Supplier<? extends Map<String, String>>> factories) {
>>> }
>>>
>>> public static void main(String[] args) {
>>>     Supplier<LinkedHashMap<String,String>> supplier1 = LinkedHashMap::new;
>>>     Supplier<SortedMap<String,String>> supplier2 = TreeMap::new;
>>>     var factories = List.of(supplier1, supplier2);
>>>     m(factories);
>>> }
>>>
>>>
>>> This example compiles fine with Java 20 but reports an error with Java 21:
>>>
>>> SequencedCollectionBug.java:28: error: method m in class SequencedCollectionBug
>>>     cannot be applied to given types;
>>>         m(factories);
>>>         ^
>>>   required: List<Supplier<? extends Map<String,String>>>
>>>   found:    List<Supplier<? extends SequencedMap<String,String>>>
>>>   reason: argument mismatch; List<Supplier<? extends SequencedMap<String,String>>>
>>>     cannot be converted to List<Supplier<? extends Map<String,String>>>
>>>
>>>
>>>
>>> Apart from the example above, most of the failures I see are in the unit
>>> tests provided to the students, because we use a lot of 'var' in them so
>>> that they work whatever the names of the types chosen by the students.
>>>
>>> Discussing with a colleague, we also believe that this bug is not limited
>>> to Java; existing Kotlin code will also fail to compile due to this bug.
>>>
>>> Regards,
>>> Rémi
--
Uwe Schindler
uschind...@apache.org
ASF Member, Member of PMC and Committer of Apache Lucene and Apache Solr
Bremen, Germany
https://lucene.apache.org/
https://solr.apache.org/