Hi Cory!

We found the problem. There is a development fork of Flink for Stream SQL
whose CI infrastructure accidentally deployed snapshots as well, overwriting
some of the proper master-branch snapshots.

That's why the snapshots became inconsistent. We have fixed that, and newer
snapshots should be online.
I hope this is resolved now.

Sorry for the inconvenience,
Stephan


On Fri, Feb 12, 2016 at 12:51 AM, Stephan Ewen <se...@apache.org> wrote:

> Hi!
>
> The CI system has just finished uploading a new snapshot. In that one,
> the scalatest dependency is now correctly at 2.11 again.
>
>
> https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-test-utils_2.11/1.0-SNAPSHOT/flink-test-utils_2.11-1.0-20160211.232156-288.pom
>
> I am very puzzled; we did not touch any parts that should affect this. I
> am wondering whether Maven had a hiccup...
>
> Can you retry (forcing a dependency update) and see if the dependencies
> are correct again?
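>
> In case it helps, here is a rough sketch of what I mean by forcing the
> update (assuming an sbt 0.13.x build; the module below is only an example
> taken from this thread):
>
>   // marking a module as "changing" tells Ivy to re-check the remote
>   // SNAPSHOT instead of trusting the local cache; -SNAPSHOT revisions
>   // are normally treated this way already, so a `clean` followed by
>   // `update` should pull the freshly deployed POM
>   libraryDependencies +=
>     ("org.apache.flink" %% "flink-test-utils" % "1.0-SNAPSHOT").changing()
>
> If sbt still picks up the stale artifacts, removing the cached
> org.apache.flink entries from the local Ivy cache before the next update
> usually helps.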
>
>
> Greetings,
> Stephan
>
>
> On Fri, Feb 12, 2016 at 12:23 AM, Stephan Ewen <se...@apache.org> wrote:
>
>> Hi!
>>
>> I examined the Apache Snapshot Repository and could see that the latest
>> snapshot introduced a "scalatest_2.10" dependency. I have not yet figured
>> out how; I also could not find a "flink-core_2.10" or
>> "flink-annotations_2.10" dependency.
>>
>>
>> Previous snapshot:
>> https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-test-utils_2.11/1.0-SNAPSHOT/flink-test-utils_2.11-1.0-20160211.162913-286.pom
>>
>> Latest snapshot:
>> https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-test-utils_2.11/1.0-SNAPSHOT/flink-test-utils_2.11-1.0-20160211.201205-287.pom
>>
>>
>> We'll try to fix this ASAP. Sorry about that; it is quite a mystery
>> right now...
>>
>> Best,
>> Stephan
>>
>> On Thu, Feb 11, 2016 at 11:56 PM, Cory Monty <cory.mo...@getbraintree.com> wrote:
>>
>>> Ufuk,
>>>
>>> Thanks for the link. I've double-checked everything in our dependencies
>>> list and it's all correct.
>>>
>>> Stephan,
>>>
>>> We don't explicitly depend on "flink-java", so there should be no
>>> suffix. It's curious to me that scalatest is showing up in the stack
>>> trace. I also tried clearing ~/.sbt/staging, and it did not help. Our
>>> build server (CircleCI) is experiencing the same issue, so I don't think
>>> it's local to my machine.
>>>
>>> On Thu, Feb 11, 2016 at 4:09 PM, Stephan Ewen <se...@apache.org> wrote:
>>>
>>>> Hi Cory!
>>>>
>>>> Hmmm, curious... I just double-checked the code; there are no more
>>>> references to Scala-versioned "flink-core" or "flink-annotations"
>>>> projects in the code base.
>>>>
>>>> The projects you use with a Scala version suffix look good, actually.
>>>> Just to be safe, can you check that the "flink-java" dependency is
>>>> declared without a suffix?
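>>>>
>>>> Roughly what I would expect in the build definition (just a sketch in
>>>> sbt syntax, not your actual file):
>>>>
>>>>   // single %  : artifact name is used as-is          -> flink-java
>>>>   "org.apache.flink" %  "flink-java"  % "1.0-SNAPSHOT"
>>>>   // double %% : sbt appends the Scala binary version -> flink-scala_2.11
>>>>   "org.apache.flink" %% "flink-scala" % "1.0-SNAPSHOT"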
>>>>
>>>> One other thing I can imagine is a mixed-up dependency cache. Can you
>>>> try to refresh all snapshot dependencies (and maybe clear "~/.sbt/staging/")?
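>>>>
>>>> If the cache is not the problem, it may also help to see which module
>>>> actually pulls in the _2.10 scalatest. One way (an assumption on my
>>>> side, not something in your build) is the sbt-dependency-graph plugin;
>>>> the version below is from memory, so please double-check it:
>>>>
>>>>   // project/plugins.sbt
>>>>   addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")
>>>>
>>>> and then run `dependencyTree` in the sbt shell to see where the
>>>> conflicting suffix comes from.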
>>>>
>>>>
>>>> It is high time for a 1.0 release, so that you no longer need to work
>>>> against SNAPSHOT versions. That should really solve this version-conflict
>>>> pain. If we are fast tomorrow, there may be a nice surprise coming in the
>>>> next few days...
>>>>
>>>> Greetings,
>>>> Stephan
>>>>
>>>>
>>>> On Thu, Feb 11, 2016 at 10:24 PM, Cory Monty <
>>>> cory.mo...@getbraintree.com> wrote:
>>>>
>>>>> Hmm. We don't explicitly include "flink-annotations" and we do not
>>>>> append the Scala suffix for "flink-core":
>>>>>
>>>>> `"org.apache.flink" % "flink-core" % "1.0-SNAPSHOT"`
>>>>>
>>>>> Here are the packages we currently include with a Scala suffix:
>>>>>
>>>>> flink-scala
>>>>> flink-clients
>>>>> flink-streaming-scala
>>>>> flink-connector-kafka-0.8
>>>>> flink-test-utils
>>>>> flink-streaming-contrib
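>>>>>
>>>>> In sbt terms, that is roughly the following (a sketch of our build.sbt
>>>>> for Scala 2.11, trimmed down to the Flink entries):
>>>>>
>>>>>   libraryDependencies ++= Seq(
>>>>>     "org.apache.flink" %  "flink-core"                % "1.0-SNAPSHOT",
>>>>>     "org.apache.flink" %% "flink-scala"               % "1.0-SNAPSHOT",
>>>>>     "org.apache.flink" %% "flink-clients"             % "1.0-SNAPSHOT",
>>>>>     "org.apache.flink" %% "flink-streaming-scala"     % "1.0-SNAPSHOT",
>>>>>     "org.apache.flink" %% "flink-connector-kafka-0.8" % "1.0-SNAPSHOT",
>>>>>     "org.apache.flink" %% "flink-test-utils"          % "1.0-SNAPSHOT",
>>>>>     "org.apache.flink" %% "flink-streaming-contrib"   % "1.0-SNAPSHOT"
>>>>>   )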
>>>>>
>>>>> If there is any documentation you can point to regarding when to
>>>>> include the Scala suffix on Flink packages, let me know.
>>>>>
>>>>>
>>>>>
>>>>> On Thu, Feb 11, 2016 at 2:55 PM, Stephan Ewen <se...@apache.org>
>>>>> wrote:
>>>>>
>>>>>> Hi Cory!
>>>>>>
>>>>>> "flink-core" and "flink-annotations" should not have Scala suffixes,
>>>>>> because they do not depend on Scala.
>>>>>>
>>>>>> So far, we mark the Scala-independent projects without suffixes. Is
>>>>>> that very confusing, or does it interfere with build tools?
>>>>>>
>>>>>> Greetings,
>>>>>> Stephan
>>>>>>
>>>>>>
>>>>>> On Thu, Feb 11, 2016 at 9:50 PM, Cory Monty <
>>>>>> cory.mo...@getbraintree.com> wrote:
>>>>>>
>>>>>>> As of this afternoon, SBT is failing to compile our project with the
>>>>>>> following error:
>>>>>>>
>>>>>>> [error] Modules were resolved with conflicting cross-version suffixes in
>>>>>>> [error]    org.scalatest:scalatest _2.10, _2.11
>>>>>>> [error]    org.apache.flink:flink-core _2.11, <none>
>>>>>>> [error]    org.apache.flink:flink-annotations _2.11, <none>
>>>>>>> java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scalatest:scalatest, org.apache.flink:flink-core, org.apache.flink:flink-annotations
>>>>>>> at scala.sys.package$.error(package.scala:27)
>>>>>>> at sbt.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:46)
>>>>>>> at sbt.ConflictWarning$.apply(ConflictWarning.scala:32)
>>>>>>> at sbt.Classpaths$$anonfun$66.apply(Defaults.scala:1164)
>>>>>>> at sbt.Classpaths$$anonfun$66.apply(Defaults.scala:1161)
>>>>>>> at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
>>>>>>> at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
>>>>>>> at sbt.std.Transform$$anon$4.work(System.scala:63)
>>>>>>> at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
>>>>>>> at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
>>>>>>> at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
>>>>>>> at sbt.Execute.work(Execute.scala:235)
>>>>>>> at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
>>>>>>> at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
>>>>>>> at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
>>>>>>> at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>>>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Any thoughts are greatly appreciated!
>>>>>>>
>>>>>>> Cheers,
>>>>>>>
>>>>>>> Cory
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
