@Stefano: No problem. Always happy when people test releases :-)
On Fri, Mar 4, 2016 at 12:27 PM, Stefano Baghino <
stefano.bagh...@radicalbit.io> wrote:
Ok, I switched back to 2.10 with the script and tried again: both the
explicit call to the script back to 2.11 and the implicit call via
-Dscala-2.11 worked. I really don't know what happened before. Thank you for
the help, and sorry for disturbing the voting process.
On Fri, Mar 4, 2016 at 12:12 PM,
You are right. Just checked the docs. They are correct.
@Stefano: the docs say that you first change the binary version via
the script and then you can specify the language version via
scala.version.
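Concretely, the two-step procedure from the docs looks roughly like this
(the 2.11.7 patch version below is only an example):

  # first switch the build to the Scala 2.11 binary version
  tools/change-scala-version.sh 2.11
  # then build, optionally pinning the exact Scala language version
  mvn clean install -DskipTests -Dscala.version=2.11.7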
On Fri, Mar 4, 2016 at 11:51 AM, Stephan Ewen wrote:
I'll switch back to Scala 2.10 and try again, I was sure I ran the script
before running the build; maybe something went wrong and I didn't notice.
On Fri, Mar 4, 2016 at 11:51 AM, Stephan Ewen wrote:
Are the docs actually wrong?
In the docs, it says to run the "tools/change-scala-version.sh 2.11" script
first (which implicitly adds the "-Dscala-2.11" flag).
I thought this problem arose because neither the flag was specified nor the
script was run.
On Fri, Mar 4, 2016 at 11:43 AM, Ufuk Celebi wrote:
@Stefano: Yes, would be great to have a fix in the docs and pointers
on how to improve the docs for this.
On Fri, Mar 4, 2016 at 11:41 AM, Stefano Baghino
wrote:
Build successful, thank you.
On Fri, Mar 4, 2016 at 11:24 AM, Stefano Baghino <
stefano.bagh...@radicalbit.io> wrote:
I'll try it immediately, thanks for the quick feedback and sorry for the
intrusion. Should I add this to the docs? The flag seems to be
-Dscala.version=2.11.x in them:
https://ci.apache.org/projects/flink/flink-docs-master/setup/building.html#scala-versions
On Fri, Mar 4, 2016 at 11:20 AM, Stephan Ewen wrote:
AFAIK, you should run `tools/change-scala-version.sh 2.11` before running `mvn
clean install -DskipTests -Dscala-2.11`.
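For reference, the full sequence would look something like this (switching
back is just the same script with 2.10, as far as I know):

  # switch the build to Scala 2.11, then compile with the 2.11 profile
  tools/change-scala-version.sh 2.11
  mvn clean install -DskipTests -Dscala-2.11

  # to return to the default Scala 2.10 build afterwards
  tools/change-scala-version.sh 2.10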
Regards,
Chiwan Park
Sorry, the flag is "-Dscala-2.11"
On Fri, Mar 4, 2016 at 11:19 AM, Stephan Ewen wrote:
Hi!
To compile with Scala 2.11, please use the "-Dscala.version=2.11" flag.
Otherwise the 2.11 specific build profiles will not get properly activated.
Can you try that again?
Thanks,
Stephan
On Fri, Mar 4, 2016 at 11:17 AM, Stefano Baghino <
stefano.bagh...@radicalbit.io> wrote:
I won't cast a vote as I'm not entirely sure this is just a local problem
(and according to the document the Scala 2.11 build has been checked); however, I've
checked out the `release-1.0-rc5` branch and ran `mvn clean install
-DskipTests -Dscala.version=2.11.7`, with a failure on `flink-runtime`:
[ERROR]
+1
Checked LICENSE and NOTICE files
Built against Hadoop 2.6, Scala 2.10, all tests are good
Ran local pseudo cluster with examples
Log files look good, no exceptions
Tested File State Backend
Ran Storm Compatibility Examples
-> minor issue, one example fails (no release blocker in my opinion)
+1
- Checked checksums and signatures
- Verified no binaries in source release
- Checked that source release is building properly
- Build for custom Hadoop version
- Ran start scripts
- Checked log and out files
- Tested in local mode
- Tested in cluster mode
- Tested on cluster with HDFS
- Tested
+1
Checked that the sources don't contain binaries
Tested cluster execution with flink/run and web client job submission
Ran all examples via FliRTT
Tested Kafka 0.9
Verified that quickstarts work with Eclipse and IntelliJ
Ran example with RemoteEnvironment
Verified SBT quickstarts
On Thu, Mar 3,
+1
I think we have a winner. :D
The “boring” tests from the checklist should still hold for this RC and I now
ran a custom windowing job with state on RocksDB on Hadoop 2.7 with Scala 2.11.
I used the Yarn HA mode and shot down both JobManagers and TaskManagers and the
job restarted successfully.
Apparently I was not careful enough when writing the email.
The release branch is "release-1.0.0-rc5" and it's the fifth RC.
On Thu, Mar 3, 2016 at 2:01 PM, Robert Metzger wrote:
> Dear Flink community,
>
> Please vote on releasing the following candidate as Apache Flink version
> 1.0.0.
>
> This