Hey all,

1. It would be nice if we could find more people to also test the streaming 
API. I think it's especially valuable to have people on it who have not used 
it before.

2. Just to make sure: the "assignee" field of each task is a list, i.e. we can 
and should have more than one person testing each task. ;-)

– Ufuk

On 08 Jun 2015, at 19:00, Chiwan Park <chiwanp...@icloud.com> wrote:

> Hi. I’m very excited about preparing a new major release. :)
> I just picked two tests. I will report status as soon as possible.
> 
> Regards,
> Chiwan Park
> 
>> On Jun 9, 2015, at 1:52 AM, Maximilian Michels <m...@apache.org> wrote:
>> 
>> Hi everyone!
>> 
>> As previously discussed, the Flink developer community is very eager to get
>> out a new major release. Apache Flink 0.9.0 will contain lots of new
>> features and many bugfixes. This time, I'll try to coordinate the release
>> process. Feel free to correct me if I'm doing something wrong because I
>> don't know any better :)
>> 
>> To release a great version of Flink to the public, I'd like to ask everyone
>> to test the release candidate. Recently, Flink has received a lot of
>> attention. The expectations are quite high. Only through thorough testing
>> will we be able to satisfy all the Flink users out there.
>> 
>> Below is a list from the Wiki that we use to ensure the legal and
>> functional aspects of a release [1]. What I would like you to do is pick at
>> least one of the tasks, put your name as assignee in the link below, and
>> report back once you have verified it. That way, I hope we can quickly and
>> thoroughly test the release candidate.
>> 
>> https://docs.google.com/document/d/1BhyMPTpAUYA8dG8-vJ3gSAmBUAa0PBSRkxIBPsZxkLs/edit
>> 
>> Best,
>> Max
>> 
>> Git branch: release-0.9-rc1
>> Release binaries: http://people.apache.org/~mxm/flink-0.9.0-rc1/
>> Maven artifacts:
>> https://repository.apache.org/content/repositories/orgapacheflink-1037/
>> PGP public key for verifying the signatures:
>> http://pgp.mit.edu/pks/lookup?op=vindex&search=0xDE976D18C2909CBF
>> 
>> 
>> Legal
>> ====
>> 
>> L.1 Check if checksums and GPG files match the corresponding release files
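>>
>> For example (a sketch; the artifact file names are illustrative):
>>
>>   gpg --keyserver pgp.mit.edu --recv-keys DE976D18C2909CBF
>>   gpg --verify flink-0.9.0-src.tgz.asc flink-0.9.0-src.tgz
>>   md5sum flink-0.9.0-src.tgz     # compare against the provided .md5 file
>>   sha512sum flink-0.9.0-src.tgz  # compare against the provided checksum file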
>> 
>> L.2 Verify that the source archives do NOT contain any binaries
>> 
>> L.3 Check that the source release builds properly with Maven (including the
>> license header check (default) and checkstyle). The tests should also be
>> executed (mvn clean verify)
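>>
>> For example, from the unpacked source archive (a sketch; archive and
>> directory names are illustrative):
>>
>>   tar xzf flink-0.9.0-src.tgz
>>   cd flink-0.9.0
>>   mvn clean verify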
>> 
>> L.4 Verify that the LICENSE and NOTICE files are correct for the binary and
>> source releases.
>> 
>> L.5 All dependencies must be checked for their license and the license must
>> be ASL 2.0 compatible (http://www.apache.org/legal/resolved.html#category-x)
>> * The LICENSE and NOTICE files in the root directory refer to dependencies
>> in the source release, i.e., files in the git repository (such as fonts,
>> css, JavaScript, images)
>> * The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer to
>> the binary distribution and mention all of Flink's Maven dependencies as
>> well
>> 
>> L.6 Check that all POM files point to the same version (mostly relevant to
>> examine quickstart artifact files)
>> 
>> L.7 Read the README.md file
>> 
>> 
>> Functional
>> ========
>> 
>> F.1 Run the start-local.sh/start-local-streaming.sh,
>> start-cluster.sh/start-cluster-streaming.sh, start-webclient.sh scripts and
>> verify that the processes come up
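>>
>> For example, for the local batch setup (a sketch; the streaming and cluster
>> variants work analogously):
>>
>>   ./bin/start-local.sh
>>   jps                  # a JobManager process should show up
>>   ./bin/stop-local.sh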
>> 
>> F.2 Examine the *.out files (should be empty) and the log files (should
>> contain no exceptions)
>> * Test for Linux, OS X, Windows (for Windows as far as possible, not all
>> scripts exist)
>> * Shut down and verify there are no exceptions in the log output (after
>> shutdown)
>> * Check all start+submission scripts for paths with and without spaces
>> (./bin/* scripts are quite fragile for paths with spaces)
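>>
>> One way to exercise the paths-with-spaces case (a sketch; the target
>> directory is arbitrary):
>>
>>   cp -r flink-0.9.0 "/tmp/flink with spaces"
>>   "/tmp/flink with spaces/bin/start-local.sh"
>>   "/tmp/flink with spaces/bin/stop-local.sh"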
>> 
>> F.3 local mode (start-local.sh, see criteria below)
>> F.4 cluster mode (start-cluster.sh, see criteria below)
>> F.5 multi-node cluster (can be simulated locally by starting two taskmanagers,
>> see criteria below)
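>>
>> One way to get a second TaskManager locally (a sketch, assuming
>> bin/taskmanager.sh is shipped as in previous releases):
>>
>>   ./bin/start-cluster.sh
>>   ./bin/taskmanager.sh start   # brings up an additional TaskManager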
>> 
>> Criteria for F.3, F.4, F.5
>> ----------------------------
>> * Verify that the examples are running from both ./bin/flink and from the
>> web-based job submission tool
>> * flink-conf.yaml should define more than one task slot
>> * Results of job are produced and correct
>> ** Check also that the examples run with both the built-in data and
>> external sources (see the sketch after this list).
>> * Examine the log output - no error messages should be encountered
>> ** Web interface shows progress and finished job in history
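>>
>> For example, with the WordCount example (a sketch; the jar path and name are
>> illustrative):
>>
>>   # built-in data when no arguments are given
>>   ./bin/flink run ./examples/flink-java-examples-0.9.0-WordCount.jar
>>   # external source and sink
>>   ./bin/flink run ./examples/flink-java-examples-0.9.0-WordCount.jar \
>>       file:///tmp/wc-in.txt file:///tmp/wc-out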
>> 
>> 
>> F.6 Test on a cluster with HDFS.
>> * Check that a good number of input splits are read locally (the JobManager
>> log reveals local assignments)
>> 
>> F.7 Test against a Kafka installation
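>>
>> A minimal local setup for this (a sketch, assuming a Kafka 0.8.x quickstart
>> installation):
>>
>>   bin/zookeeper-server-start.sh config/zookeeper.properties &
>>   bin/kafka-server-start.sh config/server.properties &
>>   bin/kafka-topics.sh --create --zookeeper localhost:2181 \
>>       --replication-factor 1 --partitions 1 --topic flink-test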
>> 
>> F.8 Test the ./bin/flink command line client
>> * Test "info" option, paste the JSON into the plan visualizer HTML file,
>> check that plan is rendered
>> * Test the parallelism flag (-p) to override the configured default
>> parallelism
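>>
>> For example (a sketch; the example jar path is illustrative, and the info
>> action may need an extra option to print the JSON plan):
>>
>>   ./bin/flink info ./examples/flink-java-examples-0.9.0-WordCount.jar
>>   ./bin/flink run -p 4 ./examples/flink-java-examples-0.9.0-WordCount.jar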
>> 
>> F.9 Verify the plan visualizer with different browsers/operating systems
>> 
>> F.10 Verify that the quickstarts for Scala and Java are working with the
>> staging repository for both IntelliJ and Eclipse.
>> * In particular the dependencies of the quickstart project need to be set
>> correctly and the QS project needs to build from the staging repository
>> (replace the snapshot repo URL with the staging repo URL)
>> * The dependency tree of the QuickStart project must not contain any
>> dependencies we shade away upstream (guava, netty, ...)
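>>
>> One way to spot shaded-away dependencies leaking into the quickstart project
>> (a sketch, run inside the generated quickstart project):
>>
>>   mvn dependency:tree | grep -iE "guava|netty"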
>> 
>> F.11 Run examples on a YARN cluster
>> 
>> F.12 Run all examples from the IDE (Eclipse & IntelliJ)
>> 
>> F.13 Run an example with the RemoteEnvironment against a cluster started
>> from the shell script
>> 
>> F.14 Run the manual tests in the "flink-tests" module.
>> * Marked with the @Ignore annotation.
>> 
>> 
>> [1] https://cwiki.apache.org/confluence/display/FLINK/Releasing