[ https://issues.apache.org/jira/browse/FLINK-7047?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chesnay Schepler updated FLINK-7047:
------------------------------------
    Description: 
With the current build times once again hitting the timeout, it is time to 
revisit our approach.

The current approach of splitting all tests by name, while easy to maintain and 
extend, has the big disadvantage that it's fairly binary with regard to the 
timeout: either we're below the timeout and all builds pass, or we're above it and 
the entire merging process stalls. Furthermore, it requires all modules to be 
compiled.


I propose a different approach: bundle the modules into groups, execute only the 
tests of the modules in a group, and compile only the modules required for those 
tests (see the sketch after the group list below).

My current suggestion is 5 groups, which will result in 10 build profiles in total.
The groups are:
# *core* - core flink modules like core, runtime, streaming-java, metrics, rocksdb
# *libraries* - flink-libraries and flink-storm
# *connectors* - flink-connectors, flink-connector-wikiedits, 
flink-tweet-inputformat
# *tests* - flink-tests
# *cluster+examples+dist* - flink-yarn, flink-yarn-tests, flink-mesos, flink-examples, flink-dist
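
As a rough sketch of what a per-group build could look like with standard Maven reactor options (the *tests* group is only used as an example here; the exact module selection per group is an assumption still to be worked out):

{code}
# Sketch only: first build the dependencies of the selected modules without
# running their tests, then run the tests of the selected modules alone.
# -pl picks the modules to build, -am ("also make") adds their dependencies.
mvn clean install -DskipTests -pl flink-tests -am
mvn verify -pl flink-tests
{code}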

To keep the total number of profiles from growing out of hand, I also propose to 
test against only 2 combinations of JDK + Hadoop + Scala (sketched below):
# oraclejdk8 + hadoop 2.8.0 + scala 2.11
# openjdk7 + hadoop 2.4.1 + scala 2.10
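
For illustration, the two combinations could map to invocations along these lines. This is a sketch only: it assumes the existing -Dhadoop.version property, and the Scala switch shown is an assumption that would have to match whatever the root pom actually defines.

{code}
# Sketch only; the Scala profile switch is an assumption.
mvn verify -Dhadoop.version=2.8.0 -Dscala-2.11   # combination 1, run under oraclejdk8
mvn verify -Dhadoop.version=2.4.1 -Dscala-2.10   # combination 2, run under openjdk7
{code}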

My current estimate is that no profile will take more than 40 minutes.

  was:
With the current build times once again hitting the timeout, it is time to 
revisit our approach.

The current approach of splitting all tests by name, while easy to maintain and 
extend, has the big disadvantage that it's fairly binary with regard to the 
timeout: either we're below the timeout and all builds pass, or we're above it and 
the entire merging process stalls.


I propose a different approach: bundle the modules into groups, execute only the 
tests of the modules in a group, and compile only the modules required for those 
tests.

My current suggestion is 5 groups, which will result in 10 build profiles in total.
The groups are:
# *core* - core flink modules like core, runtime, streaming-java, metrics, rocksdb
# *libraries* - flink-libraries and flink-storm
# *connectors* - flink-connectors, flink-connector-wikiedits, 
flink-tweet-inputformat
# *tests* - flink-tests
# *cluster+examples+dist* - flink-yarn, flink-yarn-tests, flink-mesos, flink-examples, flink-dist

To keep the total number of profiles from growing out of hand, I also propose to 
test against only 2 combinations of JDK + Hadoop + Scala:
# oraclejdk8 + hadoop 2.8.0 + scala 2.11
# openjdk7 + hadoop 2.4.1 + scala 2.10

My current estimate is that no profile will take more than 40 minutes.


> Reorganize build profiles
> -------------------------
>
>                 Key: FLINK-7047
>                 URL: https://issues.apache.org/jira/browse/FLINK-7047
>             Project: Flink
>          Issue Type: Improvement
>          Components: Tests, Travis
>    Affects Versions: 1.4.0
>            Reporter: Chesnay Schepler
>            Assignee: Chesnay Schepler
>             Fix For: 1.4.0
>



