zentol opened a new pull request #10954: [FLINK-15785][travis][e2e] Rework E2E test activations URL: https://github.com/apache/flink/pull/10954

This PR reworks the activation logic for Java end-to-end tests. It introduces a new framework for specifying category inclusions/exclusions in `flink-end-to-end-tests` that is capable of automatically aggregating properties across profiles. Inclusions are defined by setting `e2e.include.<category-class>`, and conversely exclusions by setting `e2e.exclude.<category-class>`. The value of each property is a boolean that controls whether the inclusion/exclusion is enabled. A groovy script searches for all properties matching this naming scheme, extracts the category class, aggregates the inclusions/exclusions, and sets the appropriate properties.

**Background**: Maven does not really support combining categories. The surefire plugin only accepts values like `catA,catB`, rather than a list where individual items have their own XML element. This implies that in order to combine category definitions one would have to concatenate the current value with some other value, i.e., do `categories = categories + "hello"`. Maven does not support this, however, since it is a recursive definition of a property.

The following is a listing of every problem and how it was addressed:

1) **Travis activations are spread out over multiple files (.travis.yml, travis_watchdog.sh)**
They are now consolidated in .travis.yml. While this does introduce some redundancy, it is still preferable because .travis.yml is the most visible Travis file, and it is easier to move the pre-commit activation around.

2) **Setting multiple categories in .travis.yml is incredibly verbose**
Each category now has its own dedicated profile in `flink-end-to-end-tests` that sets the appropriate category. This is significantly more concise and provides an abstraction over the specific categories used.
(In other words, we can modify things at will without breaking setups.)

3) **Hadoop tests are not excluded by default**
They are now disabled by default.

4) **Hadoop exclusions aren't set up (causing FLINK-15685)**
They are now set up.

5) **Automatic Java 11 exclusion is prone to not working as intended**
There is now a dedicated java11 profile in `flink-end-to-end-tests` to ensure the Java 11 exclusions are always applied.
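The aggregation step the groovy script performs can be illustrated with a short sketch. The PR does not show the script itself, so this is a hedged reconstruction in plain Java: it scans properties for the `e2e.include.<category-class>` / `e2e.exclude.<category-class>` naming scheme, keeps only those whose boolean value is `true`, and joins the category classes into the comma-separated lists that the surefire plugin's `groups`/`excludedGroups` parameters accept. The class name and the example category names are hypothetical, not taken from the PR.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

public class E2ECategoryAggregator {

    private static final String INCLUDE_PREFIX = "e2e.include.";
    private static final String EXCLUDE_PREFIX = "e2e.exclude.";

    /**
     * Sketch of the aggregation described in the PR: collect all enabled
     * inclusions/exclusions and produce the comma-separated values that
     * surefire expects for its "groups" / "excludedGroups" parameters.
     */
    static Map<String, String> aggregate(Properties props) {
        List<String> includes = new ArrayList<>();
        List<String> excludes = new ArrayList<>();
        for (String name : props.stringPropertyNames()) {
            // The property value is a boolean toggling the entry on/off.
            if (!Boolean.parseBoolean(props.getProperty(name))) {
                continue;
            }
            if (name.startsWith(INCLUDE_PREFIX)) {
                includes.add(name.substring(INCLUDE_PREFIX.length()));
            } else if (name.startsWith(EXCLUDE_PREFIX)) {
                excludes.add(name.substring(EXCLUDE_PREFIX.length()));
            }
        }
        Map<String, String> result = new HashMap<>();
        // Surefire only accepts a single comma-separated string per parameter,
        // which is exactly why this aggregation cannot be expressed in POM
        // properties alone (no recursive property definitions in Maven).
        result.put("groups", String.join(",", includes));
        result.put("excludedGroups", String.join(",", excludes));
        return result;
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical category classes for illustration only.
        props.setProperty("e2e.include.org.example.categories.PreCommit", "true");
        props.setProperty("e2e.exclude.org.example.categories.Hadoop", "true");
        props.setProperty("e2e.exclude.org.example.categories.Java11", "false");
        Map<String, String> result = aggregate(props);
        System.out.println("groups=" + result.get("groups"));
        System.out.println("excludedGroups=" + result.get("excludedGroups"));
    }
}
```

Because profiles only need to flip boolean properties, multiple active profiles (e.g. a java11 profile plus a Hadoop-exclusion profile) can each contribute entries without ever redefining the same property, sidestepping the recursive-definition limitation described above.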
---------------------------------------------------------------- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services