+1 for lots of streaming tests

On Mon, Mar 23, 2015 at 11:23 AM, Márton Balassi <balassi.mar...@gmail.com>
wrote:

> Thanks for looking into this, Stephan. +1 for the JIRAs.
>
> On Mon, Mar 23, 2015 at 10:55 AM, Ufuk Celebi <u...@apache.org> wrote:
>
> > On 23 Mar 2015, at 10:44, Stephan Ewen <se...@apache.org> wrote:
> >
> > > Hi everyone!
> > >
> > > With the streaming stuff getting heavier exposure, I think it needs a
> > > few more tests. With so many changes, untested features are running a
> > > high risk of being "patched away" by accident.
> > >
> > > For the runtime and batch API part, we go with the policy that every
> > > new feature can only be merged if it is properly backed by tests. The
> > > streaming API should now follow the same paradigm, in my opinion.
> > >
> > > Here are some tests that I suggest adding:
> > >
> > > 1) Settings/Configuration properly forwarded from ExecutionEnvironment
> > > to JobGraph
> > >
> > > 2) Isolated tests for heavy and critical utilities (like the barrier
> > > buffer)
> > >
> > > 3) Tests on the behavior of the abstract streaming vertex. Checks that
> > >   - RuntimeContext is properly initialized
> > >   - open() and close() are always called (on RichFunctions)
> > >   - cancelling works correctly
> > >   - close() is called when cancelling a job
> > >   - barriers are forwarded
> > >
> > > 4) Tests for the JobGraph construction
> > >   - parallelism properly configured
> > >   - connections and partitioners properly set
> > >
> > > 5) Tests for the chaining construction
> > >   - functions that can be chained and that cannot be chained
> > >   - chaining should not be affected by parallelism settings
> > >
> > > Do you agree? Should I open a series of JIRAs for this?
> >
> > I agree with your assessment. +1 to open the respective issues to track
> > this.
> >
>
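
As a rough illustration of item 3 above, a lifecycle test for open() and
close() on RichFunctions could look roughly like the sketch below. This is a
minimal sketch, not part of the proposal: it assumes JUnit, a local execution
environment running in the same JVM (so the static flags are visible to the
test), and only public streaming API; all class and test names are made up.

    import static org.junit.Assert.assertTrue;

    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.junit.Test;

    public class RichFunctionLifecycleTest {

        // static flags so the test can observe what happened inside the task
        private static volatile boolean openCalled = false;
        private static volatile boolean closeCalled = false;

        @Test
        public void openAndCloseAreCalledOnRichFunctions() throws Exception {
            StreamExecutionEnvironment env =
                    StreamExecutionEnvironment.getExecutionEnvironment();
            env.setParallelism(1);

            env.fromElements(1, 2, 3)
                    .map(new TrackingMapper())
                    .print();

            env.execute("rich function lifecycle test");

            assertTrue("open() was not called", openCalled);
            assertTrue("close() was not called", closeCalled);
        }

        private static class TrackingMapper extends RichMapFunction<Integer, Integer> {

            @Override
            public void open(Configuration parameters) {
                openCalled = true;
            }

            @Override
            public Integer map(Integer value) {
                return value;
            }

            @Override
            public void close() {
                closeCalled = true;
            }
        }
    }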

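For items 4 and 5, a JobGraph construction test could translate a small
program and assert on the resulting graph, along the lines of the sketch
below. It assumes the StreamGraph-to-JobGraph translation is reachable via
env.getStreamGraph().getJobGraph() (which may be internal API depending on
the version) and that default chaining collapses map, filter and the sink
into one task; the expected vertex count and all names are illustrative
assumptions, not verified behavior.

    import static org.junit.Assert.assertEquals;

    import org.apache.flink.api.common.functions.FilterFunction;
    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.runtime.jobgraph.JobGraph;
    import org.apache.flink.runtime.jobgraph.JobVertex;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.junit.Test;

    public class StreamingJobGraphConstructionTest {

        @Test
        public void parallelismAndChainingAreReflectedInTheJobGraph() {
            StreamExecutionEnvironment env =
                    StreamExecutionEnvironment.getExecutionEnvironment();
            env.setParallelism(4);

            // map, filter and the print sink run at the configured parallelism
            // and should chain into a single task
            env.fromElements("a", "b", "c")
                    .map(new MapFunction<String, String>() {
                        @Override
                        public String map(String value) {
                            return value.toUpperCase();
                        }
                    })
                    .filter(new FilterFunction<String>() {
                        @Override
                        public boolean filter(String value) {
                            return !value.isEmpty();
                        }
                    })
                    .print();

            JobGraph jobGraph = env.getStreamGraph().getJobGraph();

            // expected: one vertex for the non-parallel collection source and
            // one vertex for the chained map -> filter -> sink
            assertEquals(2, jobGraph.getNumberOfVertices());

            for (JobVertex vertex : jobGraph.getVertices()) {
                // the chained vertex carries the configured parallelism; the
                // collection source is non-parallel by design
                if (!vertex.isInputVertex()) {
                    assertEquals(4, vertex.getParallelism());
                }
            }
        }
    }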