zhuzhurk commented on a change in pull request #7255: [FLINK-10945] Use InputDependencyConstraint to avoid resource dead…
URL: https://github.com/apache/flink/pull/7255#discussion_r246435193
 
 

 ##########
 File path: flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/ExecutionVertex.java
 ##########
 @@ -726,6 +730,41 @@ void sendPartitionInfos() {
                }
        }
 
+       /**
+        * Check whether the InputDependencyConstraint is satisfied for this vertex.
+        *
+        * @return whether the input constraint is satisfied
+        */
+       public boolean checkInputDependencyConstraints() {
+               if (getExecutionGraph().getInputDependencyConstraint() == InputDependencyConstraint.ANY) {
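
A self-contained sketch of the ANY/ALL semantics behind the check quoted above. This is not the actual `ExecutionVertex` code from the PR (the quoted hunk is truncated by the review excerpt); the class and method names here are hypothetical and only illustrate the idea.

```java
import java.util.Arrays;
import java.util.List;

/** Illustrates ANY vs. ALL input-dependency-constraint semantics (hypothetical names). */
public class InputConstraintExample {

	/** Mirrors the two values of Flink's InputDependencyConstraint enum. */
	enum Constraint { ANY, ALL }

	/**
	 * Returns true if a vertex may be scheduled, given which of its inputs
	 * currently have consumable data.
	 */
	static boolean inputConstraintSatisfied(Constraint constraint, List<Boolean> inputsConsumable) {
		if (inputsConsumable.isEmpty()) {
			// A source vertex has no inputs and is always schedulable.
			return true;
		}
		switch (constraint) {
			case ANY:
				// ANY: at least one input must have consumable data.
				return inputsConsumable.contains(Boolean.TRUE);
			case ALL:
				// ALL: every input must have consumable data.
				return !inputsConsumable.contains(Boolean.FALSE);
			default:
				throw new IllegalStateException("Unknown constraint: " + constraint);
		}
	}

	public static void main(String[] args) {
		List<Boolean> inputs = Arrays.asList(true, false);
		System.out.println(inputConstraintSatisfied(Constraint.ANY, inputs)); // prints true
		System.out.println(inputConstraintSatisfied(Constraint.ALL, inputs)); // prints false
	}
}
```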
 
 Review comment:
   I've moved `InputDependencyConstraint` to `JobVertex`, and the job-wide default value can be configured in `ExecutionConfig`. However, I haven't made it configurable through the DataSet/DataStream API yet.
   
   I agree we should make the constraint configurable per operator. But I'm not quite sure whether we should support that in the DataSet API now, or later in the stream/batch unified StreamGraph/Transformation API. Could you share your suggestion?
   
   In our production experience, a job-wide input constraint, together with the `BATCH_FORCED` execution mode, satisfies most users and ensures a batch job can finish with limited resources.
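
   For reference, a minimal sketch of what the job-wide configuration described above could look like in user code. The setter name `setDefaultInputDependencyConstraint` is an assumption here (the comment only states that the job-wide default lives in `ExecutionConfig`); the `BATCH_FORCED` execution mode is the existing `ExecutionMode` setting mentioned above.

```java
import org.apache.flink.api.common.ExecutionConfig;
import org.apache.flink.api.common.ExecutionMode;
import org.apache.flink.api.common.InputDependencyConstraint;
import org.apache.flink.api.java.ExecutionEnvironment;

public class JobWideConstraintExample {
	public static void main(String[] args) {
		ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
		ExecutionConfig config = env.getConfig();

		// Use fully blocking data exchanges so a batch job can run stage by stage
		// with limited resources.
		config.setExecutionMode(ExecutionMode.BATCH_FORCED);

		// Schedule a vertex only once ALL of its inputs are consumable.
		// NOTE: assumed setter name; the thread only states that the job-wide
		// default is configured in ExecutionConfig.
		config.setDefaultInputDependencyConstraint(InputDependencyConstraint.ALL);

		// Build and execute the DataSet program as usual afterwards.
	}
}
```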

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
