Github user wuchong commented on the issue:

    https://github.com/apache/flink/pull/3141
  
    Hi @lincoln-lil, thank you for the PR. I think the issue only affects 
1.2.0, so please open a new pull request against `release-1.2`. For 
`master` we will fix the bug directly. 
    
    I have taken a quick look at it. I think a better approach is to disable 
outer joins with non-equality predicates in 
[`DataSetJoinRule`](https://github.com/apache/flink/blob/master/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/rules/dataSet/DataSetJoinRule.scala#L43)
 instead of mixing the check into the SQL translation, e.g.: 
    
    ```scala
    override def matches(call: RelOptRuleCall): Boolean = {
      val join: LogicalJoin = call.rel(0).asInstanceOf[LogicalJoin]

      val joinInfo = join.analyzeCondition

      // joins require an equi-condition or a conjunctive predicate with at
      // least one equi-condition, and outer joins with non-equality
      // predicates are not supported currently
      !joinInfo.pairs().isEmpty && (join.getJoinType == JoinRelType.INNER || joinInfo.isEqui)
    }
    ```
    
     Please also add a check in 
[`operators.scala`](https://github.com/apache/flink/blob/release-1.2/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/logical/operators.scala#L455)
 so that Table API outer joins fail with a meaningful exception. 
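
    A rough sketch of what such a check could look like (just an illustration: 
the helper name `checkOuterJoinCondition` is made up, and `failValidation` as 
well as the expression classes `And`, `EqualTo`, `ResolvedFieldReference` are 
assumed to be the ones already available in `operators.scala`; the actual 
traversal may need adjusting):

    ```scala
    // Sketch only -- rejects outer joins whose (conjunctive) predicate contains
    // anything other than plain field equalities.
    private def checkOuterJoinCondition(isOuterJoin: Boolean, condition: Expression): Unit = {

      // split a conjunctive predicate into its single conjuncts
      def conjuncts(expr: Expression): Seq[Expression] = expr match {
        case And(l, r) => conjuncts(l) ++ conjuncts(r)
        case e => Seq(e)
      }

      if (isOuterJoin) {
        val hasNonEqui = conjuncts(condition).exists {
          // plain equality between two field references is supported
          case EqualTo(_: ResolvedFieldReference, _: ResolvedFieldReference) => false
          // everything else (<, >, nested expressions, ...) is a non-equality predicate
          case _ => true
        }
        if (hasNonEqui) {
          failValidation("Outer joins with non-equality predicates are not supported.")
        }
      }
    }
    ```

    The check above is intentionally coarse (it does not verify that the two 
fields come from different inputs), but it should be enough to reject 
unsupported outer joins with a clear message instead of failing later during 
translation.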

