Hi Aegeaner,

I have replied to you on FLINK-7036.

Regards,
Jark

2017-06-29 23:45 GMT+08:00 郭健 <guo.j...@immomo.com>:

> Thanks Jark,
> I have captured the generated code as follows, and noticed one line, `int
> result$4;`; maybe it’s the result of the ExpressionReducer’s type cast.
>
>
>       public class ExpressionReducer$6
>           extends org.apache.flink.api.common.functions.RichMapFunction {
>
>         transient org.apache.flink.types.Row out =
>             new org.apache.flink.types.Row(1);
>
>         public ExpressionReducer$6() throws Exception {
>         }
>
>         @Override
>         public void open(org.apache.flink.configuration.Configuration parameters)
>             throws Exception {
>         }
>
>         @Override
>         public Object map(Object _in1) throws Exception {
>           org.apache.flink.types.Row in1 = (org.apache.flink.types.Row) _in1;
>
>           java.lang.String result$0 = "01,5,2013";
>           boolean isNull$1 = false;
>
>           java.lang.String result$2 = "%d,%m,%Y";
>           boolean isNull$3 = false;
>
>           boolean isNull$5 = isNull$1 || isNull$3;
>           int result$4;
>           if (isNull$5) {
>             result$4 = -1;
>           }
>           else {
>             result$4 =
>               org.apache.flink.table.functions.utils.DateTimeFunctions.strToDate(result$0, result$2);
>           }
>
>           if (isNull$5) {
>             out.setField(0, null);
>           }
>           else {
>             out.setField(0, result$4);
>           }
>
>           return out;
>         }
>
>         @Override
>         public void close() throws Exception {
>         }
>       }
>
>
>
> On 6/29/17, 20:24, "Jark Wu" <j...@apache.org> wrote:
>
>     That's weird. Can you print the generated code?
>
>     Add this line before
>     https://github.com/apache/flink/blob/master/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/codegen/ExpressionReducer.scala#L96
>
>     println(generatedFunction.code)
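>
>     i.e. right before the compile call that reduce() makes around that line
>     (the surrounding names here are only assumed context, not the exact
>     source; the println is the part that matters):
>
>         println(generatedFunction.code)   // dump the generated Java source
>         val clazz = compile(getClass.getClassLoader,
>           generatedFunction.name, generatedFunction.code)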
>
>     2017-06-29 14:50 GMT+08:00 郭健 <guo.j...@immomo.com>:
>
>     > Hi Jark Wu,
>     >         I did see the wrong result type fail the CodeGen compile stage;
>     > it throws the exception stack below, even before the ExpressionReducer
>     > actually restores the original return type:
>     >
>     > org.apache.flink.api.common.InvalidProgramException: Table program cannot
>     > be compiled. This is a bug. Please file an issue.
>     >
>     >         at org.apache.flink.table.codegen.Compiler$class.compile(Compiler.scala:36)
>     >         at org.apache.flink.table.codegen.ExpressionReducer.compile(ExpressionReducer.scala:38)
>     >         at org.apache.flink.table.codegen.ExpressionReducer.reduce(ExpressionReducer.scala:96)
>     >         at org.apache.calcite.rel.rules.ReduceExpressionsRule.reduceExpressionsInternal(ReduceExpressionsRule.java:549)
>     >         at org.apache.calcite.rel.rules.ReduceExpressionsRule.reduceExpressions(ReduceExpressionsRule.java:470)
>     >         at org.apache.calcite.rel.rules.ReduceExpressionsRule.reduceExpressions(ReduceExpressionsRule.java:447)
>     >         at org.apache.calcite.rel.rules.ReduceExpressionsRule$ProjectReduceExpressionsRule.onMatch(ReduceExpressionsRule.java:270)
>     >         at org.apache.calcite.plan.AbstractRelOptPlanner.fireRule(AbstractRelOptPlanner.java:317)
>     >         at org.apache.calcite.plan.hep.HepPlanner.applyRule(HepPlanner.java:506)
>     >         at org.apache.calcite.plan.hep.HepPlanner.applyRules(HepPlanner.java:385)
>     >         at org.apache.calcite.plan.hep.HepPlanner.executeInstruction(HepPlanner.java:251)
>     >         at org.apache.calcite.plan.hep.HepInstruction$RuleInstance.execute(HepInstruction.java:125)
>     >         at org.apache.calcite.plan.hep.HepPlanner.executeProgram(HepPlanner.java:210)
>     >         at org.apache.calcite.plan.hep.HepPlanner.findBestExp(HepPlanner.java:197)
>     >         at org.apache.flink.table.expressions.utils.ExpressionTestBase.addSqlTestExpr(ExpressionTestBase.scala:194)
>     >         at org.apache.flink.table.expressions.utils.ExpressionTestBase.testSqlApi(ExpressionTestBase.scala:277)
>     >         at org.apache.flink.table.expressions.ScalarFunctionsTest.testStrToDate(ScalarFunctionsTest.scala:1516)
>     >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     >         at java.lang.reflect.Method.invoke(Method.java:498)
>     >         at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>     >         at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     >         at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>     >         at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     >         at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>     >         at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>     >         at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>     >         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>     >         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>     >         at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>     >         at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>     >         at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>     >         at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>     >         at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>     >         at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>     >         at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>     >         at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
>     >         at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:51)
>     >         at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
>     >         at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
>     > Caused by: org.codehaus.commons.compiler.CompileException: Line 49, Column 23:
>     > Assignment conversion not possible from type "java.sql.Date" to type "int"
>     >         at org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:11672)
>     >         at org.codehaus.janino.UnitCompiler.assignmentConversion(UnitCompiler.java:10528)
>     >         at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:3452)
>     >         at org.codehaus.janino.UnitCompiler.access$5200(UnitCompiler.java:212)
>     >         at org.codehaus.janino.UnitCompiler$9.visitAssignment(UnitCompiler.java:3416)
>     >         at org.codehaus.janino.UnitCompiler$9.visitAssignment(UnitCompiler.java:3396)
>     >         at org.codehaus.janino.Java$Assignment.accept(Java.java:4300)
>     >         at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:3396)
>     >         at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2316)
>     >         at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:212)
>     >         at org.codehaus.janino.UnitCompiler$6.visitExpressionStatement(UnitCompiler.java:1450)
>     >         at org.codehaus.janino.UnitCompiler$6.visitExpressionStatement(UnitCompiler.java:1443)
>     >         at org.codehaus.janino.Java$ExpressionStatement.accept(Java.java:2848)
>     >         at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1443)
>     >         at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1523)
>     >         at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1509)
>     >         at org.codehaus.janino.UnitCompiler.access$1600(UnitCompiler.java:212)
>     >         at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1449)
>     >         at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1443)
>     >         at org.codehaus.janino.Java$Block.accept(Java.java:2753)
>     >         at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1443)
>     >         at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2424)
>     >         at org.codehaus.janino.UnitCompiler.access$1800(UnitCompiler.java:212)
>     >         at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1451)
>     >         at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1443)
>     >         at org.codehaus.janino.Java$IfStatement.accept(Java.java:2923)
>     >         at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1443)
>     >         at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1523)
>     >         at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:3052)
>     >         at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1313)
>     >         at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1286)
>     >         at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:785)
>     >         at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:436)
>     >         at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:212)
>     >         at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:390)
>     >         at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:385)
>     >         at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1405)
>     >         at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:385)
>     >         at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:357)
>     >         at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:234)
>     >         at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:446)
>     >         at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:213)
>     >         at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:204)
>     >         at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:80)
>     >         at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:75)
>     >         at org.apache.flink.table.codegen.Compiler$class.compile(Compiler.scala:33)
>     >         ... 40 more
>     >
>     >
>     >
>     >
>     >
>     > On 6/29/17, 14:37, "Jark Wu" <j...@apache.org> wrote:
>     >
>     >     Hi Aegeaner,
>     >
>     >     First of all, the ExpressionReducer actually restores the original
>     >     return type after reducing, see
>     >     https://github.com/apache/flink/blob/master/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/codegen/ExpressionReducer.scala#L122
>     >
>     >     So the reduced result and type should be correct. Did you find the
>     >     wrong return type?
>     >
>     >     The `RexBuilder.makeLiteral(Object value, RelDataType type, boolean
>     >     allowCast)` accepts any value and will cast the Integer back to Date
>     >     internally.
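>     >
>     >     For example, a minimal sketch (assuming `rexBuilder` is the planner's
>     >     RexBuilder; the helper name is made up, this is not the actual Flink code):
>     >
>     >         import org.apache.calcite.rex.{RexBuilder, RexNode}
>     >         import org.apache.calcite.sql.type.SqlTypeName
>     >
>     >         // hypothetical helper: turn the reduced Integer back into a DATE
>     >         // literal; allowCast = true lets makeLiteral insert the
>     >         // INTEGER -> DATE conversion itself
>     >         def restoreDateLiteral(rexBuilder: RexBuilder, reducedDays: Integer): RexNode = {
>     >           val dateType = rexBuilder.getTypeFactory.createSqlType(SqlTypeName.DATE)
>     >           rexBuilder.makeLiteral(reducedDays, dateType, true)
>     >         }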
>     >
>     >     Regards,
>     >     Jark Wu
>     >
>     >
>     >     2017-06-29 12:38 GMT+08:00 郭健 <guo.j...@immomo.com>:
>     >
>     >     > Hi all,
>     >     >             I am implementing a STR_TO_DATE scalar SQL function for
>     >     > Flink, and found the return type cast from java.sql.Date to Integer in
>     >     > Flink’s ExpressionReducer:
>     >     > https://github.com/apache/flink/blob/master/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/codegen/ExpressionReducer.scala#L56
>     >     >
>     >     > // we need to cast here for RexBuilder.makeLiteral
>     >     >       case (SqlTypeName.DATE, e) =>
>     >     >         Some(
>     >     >           rexBuilder.makeCast(
>     >     >             typeFactory.createTypeFromTypeInfo(BasicTypeInfo.INT_TYPE_INFO), e)
>     >     >         )
>     >     >
>     >     >             so str_to_date('01,5,2013','%d,%m,%Y') must return an
>     >     >             Integer, which conflicts with my implementation.
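>     >     >
>     >     >             A function with roughly this shape (sketch only, the real
>     >     >             implementation may differ) returns java.sql.Date, which is
>     >     >             exactly what cannot be assigned to the generated `int result$4`:
>     >     >
>     >     >             object DateTimeFunctions {
>     >     >               // sketch: translate the MySQL-style tokens into a
>     >     >               // java.text.SimpleDateFormat pattern (mapping is hypothetical)
>     >     >               def strToDate(value: String, format: String): java.sql.Date = {
>     >     >                 val javaPattern = format
>     >     >                   .replace("%d", "dd").replace("%m", "MM").replace("%Y", "yyyy")
>     >     >                 val millis = new java.text.SimpleDateFormat(javaPattern).parse(value).getTime
>     >     >                 new java.sql.Date(millis)
>     >     >               }
>     >     >             }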
>     >     >
>     >     >             My question is: why should we do this? I have seen in the
>     >     >             comments that the reason is “we need to cast here for
>     >     >             RexBuilder.makeLiteral”, but is it reasonable to change a user
>     >     >             function’s return type? Should we restore the original return
>     >     >             type after the reduce?
>     >     >
>     >     >
>     >     > Thanks,
>     >     > Aegeaner
>     >     >
>     >     >
>     >     >
>     >
>     >
>     >
>
>
>
