[ https://issues.apache.org/jira/browse/HIVE-14113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15353399#comment-15353399 ]
Hive QA commented on HIVE-14113:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12814010/HIVE-14113.1.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 4 failed/errored test(s), 10258 tests executed

*Failed tests:*
{noformat}
TestMiniTezCliDriver-vectorization_13.q-tez_bmj_schema_evolution.q-schema_evol_text_nonvec_mapwork_part_all_primitive.q-and-12-more - did not produce a TEST-*.xml file
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver_vector_complex_all
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver_vector_complex_join
org.apache.hive.minikdc.TestJdbcNonKrbSASLWithMiniKdc.org.apache.hive.minikdc.TestJdbcNonKrbSASLWithMiniKdc
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-MASTER-Build/289/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-MASTER-Build/289/console
Test logs: http://ec2-50-18-27-0.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-MASTER-Build-289/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 4 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12814010 - PreCommit-HIVE-MASTER-Build

> Create function failed but function in show function list
> ---------------------------------------------------------
>
>                 Key: HIVE-14113
>                 URL: https://issues.apache.org/jira/browse/HIVE-14113
>             Project: Hive
>          Issue Type: Bug
>          Components: UDF
>    Affects Versions: 1.2.0
>            Reporter: niklaus xiao
>            Assignee: Navis
>             Fix For: 1.3.0
>
>         Attachments: HIVE-14113.1.patch
>
>
> 1. create function with an invalid hdfs path; /udf/udf-test.jar does not exist
> {quote}
> create function my_lower as 'com.tang.UDFLower' using jar 'hdfs:///udf/udf-test.jar';
> {quote}
> Failed with the following exception:
> {quote}
> 0: jdbc:hive2://189.39.151.44:10000/> create function my_lower as 'com.tang.UDFLower' using jar 'hdfs:///udf/udf-test.jar';
> INFO : converting to local hdfs:///udf/udf-test.jar
> ERROR : Failed to read external resource hdfs:///udf/udf-test.jar
> java.lang.RuntimeException: Failed to read external resource hdfs:///udf/udf-test.jar
> 	at org.apache.hadoop.hive.ql.session.SessionState.downloadResource(SessionState.java:1384)
> 	at org.apache.hadoop.hive.ql.session.SessionState.resolveAndDownload(SessionState.java:1340)
> 	at org.apache.hadoop.hive.ql.session.SessionState.add_resources(SessionState.java:1264)
> 	at org.apache.hadoop.hive.ql.session.SessionState.add_resources(SessionState.java:1250)
> 	at org.apache.hadoop.hive.ql.exec.FunctionTask.addFunctionResources(FunctionTask.java:306)
> 	at org.apache.hadoop.hive.ql.exec.Registry.registerToSessionRegistry(Registry.java:466)
> 	at org.apache.hadoop.hive.ql.exec.Registry.registerPermanentFunction(Registry.java:206)
> 	at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerPermanentFunction(FunctionRegistry.java:1551)
> 	at org.apache.hadoop.hive.ql.exec.FunctionTask.createPermanentFunction(FunctionTask.java:136)
> 	at org.apache.hadoop.hive.ql.exec.FunctionTask.execute(FunctionTask.java:75)
> 	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:158)
> 	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:101)
> 	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1965)
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1723)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1475)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1283)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1278)
> 	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:167)
> 	at org.apache.hive.service.cli.operation.SQLOperation.access$200(SQLOperation.java:75)
> 	at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:245)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1711)
> 	at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:258)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.FileNotFoundException: File does not exist: hdfs:/udf/udf-test.jar
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1391)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1383)
> 	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1383)
> 	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:340)
> 	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
> 	at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2034)
> 	at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2003)
> 	at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1979)
> 	at org.apache.hadoop.hive.ql.session.SessionState.downloadResource(SessionState.java:1370)
> 	... 28 more
> ERROR : Failed to register default.my_lower using class com.tang.UDFLower
> Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.FunctionTask (state=08S01,code=1)
> {quote}
> 2. Execute show functions; the failed function my_lower is in the function list
> {quote}
> 0: jdbc:hive2://189.39.151.44:21066/> show functions;
> +-------------------------+--+
> |        tab_name         |
> +-------------------------+--+
> | day                     |
> | dayofmonth              |
> | decode                  |
> | default.my_lower        |
> | degrees                 |
> | dense_rank              |
> 0: jdbc:hive2://189.39.151.44:10000/> select my_lower(name) from stu;
> Error: Error while compiling statement: FAILED: SemanticException [Error 10011]: Invalid function my_lower (state=42000,code=10011)
> {quote}
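The stack trace above suggests that the function name becomes visible to {{show functions}} during registration, before the jar resource has been successfully downloaded, and that the entry is apparently not rolled back when the download fails. The following is a purely illustrative sketch of a rollback-on-failure pattern; all class and method names are hypothetical, it does not mirror Hive's actual Registry API, and it is not the content of HIVE-14113.1.patch.

{code:java}
// Hypothetical, simplified model of the failure mode described in this issue.
// Names are illustrative only; this is not Hive's Registry implementation.
import java.util.Set;
import java.util.TreeSet;

public class FunctionRegistrySketch {

    private final Set<String> registered = new TreeSet<>();

    /** Registers a function, rolling the entry back if its jar cannot be loaded. */
    public void createPermanentFunction(String name, String jarUri) {
        registered.add(name);            // name becomes visible to "show functions"
        try {
            downloadResource(jarUri);    // may fail for a missing jar
        } catch (RuntimeException e) {
            registered.remove(name);     // roll back so the listing stays consistent
            throw e;
        }
    }

    private void downloadResource(String jarUri) {
        // Stand-in for copying the jar from HDFS to the local file system; here it
        // always fails, mirroring the FileNotFoundException for the missing jar above.
        throw new RuntimeException("Failed to read external resource " + jarUri);
    }

    public Set<String> showFunctions() {
        return registered;
    }

    public static void main(String[] args) {
        FunctionRegistrySketch registry = new FunctionRegistrySketch();
        try {
            registry.createPermanentFunction("default.my_lower", "hdfs:///udf/udf-test.jar");
        } catch (RuntimeException e) {
            System.out.println("CREATE FUNCTION failed: " + e.getMessage());
        }
        // With the rollback in place, the failed function is no longer listed.
        System.out.println("show functions -> " + registry.showFunctions());
    }
}
{code}

With a rollback along these lines, a failed {{create function}} would leave no stale entry behind, so {{show functions}} and subsequent queries would report a consistent state.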