[
https://issues.apache.org/jira/browse/HIVE-8735?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14197979#comment-14197979
]
Hive QA commented on HIVE-8735:
-------------------------------
{color:red}Overall{color}: -1 at least one test failed
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12679376/HIVE-8735.patch
{color:red}ERROR:{color} -1 due to 2 failed/errored test(s), 6671 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_optimize_nullscan
org.apache.hive.hcatalog.listener.TestNotificationListener.testAMQListener
{noformat}
Test results:
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/1641/testReport
Console output:
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/1641/console
Test logs:
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-1641/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 2 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12679376 - PreCommit-HIVE-TRUNK-Build
> statistics update can fail due to long paths
> --------------------------------------------
>
> Key: HIVE-8735
> URL: https://issues.apache.org/jira/browse/HIVE-8735
> Project: Hive
> Issue Type: Bug
> Reporter: Sergey Shelukhin
> Assignee: Sergey Shelukhin
> Attachments: HIVE-8735.patch
>
>
> {noformat}
> 2014-11-04 01:34:38,610 ERROR jdbc.JDBCStatsPublisher (JDBCStatsPublisher.java:publishStat(198)) - Error during publishing statistics.
> java.sql.SQLDataException: A truncation error was encountered trying to shrink VARCHAR 'pfile:/grid/0/jenkins/workspace/UT-hive-champlain-common/sub&' to length 255.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.ConnectionChild.handleException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeStatement(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeLargeUpdate(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeUpdate(Unknown Source)
> 	at org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher$2.run(JDBCStatsPublisher.java:147)
> 	at org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher$2.run(JDBCStatsPublisher.java:144)
> 	at org.apache.hadoop.hive.ql.exec.Utilities.executeWithRetry(Utilities.java:2910)
> 	at org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher.publishStat(JDBCStatsPublisher.java:160)
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.publishStats(FileSinkOperator.java:1153)
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:992)
> 	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:598)
> 	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
> 	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
> 	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
> 	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:205)
> 	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
> 	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> 	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> 	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:722)
> Caused by: java.sql.SQLException: A truncation error was encountered trying to shrink VARCHAR 'pfile:/grid/0/jenkins/workspace/UT-hive-champlain-common/sub&' to length 255.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 	... 31 more
> Caused by: ERROR 22001: A truncation error was encountered trying to shrink VARCHAR 'pfile:/grid/0/jenkins/workspace/UT-hive-champlain-common/sub&' to length 255.
> 	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
> 	at org.apache.derby.iapi.types.SQLChar.hasNonBlankChars(Unknown Source)
> 	at org.apache.derby.iapi.types.SQLVarchar.normalize(Unknown Source)
> 	at org.apache.derby.iapi.types.SQLVarchar.normalize(Unknown Source)
> 	at org.apache.derby.iapi.types.DataTypeDescriptor.normalize(Unknown Source)
> 	at org.apache.derby.impl.sql.execute.NormalizeResultSet.normalizeColumn(Unknown Source)
> 	at org.apache.derby.impl.sql.execute.NormalizeResultSet.normalizeRow(Unknown Source)
> 	at org.apache.derby.impl.sql.execute.NormalizeResultSet.getNextRowCore(Unknown Source)
> 	at org.apache.derby.impl.sql.execute.DMLWriteResultSet.getNextRowCore(Unknown Source)
> 	at org.apache.derby.impl.sql.execute.InsertResultSet.open(Unknown Source)
> 	at org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt(Unknown Source)
> 	at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
> 	... 25 more
> {noformat}
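The stack trace boils down to Derby refusing to store a stats key longer than the 255-character VARCHAR column used by JDBCStatsPublisher, which happens when the file sink's path-based key is very long. A minimal sketch reproducing the same ERROR 22001 against an embedded Derby database (the table and column names below are illustrative, not Hive's actual stats schema):
{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class DerbyTruncationRepro {
    public static void main(String[] args) throws SQLException {
        // In-memory Derby database; needs the Derby embedded driver on the classpath.
        Connection conn =
            DriverManager.getConnection("jdbc:derby:memory:statsdb;create=true");
        // Hypothetical stand-in for the JDBC stats table: key column capped at 255 chars.
        conn.createStatement().executeUpdate(
            "CREATE TABLE STATS_DEMO (ID VARCHAR(255), ROW_COUNT BIGINT)");

        // Build a key longer than 255 characters, similar to a deeply nested job path.
        StringBuilder key = new StringBuilder("pfile:/grid/0/jenkins/workspace/");
        while (key.length() <= 255) {
            key.append("sub/");
        }

        PreparedStatement ps = conn.prepareStatement(
            "INSERT INTO STATS_DEMO (ID, ROW_COUNT) VALUES (?, ?)");
        ps.setString(1, key.toString());
        ps.setLong(2, 42L);
        try {
            ps.executeUpdate(); // Derby will not silently truncate VARCHAR data
        } catch (SQLException e) {
            // Prints SQLSTATE 22001: "A truncation error was encountered trying to
            // shrink VARCHAR '...' to length 255." - the same failure seen above.
            System.out.println(e.getSQLState() + ": " + e.getMessage());
        }
    }
}
{code}
Presumably the fix has to either widen the column or shorten over-long keys (for example by hashing the prefix) before they are written; see the attached HIVE-8735.patch for the approach actually taken.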
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)