[ https://issues.apache.org/jira/browse/HIVE-6914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14210071#comment-14210071 ]
Hive QA commented on HIVE-6914:
-------------------------------

{color:red}Overall{color}: -1 at least one tests failed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12681303/HIVE-6914.2.patch

{color:red}ERROR:{color} -1 due to 2 failed/errored test(s), 6687 tests executed

*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_parquet_complex_type_nested
org.apache.hadoop.hive.ql.io.parquet.TestParquetSerDe.testParquetHiveSerDe
{noformat}

Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/1773/testReport
Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/1773/console
Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-1773/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 2 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12681303 - PreCommit-HIVE-TRUNK-Build

> parquet-hive cannot write nested map (map value is map)
> -------------------------------------------------------
>
>                 Key: HIVE-6914
>                 URL: https://issues.apache.org/jira/browse/HIVE-6914
>             Project: Hive
>          Issue Type: Bug
>          Components: File Formats
>    Affects Versions: 0.12.0, 0.13.0
>            Reporter: Tongjie Chen
>              Labels: parquet, serialization
>         Attachments: HIVE-6914.1.patch, HIVE-6914.2.patch
>
>
> // table schema (identical for both plain text version and parquet version)
> hive> desc text_mmap;
> m map<string,map<string,string>>
> // sample nested map entry
> {"level1":{"level2_key1":"value1","level2_key2":"value2"}}
> The following query will fail:
> insert overwrite table parquet_mmap select * from text_mmap;
> Caused by: parquet.io.ParquetEncodingException: This should be an ArrayWritable or MapWritable: org.apache.hadoop.hive.ql.io.parquet.writable.BinaryWritable@f2f8106
> at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.writeData(DataWritableWriter.java:85)
> at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.writeArray(DataWritableWriter.java:118)
> at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.writeData(DataWritableWriter.java:80)
> at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.writeData(DataWritableWriter.java:82)
> at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.write(DataWritableWriter.java:55)
> at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriteSupport.write(DataWritableWriteSupport.java:59)
> at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriteSupport.write(DataWritableWriteSupport.java:31)
> at parquet.hadoop.InternalParquetRecordWriter.write(InternalParquetRecordWriter.java:115)
> at parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:81)
> at parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:37)
> at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.write(ParquetRecordWriterWrapper.java:77)
> at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.write(ParquetRecordWriterWrapper.java:90)
> at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:622)
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793)
> at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:87)
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793)
> at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:92)
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793)
> at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:540)
> ... 9 more

--
This message was sent by Atlassian JIRA (v6.3.4#6332)
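For reference, a minimal reproduction sketch of the scenario in the quoted report. The table and column names (text_mmap, parquet_mmap, m) and the failing INSERT are taken from the report; the storage clauses are assumptions, since the original DDL is not attached, and STORED AS PARQUET relies on the shorthand available in Hive 0.13.

{noformat}
-- Text-backed source table; the nested map column type is taken from the report.
CREATE TABLE text_mmap (m map<string,map<string,string>>);

-- Parquet-backed target table with the same schema.
-- STORED AS PARQUET is the Hive 0.13 shorthand (an assumption here); on 0.12
-- the explicit Parquet SerDe and input/output format clauses would be needed.
CREATE TABLE parquet_mmap (m map<string,map<string,string>>)
STORED AS PARQUET;

-- The statement quoted in the report; it fails with ParquetEncodingException
-- when the outer map's values are themselves maps.
INSERT OVERWRITE TABLE parquet_mmap SELECT * FROM text_mmap;
{noformat}

Judging from the stack trace, the inner map appears to reach DataWritableWriter.writeData wrapped as a BinaryWritable rather than an ArrayWritable/MapWritable, which is what raises the exception.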