[ https://issues.apache.org/jira/browse/SQOOP-3010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Venkat Ranganathan resolved SQOOP-3010.
---------------------------------------
Resolution: Fixed
Fix Version/s: 1.4.7
Thanks [~sowmyaramesh] for your contribution.
> Sqoop should not allow --as-parquetfile with HCatalog jobs or when a Hive import with --create-hive-table is used
> ------------------------------------------------------------------------------------------------------------------
>
> Key: SQOOP-3010
> URL: https://issues.apache.org/jira/browse/SQOOP-3010
> Project: Sqoop
> Issue Type: Bug
> Affects Versions: 1.4.6
> Reporter: Sowmya Ramesh
> Assignee: Sowmya Ramesh
> Fix For: 1.4.7
>
>
> sqoop import ... --create-hcatalog-table --hcatalog-table --as-parquetfile
> {noformat}
> Error: java.lang.RuntimeException: Should never be used
> 	at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getRecordWriter(MapredParquetOutputFormat.java:76)
> 	at org.apache.hive.hcatalog.mapreduce.FileOutputFormatContainer.getRecordWriter(FileOutputFormatContainer.java:103)
> {noformat}
> This combination should not run at all: Sqoop should report a validation error up front, as it already does for both
> --as-sequencefile and --as-avrodatafile. Instead the job is submitted and fails later with the RuntimeException above.
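> A fail-fast check on the parsed options is one way to surface this as a validation error instead of a late RuntimeException. The sketch below is illustrative only and does not use Sqoop's real option classes; ImportConfig, FileLayout, and validateParquetCompatibility are hypothetical names standing in for the parsed option state and the validation hook.
> {noformat}
> // Hypothetical stand-in for Sqoop's parsed import options; not the real SqoopOptions API.
> public class ParquetOptionCheck {
>
>     enum FileLayout { TEXT, SEQUENCE, AVRO, PARQUET }
>
>     static class ImportConfig {
>         FileLayout layout = FileLayout.TEXT;
>         String hcatalogTable;        // non-null when --hcatalog-table is given
>         boolean hiveImport;          // --hive-import
>         boolean createHiveTable;     // --create-hive-table
>     }
>
>     // Rejects unsupported combinations before any MapReduce job is submitted,
>     // mirroring the existing checks for --as-sequencefile and --as-avrodatafile.
>     static void validateParquetCompatibility(ImportConfig cfg) {
>         if (cfg.layout != FileLayout.PARQUET) {
>             return;                  // nothing to check for other file layouts
>         }
>         if (cfg.hcatalogTable != null) {
>             throw new IllegalArgumentException(
>                 "--as-parquetfile is not supported with HCatalog jobs.");
>         }
>         if (cfg.hiveImport && cfg.createHiveTable) {
>             throw new IllegalArgumentException(
>                 "--as-parquetfile is not supported with --create-hive-table.");
>         }
>     }
>
>     public static void main(String[] args) {
>         ImportConfig cfg = new ImportConfig();
>         cfg.layout = FileLayout.PARQUET;
>         cfg.hcatalogTable = "txn";          // simulates --hcatalog-table txn with --as-parquetfile
>         validateParquetCompatibility(cfg);  // throws with a clear message, no job is started
>     }
> }
> {noformat}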