I'm hoping one or more of the Apache Hive experts monitoring this list can
help me debug an issue with the DECIMAL type in my external table that has
me stumped. Here's the context:

--- The Context ---

Version: Hive 2.1.1 (from CDH6)

I'm mapping a Hive external table to data stored in a backend database
(Oracle NoSQL Database). The mapping has worked for a number of years,
until we decided to support the Hive DECIMAL type. The create table command
being used looks like:

hive> CREATE EXTERNAL TABLE IF NOT EXISTS vehicleTable (
          TYPE STRING, MAKE STRING, MODEL STRING, CLASS STRING,
          COLOR STRING, PRICE DOUBLE, COUNT INT,
          DEALERID DECIMAL(8,3), DELIVERED TIMESTAMP)
      COMMENT 'Hive vehicleTable::KVStore vehicleTable'
      STORED BY 'oracle.kv.hadoop.hive.table.TableStorageHandler'
      TBLPROPERTIES (
          "oracle.kv.kvstore" = "mystore",
          "oracle.kv.hosts" = "<mydbhost1>:5000,<mydbhost2>:5000,<mydbhost3>:5000",
          "oracle.kv.tableName" = "vehicleTable");
OK
Time taken: 2.051 seconds

NOTE: In the past, we supported all of the data types shown above except
DECIMAL and TIMESTAMP. We added TIMESTAMP first and had no problems. As
described below, problems only occurred when the DECIMAL type was added to
the external table. And we found that it didn't matter whether we used the
default precision and scale (that is, "DEALERID DECIMAL") or specified them
explicitly, as in the create table command above.
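
As a quick sanity check on the defaults, here's a minimal sketch (my own
throwaway class, assuming only the hive-serde jar on the classpath) showing
what Hive's type parser produces for DECIMAL with and without explicit
parameters:

    import java.util.ArrayList;

    import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;
    import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils;

    // Throwaway check (not from our storage handler): what does the type
    // parser produce for DECIMAL with and without explicit parameters?
    public class DecimalDefaultCheck {
      public static void main(String[] args) {
        // An unparameterized "decimal" resolves to Hive's user default.
        ArrayList<TypeInfo> defaults =
            TypeInfoUtils.getTypeInfosFromTypeString("decimal");
        System.out.println(defaults.get(0).getTypeName()); // decimal(10,0)

        // An explicitly parameterized decimal keeps what was declared.
        ArrayList<TypeInfo> explicit =
            TypeInfoUtils.getTypeInfosFromTypeString("decimal(8,3)");
        System.out.println(explicit.get(0).getTypeName()); // decimal(8,3)
      }
    }

Neither form produces decimal(0,0), so the declared column type itself
looks fine either way.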

--- The Problem ---

When a simple query is executed against the external table described above,
the following error occurs:

hive> select * from vehicletable;
FAILED: SemanticException java.lang.IllegalArgumentException: Decimal
precision out of allowed range [1,38]

--- The Analysis ---

At first I thought that the numerical values stored in the backend database
were the problem. So I played around with a number of different precisions
and scales, always making sure the scale was less than the precision.
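
In case it helps, the validation logic can also be exercised directly;
here's a sketch (class name and the exact pairs are mine) of the kinds of
combinations I tried, all of which validate cleanly:

    import org.apache.hadoop.hive.serde2.typeinfo.HiveDecimalUtils;

    // Throwaway sketch: every (precision, scale) pair with
    // 1 <= precision <= 38 and scale <= precision passes validation,
    // which is what convinced me the declared types themselves were fine.
    public class ValidateDecimalParams {
      public static void main(String[] args) {
        int[][] tried = { {8, 3}, {10, 0}, {5, 2}, {38, 10} };
        for (int[] ps : tried) {
          HiveDecimalUtils.validateParameter(ps[0], ps[1]); // no exception
          System.out.println("decimal(" + ps[0] + "," + ps[1] + ") is valid");
        }
      }
    }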

When this had no effect on the result, I began to analyze the source code
of the classes referenced in the stack trace; specifically, the following
classes and methods:

Caused by: java.lang.IllegalArgumentException: Decimal precision out of allowed range [1,38]
    at org.apache.hadoop.hive.serde2.typeinfo.HiveDecimalUtils.validateParameter(HiveDecimalUtils.java:43)
    at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseType(TypeInfoUtils.java:460)
    at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseTypeInfos(TypeInfoUtils.java:329)
    at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils.getTypeInfosFromTypeString(TypeInfoUtils.java:818)
    at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.extractColumnInfo(LazySerDeParameters.java:160)
    at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.<init>(LazySerDeParameters.java:90)
    at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:116)
    at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:58)
    at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:531)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genConversionSelectOperator(SemanticAnalyzer.java:7262)
    ... 26 more

Upon analyzing the method HiveDecimalUtils.validateParameter, I found that
the exception above is thrown when the following condition is met:

    if (precision < 1 || precision > HiveDecimal.MAX_PRECISION) {
      throw new IllegalArgumentException("Decimal precision out of allowed range [1," +
          HiveDecimal.MAX_PRECISION + "]");
    }
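
For what it's worth, the failure can be reproduced in isolation by feeding
the parser the type string that shows up later in the log. A minimal sketch
(my own class, assuming only the hive-serde jar):

    import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils;

    // Standalone repro sketch: parsing "decimal(0,0)" drives precision
    // through HiveDecimalUtils.validateParameter, which throws because
    // 0 is outside the allowed range [1,38].
    public class DecimalParseRepro {
      public static void main(String[] args) {
        try {
          TypeInfoUtils.getTypeInfosFromTypeString("decimal(0,0)");
        } catch (IllegalArgumentException e) {
          // expected: Decimal precision out of allowed range [1,38]
          System.out.println(e.getMessage());
        }
      }
    }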

This is what made me experiment with the smaller precision values, thinking
that the problem was with the data stored in the backend database. When
none of these changes resulted in different behavior, and when further
analysis of the other classes in the stack trace indicated that Hive wasn't
even getting to the data stored in the backend, I started looking more
closely at the contents of the hive.log file for clues.

The log output shows that for some reason the precision and scale that were
specified in the create table command (e.g. decimal(8,3)) are changed to
decimal(0,0) before reaching the HiveDecimalUtils.validateParameter
method. So this explains why the exception is being thrown, but it doesn't
explain why those values were both changed to 0. To see this, look at the
following snippet of the log file [note: carriage returns were inserted
between log records only for readability]:

.....
2020-01-11T11:03:37,135  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Completed getting MetaData in Semantic Analysis

2020-01-11T11:03:37,138 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hive.log: DDL: struct vehicletable { string type, string make, string
model, string class, string color, double price, i32 count, decimal(8,3)
dealerid, timestamp delivered}
2020-01-11T11:03:37,202 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
lazy.LazySerDeParameters: oracle.kv.hadoop.hive.table.TableSerDeBase
initialized with: columnNames=[type, make, model, class, color, price,
count, dealerid, delivered] columnTypes=[string, string, string, string,
string, double, int, decimal(8,3), timestamp] separator=[[B@62d40e31]
nullstring=\N lastColumnTakesRest=false timestampFormats=null

2020-01-11T11:03:37,236 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Created Table Plan for vehicletable TS[0]

2020-01-11T11:03:37,236 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: RR before GB vehicletable{(type,type:
string)(make,make: string)(model,model: string)(class,class:
string)(color,color: string)(price,price: double)(count,count:
int)(dealerid,dealerid: decimal(0,0))(delivered,delivered:
timestamp)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE:
bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID:
struct<transactionid:bigint,bucketid:int,rowid:bigint>)}  after GB
vehicletable{(type,type: string)(make,make: string)(model,model:
string)(class,class: string)(color,color: string)(price,price:
double)(count,count: int)(dealerid,dealerid:
decimal(0,0))(delivered,delivered:
timestamp)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE:
bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID:
struct<transactionid:bigint,bucketid:int,rowid:bigint>)}

2020-01-11T11:03:37,237 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: tree: (tok_select (tok_selexpr tok_allcolref))
2020-01-11T11:03:37,237 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: genSelectPlan: input = vehicletable{(type,type:
string)(make,make: string)(model,model: string)(class,class:
string)(color,color: string)(price,price: double)(count,count:
int)(dealerid,dealerid: decimal(0,0))(delivered,delivered:
timestamp)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE:
bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID:
struct<transactionid:bigint,bucketid:int,rowid:bigint>)}  starRr = null
2020-01-11T11:03:37,238 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Created Select Plan row schema:
vehicletable{(type,_col0: string)(make,_col1: string)(model,_col2:
string)(class,_col3: string)(color,_col4: string)(price,_col5:
double)(count,_col6: int)(dealerid,_col7: decimal(0,0))(delivered,_col8:
timestamp)}
2020-01-11T11:03:37,238 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Created Select Plan for clause: insclause-0
.....

Note that the lines at 11:03:37,138 and 11:03:37,202 each show the non-zero
precision and scale values specified in the create table command; that is,
'decimal(8,3)'. But after the SemanticAnalyzer declares that it has
"Created Table Plan for vehicletable TS[0]", precision and scale have both
been changed to 0; that is, decimal(0,0).
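
One place a decimal(0,0) could conceivably come from, and I stress this is
only a guess on my part from reading the class: DecimalTypeInfo has a
no-argument constructor (there for serializers) that bypasses
validateParameter and leaves precision and scale at 0. If anything on the
query path materializes the column's type info that way instead of through
the parameterized constructor or TypeInfoFactory, the qualified name would
come out exactly as the log shows:

    import org.apache.hadoop.hive.serde2.typeinfo.DecimalTypeInfo;

    // Hypothetical illustration (not taken from our storage handler's
    // code): the no-arg constructor skips validation, so precision and
    // scale default to 0 and the type name renders as decimal(0,0).
    public class DecimalZeroZero {
      public static void main(String[] args) {
        System.out.println(new DecimalTypeInfo().getTypeName());     // decimal(0,0)
        System.out.println(new DecimalTypeInfo(8, 3).getTypeName()); // decimal(8,3)
      }
    }

I haven't confirmed this is the actual path taken here, so I'd welcome
correction from anyone who knows the SemanticAnalyzer internals.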


So this is where I'm stumped and I need help from the experts. I can't
figure out where or why the precision and scale are changed to zero. It
seems clear that this is what causes HiveDecimalUtils.validateParameter to
throw the exception, but I just can't figure out why those values get
changed. For anyone who might be interested in more detail, I've included a
more complete snippet of the log file below.

Can anyone help me figure out what may be going on? Any hints or advice on
how to further debug this issue?

Any help/advice will be greatly appreciated.

Thanks,
Brian



--- More Complete Log Output (in case anyone's interested) ---

.....
.....

[snip - at this point, the table has been created successfully]


2020-01-11T11:03:36,923  INFO [main] conf.HiveConf: Using the default value
passed in for log id: 1b1f4d17-52d5-4dba-bbdb-3b126fb332e3
2020-01-11T11:03:36,923  INFO [main] session.SessionState: Updating thread
name to 1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main
2020-01-11T11:03:36,926 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
log.PerfLogger: <PERFLOG method=waitCompile
from=org.apache.hadoop.hive.ql.Driver>
2020-01-11T11:03:36,926 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
lock.CompileLock: Acquired the compile lock.
2020-01-11T11:03:36,926 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
log.PerfLogger: </PERFLOG method=waitCompile start=1578765816926
end=1578765816926 duration=0 from=org.apache.hadoop.hive.ql.Driver>
2020-01-11T11:03:36,926 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
log.PerfLogger: <PERFLOG method=compile
from=org.apache.hadoop.hive.ql.Driver>
2020-01-11T11:03:36,926 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
conf.VariableSubstitution: Substitution is on: select * from vehicletable
2020-01-11T11:03:36,932  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ql.Driver: Compiling
command(queryId=root_20200111110336_324186cc-90c0-4bb1-b155-511a945fe194):
select * from vehicletable
2020-01-11T11:03:36,946 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
2020-01-11T11:03:36,946 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.ParseDriver: Parsing command: select * from vehicletable
2020-01-11T11:03:36,948 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.ParseDriver: Parse Completed
2020-01-11T11:03:36,948 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
log.PerfLogger: </PERFLOG method=parse start=1578765816946
end=1578765816948 duration=2 from=org.apache.hadoop.hive.ql.Driver>
2020-01-11T11:03:36,948 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
log.PerfLogger: <PERFLOG method=semanticAnalyze
from=org.apache.hadoop.hive.ql.Driver>
2020-01-11T11:03:36,948  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hive.metastore: Mestastore configuration hive.metastore.filter.hook changed
from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2020-01-11T11:03:36,949 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
metadata.Hive: Creating new db. db =
org.apache.hadoop.hive.ql.metadata.Hive@528c8c1, needsRefresh = false,
db.isCurrentUserOwner = true
2020-01-11T11:03:36,949 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
metadata.Hive: Closing current thread's connection to Hive Metastore.
2020-01-11T11:03:36,949  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hive.metastore: Closed a connection to metastore, current connections: 0
2020-01-11T11:03:36,949 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
metadata.Hive: Closing current thread's connection to Hive Metastore.
2020-01-11T11:03:36,950  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hive.metastore: HMS client filtering is enabled.
2020-01-11T11:03:36,950  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hive.metastore: Trying to connect to metastore with URI
thrift://<metastorenode>:9083
2020-01-11T11:03:36,950  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hive.metastore: Opened a connection to metastore, current connections: 1
2020-01-11T11:03:36,951  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hive.metastore: Connected to metastore.


2020-01-11T11:03:37,083  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Starting Semantic Analysis
2020-01-11T11:03:37,087  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Completed phase 1 of Semantic Analysis
2020-01-11T11:03:37,087  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Get metadata for source tables
2020-01-11T11:03:37,104  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Get metadata for subqueries
2020-01-11T11:03:37,104  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Get metadata for destination tables
2020-01-11T11:03:37,108 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.Client: The ping interval is 60000 ms.
2020-01-11T11:03:37,108 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.Client: Connecting to <metastorenode>/192.168.225.171:8020
2020-01-11T11:03:37,108 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root:
starting, having connections 1
2020-01-11T11:03:37,109 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #8
org.apache.hadoop.hdfs.protocol.ClientProtocol.getServerDefaults
2020-01-11T11:03:37,110 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #8
2020-01-11T11:03:37,110 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getServerDefaults took 3ms
2020-01-11T11:03:37,124 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #9
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,125 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #9
2020-01-11T11:03:37,125 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
2020-01-11T11:03:37,127 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #10
org.apache.hadoop.hdfs.protocol.ClientProtocol.getEZForPath
2020-01-11T11:03:37,128 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #10
2020-01-11T11:03:37,128 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getEZForPath took 1ms
2020-01-11T11:03:37,133 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hdfs.DFSClient:
/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1:
masked={ masked: rwx------, unmasked: rwx------ }
2020-01-11T11:03:37,133 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #11 org.apache.hadoop.hdfs.protocol.ClientProtocol.mkdirs
2020-01-11T11:03:37,134 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #11
2020-01-11T11:03:37,134 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: mkdirs took 1ms
2020-01-11T11:03:37,134 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #12
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,135 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #12
2020-01-11T11:03:37,135 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
2020-01-11T11:03:37,135  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ql.Context: New scratch dir is
hdfs://<metastorenode>:8020/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1
2020-01-11T11:03:37,135  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Completed getting MetaData in Semantic Analysis
2020-01-11T11:03:37,138 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hive.log: DDL: struct vehicletable { string type, string make, string
model, string class, string color, double price, i32 count, decimal(8,3)
dealerid, timestamp delivered}
2020-01-11T11:03:37,202 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
lazy.LazySerDeParameters: oracle.kv.hadoop.hive.table.TableSerDeBase
initialized with: columnNames=[type, make, model, class, color, price,
count, dealerid, delivered] columnTypes=[string, string, string, string,
string, double, int, decimal(8,3), timestamp] separator=[[B@62d40e31]
nullstring=\N lastColumnTakesRest=false timestampFormats=null
2020-01-11T11:03:37,236 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Created Table Plan for vehicletable TS[0]
2020-01-11T11:03:37,236 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: RR before GB vehicletable{(type,type:
string)(make,make: string)(model,model: string)(class,class:
string)(color,color: string)(price,price: double)(count,count:
int)(dealerid,dealerid: decimal(0,0))(delivered,delivered:
timestamp)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE:
bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID:
struct<transactionid:bigint,bucketid:int,rowid:bigint>)}  after GB
vehicletable{(type,type: string)(make,make: string)(model,model:
string)(class,class: string)(color,color: string)(price,price:
double)(count,count: int)(dealerid,dealerid:
decimal(0,0))(delivered,delivered:
timestamp)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE:
bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID:
struct<transactionid:bigint,bucketid:int,rowid:bigint>)}
2020-01-11T11:03:37,237 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: tree: (tok_select (tok_selexpr tok_allcolref))
2020-01-11T11:03:37,237 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: genSelectPlan: input = vehicletable{(type,type:
string)(make,make: string)(model,model: string)(class,class:
string)(color,color: string)(price,price: double)(count,count:
int)(dealerid,dealerid: decimal(0,0))(delivered,delivered:
timestamp)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE:
bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID:
struct<transactionid:bigint,bucketid:int,rowid:bigint>)}  starRr = null
2020-01-11T11:03:37,238 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Created Select Plan row schema:
vehicletable{(type,_col0: string)(make,_col1: string)(model,_col2:
string)(class,_col3: string)(color,_col4: string)(price,_col5:
double)(count,_col6: int)(dealerid,_col7: decimal(0,0))(delivered,_col8:
timestamp)}
2020-01-11T11:03:37,238 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
parse.SemanticAnalyzer: Created Select Plan for clause: insclause-0
2020-01-11T11:03:37,239 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ql.Context: Created staging dir =
hdfs://<metastorenode>:8020/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1/-mr-10000/.hive-staging_hive_2020-01-11_11-03-36_946_7711330475346157458-1
for path =
hdfs://<metastorenode>:8020/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1/-mr-10000
2020-01-11T11:03:37,240  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
common.FileUtils: Creating directory if it doesn't exist:
hdfs://<metastorenode>:8020/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1/-mr-10000/.hive-staging_hive_2020-01-11_11-03-36_946_7711330475346157458-1
2020-01-11T11:03:37,240 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #13
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,240 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #13
2020-01-11T11:03:37,241 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
2020-01-11T11:03:37,241 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #14
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,241 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #14
2020-01-11T11:03:37,241 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 0ms
2020-01-11T11:03:37,242 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #15
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,242 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #15
2020-01-11T11:03:37,242 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
2020-01-11T11:03:37,243 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #16
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,243 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #16
2020-01-11T11:03:37,243 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 0ms
2020-01-11T11:03:37,244 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
hdfs.DFSClient:
/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1/-mr-10000/.hive-staging_hive_2020-01-11_11-03-36_946_7711330475346157458-1:
masked={ masked: rwxr-xr-x, unmasked: rwxrwxrwx }
2020-01-11T11:03:37,244 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #17 org.apache.hadoop.hdfs.protocol.ClientProtocol.mkdirs
2020-01-11T11:03:37,244 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #17
2020-01-11T11:03:37,245 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: mkdirs took 1ms
2020-01-11T11:03:37,245 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #18
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,246 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #18
2020-01-11T11:03:37,246 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
2020-01-11T11:03:37,257 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
shims.HdfsUtils:
{-chgrp,-R,supergroup,hdfs://<metastorenode>:8020/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1/-mr-10000}
2020-01-11T11:03:37,322 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #19
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,322 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #19
2020-01-11T11:03:37,322 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
2020-01-11T11:03:37,326 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #20
org.apache.hadoop.hdfs.protocol.ClientProtocol.getListing
2020-01-11T11:03:37,327 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #20
2020-01-11T11:03:37,327 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getListing took 1ms
2020-01-11T11:03:37,331 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #21
org.apache.hadoop.hdfs.protocol.ClientProtocol.getListing
2020-01-11T11:03:37,332 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #21
2020-01-11T11:03:37,332 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getListing took 1ms
2020-01-11T11:03:37,332 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
shims.HdfsUtils: Return value is :0
2020-01-11T11:03:37,332 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
shims.HdfsUtils:
{-chmod,-R,700,hdfs://<metastorenode>:8020/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1/-mr-10000}
2020-01-11T11:03:37,334 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #22
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,335 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #22
2020-01-11T11:03:37,335 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
2020-01-11T11:03:37,337 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #23
org.apache.hadoop.hdfs.protocol.ClientProtocol.setPermission
2020-01-11T11:03:37,337 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #23
2020-01-11T11:03:37,337 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: setPermission took 0ms
2020-01-11T11:03:37,339 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #24
org.apache.hadoop.hdfs.protocol.ClientProtocol.getListing
2020-01-11T11:03:37,340 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #24
2020-01-11T11:03:37,340 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getListing took 1ms
2020-01-11T11:03:37,340 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #25
org.apache.hadoop.hdfs.protocol.ClientProtocol.setPermission
2020-01-11T11:03:37,341 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #25
2020-01-11T11:03:37,341 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: setPermission took 1ms
2020-01-11T11:03:37,341 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #26
org.apache.hadoop.hdfs.protocol.ClientProtocol.getListing
2020-01-11T11:03:37,341 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #26
2020-01-11T11:03:37,341 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getListing took 0ms
2020-01-11T11:03:37,341 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
shims.HdfsUtils: Return value is :0
2020-01-11T11:03:37,342 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #27
org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2020-01-11T11:03:37,344 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #27
2020-01-11T11:03:37,344 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
2020-01-11T11:03:37,361 ERROR [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ql.Driver: FAILED: SemanticException java.lang.IllegalArgumentException:
Decimal precision out of allowed range [1,38]
org.apache.hadoop.hive.ql.parse.SemanticException: java.lang.IllegalArgumentException: Decimal precision out of allowed range [1,38]
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genConversionSelectOperator(SemanticAnalyzer.java:7265)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFileSinkPlan(SemanticAnalyzer.java:7041)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:9625)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:9497)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:10381)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:10259)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10942)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10953)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10638)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:250)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:603)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1425)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1493)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1339)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1328)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:409)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:836)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:772)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:699)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:313)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:227)
Caused by: java.lang.IllegalArgumentException: Decimal precision out of allowed range [1,38]
    at org.apache.hadoop.hive.serde2.typeinfo.HiveDecimalUtils.validateParameter(HiveDecimalUtils.java:43)
    at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseType(TypeInfoUtils.java:460)
    at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseTypeInfos(TypeInfoUtils.java:329)
    at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils.getTypeInfosFromTypeString(TypeInfoUtils.java:818)
    at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.extractColumnInfo(LazySerDeParameters.java:160)
    at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.<init>(LazySerDeParameters.java:90)
    at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:116)
    at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:58)
    at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:531)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genConversionSelectOperator(SemanticAnalyzer.java:7262)
    ... 26 more

2020-01-11T11:03:37,361 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
log.PerfLogger: </PERFLOG method=compile start=1578765816926
end=1578765817361 duration=435 from=org.apache.hadoop.hive.ql.Driver>
2020-01-11T11:03:37,361  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
metadata.Hive: Dumping metastore api call timing information for :
compilation phase
2020-01-11T11:03:37,361 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
metadata.Hive: Total time spent in each metastore function (ms):
{getTable_(String, String, )=6, flushCache_()=132}
2020-01-11T11:03:37,361  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ql.Driver: Completed compiling
command(queryId=root_20200111110336_324186cc-90c0-4bb1-b155-511a945fe194);
Time taken: 0.435 seconds
2020-01-11T11:03:37,362 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
log.PerfLogger: <PERFLOG method=releaseLocks
from=org.apache.hadoop.hive.ql.Driver>
2020-01-11T11:03:37,362 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
log.PerfLogger: </PERFLOG method=releaseLocks start=1578765817362
end=1578765817362 duration=0 from=org.apache.hadoop.hive.ql.Driver>
2020-01-11T11:03:37,362 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ql.Context: Deleting result dir:
hdfs://<metastorenode>:8020/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1/-mr-10000
2020-01-11T11:03:37,364 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #28 org.apache.hadoop.hdfs.protocol.ClientProtocol.delete
2020-01-11T11:03:37,365 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #28
2020-01-11T11:03:37,365 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: delete took 1ms
2020-01-11T11:03:37,366 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ql.Context: Deleting scratch dir:
hdfs://<metastorenode>:8020/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1/-mr-10000/.hive-staging_hive_2020-01-11_11-03-36_946_7711330475346157458-1
2020-01-11T11:03:37,367 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #29 org.apache.hadoop.hdfs.protocol.ClientProtocol.delete
2020-01-11T11:03:37,367 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #29
2020-01-11T11:03:37,367 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: delete took 0ms
2020-01-11T11:03:37,368 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ql.Context: Deleting scratch dir:
hdfs://<metastorenode>:8020/tmp/hive/root/1b1f4d17-52d5-4dba-bbdb-3b126fb332e3/hive_2020-01-11_11-03-36_946_7711330475346157458-1
2020-01-11T11:03:37,368 DEBUG [IPC Parameter Sending Thread #0] ipc.Client:
IPC Client (1747053097) connection to <metastorenode>/192.168.225.171:8020
from root sending #30 org.apache.hadoop.hdfs.protocol.ClientProtocol.delete
2020-01-11T11:03:37,368 DEBUG [IPC Client (1747053097) connection to
<metastorenode>/192.168.225.171:8020 from root] ipc.Client: IPC Client
(1747053097) connection to <metastorenode>/192.168.225.171:8020 from root
got value #30
2020-01-11T11:03:37,368 DEBUG [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
ipc.ProtobufRpcEngine: Call: delete took 0ms
2020-01-11T11:03:37,369  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
conf.HiveConf: Using the default value passed in for log id:
1b1f4d17-52d5-4dba-bbdb-3b126fb332e3
2020-01-11T11:03:37,369  INFO [1b1f4d17-52d5-4dba-bbdb-3b126fb332e3 main]
session.SessionState: Resetting thread name to  main
