Hi Folks,
Any suggestions or thoughts on the question/issue posted below?
Regards,
Srinivas
On 2018/09/19 10:47:38, Srinivas M wrote:
> Hi
>
> We have a java application which writes parquet files. We are using the
> Parquet 1.9.0 API to write the Timestamp data ...
Hi

We have a Java application which writes Parquet files. We are using the
Parquet 1.9.0 API to write Timestamp data. Since there are
incompatibilities between the Parquet and Hive representations of the
Timestamp type, we have tried to work around this by writing the
Parquet Timestamp data ...
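The original message is cut off at this point, so the exact workaround is not shown. One common approach (not necessarily the one used above) is to encode each value in the 12-byte INT96 layout that Hive reads for timestamps: 8 little-endian bytes of nanoseconds within the day followed by 4 little-endian bytes of the Julian day number. A minimal sketch, assuming parquet-mr 1.9.0 on the classpath and a writer that accepts org.apache.parquet.io.api.Binary values (class and method names other than Binary are purely illustrative):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.sql.Timestamp;
import java.time.LocalDate;
import java.time.temporal.JulianFields;

import org.apache.parquet.io.api.Binary;

public class Int96TimestampEncoder {

    private static final long NANOS_PER_MILLI = 1_000_000L;
    private static final long MILLIS_PER_DAY = 24L * 60 * 60 * 1000;

    // Encodes a java.sql.Timestamp in the 12-byte INT96 layout Hive reads:
    // nanos-of-day (8 bytes, little endian) followed by Julian day (4 bytes).
    public static Binary toInt96(Timestamp ts) {
        long millis = ts.getTime();
        long days = Math.floorDiv(millis, MILLIS_PER_DAY);
        long millisOfDay = Math.floorMod(millis, MILLIS_PER_DAY);
        // getNanos() returns the full nanos of the second, so only the
        // sub-millisecond part is added on top of the milliseconds above.
        long nanosOfDay = millisOfDay * NANOS_PER_MILLI
                + (ts.getNanos() % 1_000_000);
        int julianDay = (int) LocalDate.ofEpochDay(days)
                .getLong(JulianFields.JULIAN_DAY);

        ByteBuffer buf = ByteBuffer.allocate(12).order(ByteOrder.LITTLE_ENDIAN);
        buf.putLong(nanosOfDay);
        buf.putInt(julianDay);
        return Binary.fromConstantByteArray(buf.array());
    }
}

The corresponding field would be declared as int96 in the Parquet schema. Note that time zone handling of INT96 timestamps differs between engines (for example Hive vs. Impala), so the conversion above may need adjusting for the expected time zone semantics.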
> > ... your efforts in fixing the existing JDBC driver, so that you don't
> > end up duplicating a lot of work.
> >
> > Regarding SQL standards compliance, Hive 1.2.1 has made a lot of
> > progress in that aspect. Just like any other SQL database, ...
>
> ... the preferred API for users. There are many features
> implemented in those layers, including the security and also the high
> availability features.
> Incorrect use of the Thrift API can potentially lead to other issues,
> like memory leaks in HiveServer2.
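For reference, going through the JDBC layer that the reply above recommends can be as small as the following sketch. It assumes the Hive JDBC driver is on the classpath, HiveServer2 is listening on localhost:10000, and "my_table" is a placeholder table name:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {
    public static void main(String[] args) throws Exception {
        // Needed on older driver versions that do not self-register.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // HiveServer2 JDBC URL; host, port and database are assumptions.
        String url = "jdbc:hive2://localhost:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement();
             // "my_table" is a placeholder table name.
             ResultSet rs = stmt.executeQuery("SELECT * FROM my_table LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}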
Hi

I am trying to develop a custom application using the Thrift Hive client
interface to access Hive and read from and write into Hive tables. ODBC
and JDBC are not an option because of the inherent limitations of those
interfaces (i.e., unable to generate the HiveQL, etc.).
While using the Thrift
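The message is cut off here. To make the question concrete, below is a minimal sketch of what talking to HiveServer2 over its Thrift interface typically looks like. It assumes the Hive 1.x generated classes (org.apache.hive.service.cli.thrift.*; newer releases moved them to org.apache.hive.service.rpc.thrift), a server at localhost:10000, and hive.server2.authentication=NOSASL so a raw binary transport works; with the default SASL setup the transport has to be wrapped accordingly. The table name in the query is a placeholder, and fetching result rows is omitted for brevity:

import org.apache.hive.service.cli.thrift.TCLIService;
import org.apache.hive.service.cli.thrift.TCloseOperationReq;
import org.apache.hive.service.cli.thrift.TCloseSessionReq;
import org.apache.hive.service.cli.thrift.TExecuteStatementReq;
import org.apache.hive.service.cli.thrift.TExecuteStatementResp;
import org.apache.hive.service.cli.thrift.TOpenSessionReq;
import org.apache.hive.service.cli.thrift.TOpenSessionResp;
import org.apache.hive.service.cli.thrift.TOperationHandle;
import org.apache.hive.service.cli.thrift.TSessionHandle;
import org.apache.thrift.protocol.TBinaryProtocol;
import org.apache.thrift.transport.TSocket;
import org.apache.thrift.transport.TTransport;

public class HiveThriftExample {
    public static void main(String[] args) throws Exception {
        // Plain socket transport; only valid when HiveServer2 runs with NOSASL.
        TTransport transport = new TSocket("localhost", 10000);
        transport.open();
        TCLIService.Client client =
                new TCLIService.Client(new TBinaryProtocol(transport));

        // Open a session (credentials depend on the server's auth setup).
        TOpenSessionReq openReq = new TOpenSessionReq();
        openReq.setUsername("hive");
        openReq.setPassword("");
        TOpenSessionResp openResp = client.OpenSession(openReq);
        TSessionHandle session = openResp.getSessionHandle();

        // Execute a statement; "my_table" is a placeholder.
        // FetchResults would retrieve the rows; it is omitted here.
        TExecuteStatementReq execReq =
                new TExecuteStatementReq(session, "SELECT COUNT(*) FROM my_table");
        TExecuteStatementResp execResp = client.ExecuteStatement(execReq);
        TOperationHandle op = execResp.getOperationHandle();

        // Always close the operation and the session; leaking them is one of
        // the ways a Thrift client can cause resource build-up in HiveServer2.
        client.CloseOperation(new TCloseOperationReq(op));
        client.CloseSession(new TCloseSessionReq(session));
        transport.close();
    }
}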