Hello all,
I am trying to use Sqoop to import data from SQL Server into Hive.
When I execute the sqoop-import command, the import task completes
and I can see the complete data on HDFS (under
/user/hive/warehouse/table_name_dir),
but when I execute "SHOW TABLES" in the Hive CLI, I am not able to see the table.
By the way, I tried this on CDH3 Hive.
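For context, here is a sketch of the kind of command involved (the connection string, credentials, and table names are placeholders, and it is an assumption that the poster's invocation looked like this). One common cause of this exact symptom is omitting --hive-import: Sqoop then writes files to HDFS but never registers the table in the Hive metastore, so SHOW TABLES comes up empty:

```
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=mydb" \
  --username myuser -P \
  --table my_table \
  --hive-import \
  --hive-table my_table
```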
On Feb 1, 2012 10:02 PM, "Chris Shain" wrote:
I've tried it. It seems to work fine, but with ODBC, you still need to send
SQL commands to the server, and Hive SQL is incomplete and non-ansi
compliant in many ways. This means that an application that uses ANSI SQL
will not always generate Hive friendly queries.
They do have an Excel connector.
I haven't spent much time working with their ODBC driver, but I've had no
problems with it on our CDH3u2 Hive server so far.
Functionality with SSIS and BIDS is my primary concern, so we can cut out FTP
operations on both ends.
On Feb 1, 2012, at 9:41 PM, "John Omernik" [mailto:j...@omernik.c] wrote:
Any reason you want to use ODBC and not Thrift? Hive supports the
Thrift protocol. There are Thrift libraries for C#, and you can easily
integrate them into your project for direct access to Hive from your C# code.
On Wed, Feb 1, 2012 at 6:40 PM, John Omernik wrote:
I see that, but will that hive ODBC driver work with a standard hive
install, or will it be limited to Microsoft's cloud version of Hadoop/Hive?
Anyone tried the driver?
Is it possible to escape '.' in get_json_object?
For example: {"a.b": "test"}
get_json_object(json, '$.a.b') will return NULL because it's looking for a
nested object.
Something like this would be nice: get_json_object(json, '$.a\\.b')
Beyond changing how the JSON object is keyed, is there anything else I can do?
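A quick Python illustration of why the dotted key is ambiguous. This mimics the path-splitting behavior, not Hive's actual JSONPath parser, and the sample document is made up:

```python
import json

# Hypothetical document with both a literal "a.b" key and a nested a -> b path.
doc = '{"a.b": "test", "a": {"b": "nested"}}'
obj = json.loads(doc)

# A JSONPath-style lookup splits '$.a.b' on '.', so it walks obj["a"]["b"]
# and finds the nested value, never the literal "a.b" key.
cur = obj
for part in "a.b".split("."):
    cur = cur.get(part) if isinstance(cur, dict) else None
print(cur)  # -> nested

# Reading the literal key directly sidesteps the split entirely.
print(obj["a.b"])  # -> test
```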
I have resolved this, so I'll share what the issue was.
I had set HIVE_AUX_JARS_PATH in my hive-env.sh as
HIVE_AUX_JARS_PATH=$HIVE_AUX_JARS_PATH,$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar
The empty HIVE_AUX_JARS_PATH was causing the exception.
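One defensive pattern (a sketch, not the original hive-env.sh; the jar names are placeholders) is to prepend the old value only when it is non-empty, so the list never starts with a stray comma:

```shell
# Hypothetical hive-env.sh fragment: build the aux-jars list safely.
HIVE_HOME=${HIVE_HOME:-/opt/hive}
EXTRA_JARS="$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar"

# Only keep the leading "$HIVE_AUX_JARS_PATH," prefix when it has a value.
if [ -n "$HIVE_AUX_JARS_PATH" ]; then
    HIVE_AUX_JARS_PATH="$HIVE_AUX_JARS_PATH,$EXTRA_JARS"
else
    HIVE_AUX_JARS_PATH="$EXTRA_JARS"
fi
export HIVE_AUX_JARS_PATH
```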
The Hive driver that Microsoft will be releasing is ODBC, so you should be able
to interact with Hive just like you would with any other relational database.
From: John Omernik [mailto:j...@omernik.com]
Sent: Wednesday, February 01, 2012 3:22 PM
To: user@hive.apache.org
Subject: Hive ODBC - Micro
Thanks Mark,
I ended up going the custom-reducer route. I will try out the query you
sent.
Regards,
--
Rohan Monga
Does anyone know if the driver Microsoft is talking about with their Azure-based
Hadoop/Hive setup would work for connecting Windows applications
(Excel, .NET web apps, etc.) to Apache Hive running on Unix? I'm looking for a
way to connect .NET web apps to Hive for some process-flow upgrades.
Thanks!
Rohan,
You could do it one of the following ways:
1) Write a UDAF that does the avg(f2 - avg_f2) computation.
2) Write a custom reducer that does the avg(f2 - avg_f2) computation.
3) Do it with multiple passes over the data. Something like this (untested):
select
table.f1,
avg_table.avg_f2,
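Since the query above is cut off, here is a minimal Python sketch of option 3's two-pass idea. It assumes avg_f2 is the overall average of f2 and the grouping is by f1; both are assumptions, as the original query is truncated:

```python
from collections import defaultdict

# Hypothetical rows of (f1, f2); the original table layout is not shown.
rows = [("a", 1.0), ("a", 3.0), ("b", 5.0), ("b", 7.0)]

# Pass 1: overall average of f2.
avg_f2 = sum(f2 for _, f2 in rows) / len(rows)

# Pass 2: per-f1 average of the deviation (f2 - avg_f2).
groups = defaultdict(list)
for f1, f2 in rows:
    groups[f1].append(f2 - avg_f2)
result = {f1: sum(devs) / len(devs) for f1, devs in groups.items()}
print(result)  # -> {'a': -2.0, 'b': 2.0}
```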
Another note:
Once the job starts, the max slots are assigned and filled. As maps
complete, they are not reassigned. This continues until it bottoms
out at 1 map per node.
Manually setting mapred.max.maps.per.node high does not change this
behavior.
Thanks,
Clint
From: Clint
We only allow anonymous connections via JDBC from a specific host designated
to run application jobs against Hive.
For end-user connections, we use Cloudera's Hue interface, which has specific
user ID/password-level authentication. It's not a bad web-based client tool at
all. Not as robust
Yes, you are right! Toad (at least my version) does not give an option to
specify a user/password.
Is there any other way (maybe using other SQL editors) that we can enforce
user ID/password authentication on Hive? Also, how do we set up users on Hive?
Regards,
Shantian
Andrew,
This might come in handy:
http://www.congiu.com/node/7
Mark Grover, Business Intelligence Analyst
OANDA Corporation
www: oanda.com www: fxtrade.com
e: mgro...@oanda.com
"Best Trading Platform" - World Finance's Forex Awards 2009.
"The One to Watch" - Treasury Today's Adam Smith Award
Hive list,
I am facing a unique situation here where using Hive (0.7.1) with a remote,
external metastore (PostgreSQL) limits the number of map slots per node to 1
(out of 25 slots available). Other map jobs successfully utilize all
available slots; only Hive jobs are limited across the
Hi,
I installed Hadoop 0.23.0, which works.
The version of my Hive is 0.8.1. A query like 'select * from tablename'
works, but an exception is thrown when executing a query like 'select col1
from tablename'.
2012-02-01 16:32:20,296 WARN mapreduce.JobSubmitter
(JobSubmitter.java:copyA
How many NameNode handlers (dfs.namenode.handler.count) have you defined for
your cluster?
- Alex
--
Alexander Lorenz
http://mapredit.blogspot.com
On Feb 1, 2012, at 12:25 PM, Xiaobin She wrote:
>
> hi Alex,
>
> I'm using jre 1.6.0_24
>
> with hadoop 0.20.0
> hive 0.80
>
> thx
>
>
> 2012/
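For reference, that handler count is configured in hdfs-site.xml; a minimal sketch (the value 10 is the common default, not a recommendation from this thread):

```
<!-- hdfs-site.xml: number of NameNode server threads handling RPCs -->
<property>
  <name>dfs.namenode.handler.count</name>
  <value>10</value>
</property>
```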
hi Alex,
I'm using jre 1.6.0_24
with hadoop 0.20.0
hive 0.80
thx
2012/2/1 alo alt
> Hi,
>
> + hdfs-user (bcc'd)
>
> which JRE version do you use?
>
> - Alex
>
> --
> Alexander Lorenz
> http://mapredit.blogspot.com
>
> On Feb 1, 2012, at 8:16 AM, Xiaobin She wrote:
>
> > hi ,
> >
> >
> > I'm using
Hi,
+ hdfs-user (bcc'd)
which JRE version do you use?
- Alex
--
Alexander Lorenz
http://mapredit.blogspot.com
On Feb 1, 2012, at 8:16 AM, Xiaobin She wrote:
> hi ,
>
>
> I'm using hive to do some log analysis, and I have encountered a problem.
>
> My cluster has 3 nodes, one for NameNode/Job