Thanks a lot to Roberto Congiu and wil for your help.
The problem has been solved with your assistance.
I think I should read the wiki guide more carefully!
Thank you very much!
Best regards!
2011-03-01
Jianhua Wang
Hi,
You would need to add the file to the distributed cache so other machines can
access it.
http://wiki.apache.org/hadoop/Hive/GettingStarted#STREAMING
http://wiki.apache.org/hadoop/Hive/LanguageManual/Cli#Hive_Resources
hive> add file /home/pc/mypython.py;
hive> select transform(a.col) using
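A complete statement would look roughly like this (the output alias and table
name are placeholders, and I am assuming mypython.py reads rows from stdin and
writes rows to stdout):

hive> add file /home/pc/mypython.py;
hive> select transform(a.col) using 'python mypython.py' as (out_col)
    > from your_table a;

Once the file is added, it is shipped to each task's working directory, so you
reference it by name rather than by the local path on your machine.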
Hi all,
Recently, I ran into a problem that I cannot solve despite some effort, so I
am looking for help here; any help will be appreciated.
Thanks!
My case is described below:
I want to execute the HiveQL command :
select transform(a.col) using '/home/pc/mypython.
Hi,
Reading the wiki on dynamic partitions, there is a best-practice example for
avoiding the issue of creating too many dynamic partitions on a specific node.
However, the query does not work for me.
(http://wiki.apache.org/hadoop/Hive/Tutorial#Dynamic-partition_Insert)
Is this form of query supported?
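To be concrete, the form of query I mean is roughly the following (the table
and column names are placeholders standing in for the tutorial's, not my real
schema):

FROM raw_logs src
INSERT OVERWRITE TABLE page_views PARTITION (dt, country)
SELECT src.view_time, src.user_id, src.dt, src.country
DISTRIBUTE BY dt, country;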
FR
That was a much easier fix. And it works :)
Though I get an error in the hbase shell (ERROR: undefined method
getZooKeeperWrapper).
Viv
On Mon, Feb 28, 2011 at 2:01 PM, John Sichi wrote:
> You should try the latest Hive (either trunk or the 0.7 release branch)
> instead.
>
> JVS
>
> On Feb 28, 2
You should try the latest Hive (either trunk or the 0.7 release branch) instead.
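If you want to build from source, it is roughly the following (double-check
the branch path against the Apache SVN layout):

svn checkout http://svn.apache.org/repos/asf/hive/trunk hive-trunk
cd hive-trunk
ant clean package

(For the 0.7 release branch, check out branches/branch-0.7 instead of trunk.)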
JVS
On Feb 28, 2011, at 5:41 AM, Vivek Krishna wrote:
> In short, I am trying to make the hbase_handler work with hive-0.6 and
> hbase-0.90.1.
>
> I am trying to integrate HBase and Hive. There is pretty good
>
Hadoop 0.20.2 has HADOOP_HEAPSIZE (possibly commented out) defined in
$HADOOP_CONF_DIR/hadoop-env.sh. $HADOOP_HOME/bin/hadoop uses this shell
variable to construct the max heap size.
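For example (the value is in MB and only an illustration; pick something that
fits your machine):

# in $HADOOP_CONF_DIR/hadoop-env.sh, uncomment and adjust:
export HADOOP_HEAPSIZE=2000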
1. Can you change HADOOP_HEAPSIZE and try? Either export it on the command
line or change hadoop-env.sh.
2. Worst case - run
>
> In short, I am trying to make the hbase_handler work with hive-0.6 and
> hbase-0.90.1.
>
> I am trying to integrate HBase and Hive. There is pretty good
> documentation at http://wiki.apache.org/hadoop/Hive/HBaseIntegration .
>
> But it looks like the docs have become outdated. The hbase_handler was wri
I am also getting this error... any suggestions?
Hive: 0.6
Hadoop: 0.20.2
On Mon, Jun 7, 2010 at 1:03 AM, Shuja Rehman wrote:
> Hi all
> Thanks for the reply.
> I have changed the heap size to 1024, then 512, then even 100 in the
> specified file, but I am still getting this error.
> I think