Thanks, Vinod. We'll try and find this out.
On Sat, Feb 23, 2013 at 3:25 AM, Vinod Kumar Vavilapalli <
vino...@hortonworks.com> wrote:
> If we can enable access logs on the meta store and figure out the last
> query before the one that failed, it should be easy to track it.
>
> I did find a coup
Hi guys,
Thank you very much for pointing me in the right direction. I am glad
that this community is so active.
William
On Sun, Feb 24, 2013 at 4:29 PM, Dean Wampler
wrote:
> Wow! You guys are my new best friends!
>
> Seriously, I'm grateful you've found my participation in the list and the
>
Thank you so much Bejoy,
That was my issue.
Now that I have seen the config file, I realize I was the one needing a
shared metastore database.
Thanks again,
Regards
Cyril
On Mon, Feb 25, 2013 at 10:47 AM, Bejoy wrote:
> Hi Cyril
>
> I believe you are using the Derby metastore, in which case it would be an
> iss
Hi Cyril
I believe you are using the Derby metastore, in which case it would be an issue
with the Hive configs.
Derby tries to create a metastore in whatever directory you start hive from.
The tables you imported with Sqoop are registered in the metastore under
HIVE_HOME, and hence you cannot see the tables when you start hive elsewhere.
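One common fix (a sketch; the path below is made up) is to point the embedded
Derby metastore at an absolute location in hive-site.xml, so the same database
is used no matter where hive is started:

  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=/home/cyril/metastore_db;create=true</value>
  </property>

Alternatively, always start hive from the same working directory.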
I do not get any errors.
The problem shows up only when I run hive and try to query the tables I
imported. Say I want to get only the numeric tuples for a given table: I
cannot find the table (show tables; comes back empty) unless I go into the
Hive home folder and run hive again. I would expect the state of hive to be t
Do you see any errors?
On Mon, Feb 25, 2013 at 8:48 PM, Cyril Bogus wrote:
> Hi everyone,
>
> My setup is Hadoop 1.0.4, Hive 0.9.0, Sqoop 1.4.2 (hadoop-1.0.0 build),
> and Mahout 0.7.
>
> I have imported tables from a remote database directly into Hive using
> Sqoop.
>
> Somehow when I try to run Sqoop from Ha
Hi everyone,
My setup is Hadoop 1.0.4, Hive 0.9.0, Sqoop 1.4.2 (hadoop-1.0.0 build),
and Mahout 0.7.
I have imported tables from a remote database directly into Hive using
Sqoop.
Somehow when I try to run Sqoop from Hadoop, the content
Hive is giving me trouble keeping track of where the imported tables are.
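For reference, an import of this kind typically looks something like the
following (host, database, credentials, and table name are all made up):

  sqoop import --connect jdbc:mysql://dbhost/mydb \
    --username dbuser -P \
    --table customers --hive-import

The --hive-import flag makes Sqoop create the Hive table and load the data
after copying it into HDFS.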
Is log a file or a directory?
Creating an external table over a directory might be easier, since it would
pick up any log file placed in the folder.
If log is a file, try:
LOCATION '/a/b/c/d/';
If log is a directory, try:
LOCATION '/a/b/c/d/log/';
Regards, Arthur
From: Abhishek Gayakwad [mailt
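To make the directory variant concrete, a minimal sketch (the table and
column definitions are made up):

  CREATE EXTERNAL TABLE logs (line STRING)
  LOCATION '/a/b/c/d/log/';

Any file later dropped into /a/b/c/d/log/ will then show up in the table
automatically.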
Do a $HADOOP_HOME/bin/hadoop dfs -ls /a/b/c/d/log
You should see it listed as a file.
Also, when you give a LOCATION in a create table statement, make it a
directory.
On Mon, Feb 25, 2013 at 6:04 PM, Abhishek Gayakwad wrote:
> I am using Hive 0.9.0, while creating external table
>
> create external tab
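If log is currently a single file, one way to follow this advice (a sketch;
the directory name is made up) is to move the file into its own directory and
point LOCATION there:

  $HADOOP_HOME/bin/hadoop dfs -mkdir /a/b/c/d/logdir
  $HADOOP_HOME/bin/hadoop dfs -mv /a/b/c/d/log /a/b/c/d/logdir/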
> I'll build the trunk and try to test it. Then I'll let you know if that
> helped.
I've just run the test on the current trunk. The problem is still
there, no changes.
Any ideas? Perhaps there is something interesting in my config? I've added it
to the gist: https://gist.github.com/zygm0nt/5028591#f
Hi guys,
Does anybody know if the methods getTimestamp(int columnIndex, Calendar cal)
and getTimestamp(String columnName, Calendar cal) are going to be implemented
in a future version? I am grateful that the TIMESTAMP type is now supported in
0.10.0 (thanks to the devs!!!), and for now I will create a wo
On 20/02/13 00:19, Mark Grover wrote:
> Could you try setting mapred.job.tracker property in your
> mapred-site.xml to your jobtracker and see if that fixes it?
I've been setting it from within hive with:
set mapred.job.tracker = 'host:port'
but this didn't help. Is setting that property in mapred-si
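Worth noting: the Hive CLI set command takes the value verbatim, so the quotes
above become part of the value. The usual form is unquoted, e.g. (the
jobtracker host and port are made up):

  set mapred.job.tracker=jtracker.example.com:8021;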