Looks like this column is not even there in the 0.8/0.9 schema files. I have
no idea how it ended up in my schema. I just set a default 'false' value and
I'm fine now.
Sam
On Jan 4, 2013, at 2:22 PM, Sam William wrote:
> When I upgraded to 0.9.0, I'm getting an exception
... doesn't have a default value
The upgrade script from 0.8 to 0.9 doesn't have anything for this? What am I missing?
Sam William
sa...@stumbleupon.com
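For anyone hitting the same upgrade error, a minimal sketch of the manual fix described above, assuming a MySQL-backed metastore; SOME_TABLE and SOME_COLUMN are placeholders for whatever the "doesn't have a default value" message names:

  -- Placeholders: substitute the table/column from the actual exception.
  -- Use DEFAULT 0 instead of false if the column is a bit/tinyint type.
  ALTER TABLE SOME_TABLE ALTER COLUMN SOME_COLUMN SET DEFAULT false;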
Wow, this works. Thanks!
Sam
On Jun 20, 2012, at 5:01 PM, Mapred Learn wrote:
> Hi Sam,
> Could you try '\001' instead of '\u0001' ?
>
> Sent from my iPhone
>
> On Jun 20, 2012, at 3:57 PM, Sam William wrote:
>
>>
>>
>> Mark,
>
> You add the word external in the create table, i.e.
>
> Create external table (...blah...blah...)
>
> Sent from my iPhone
>
> On Jun 19, 2012, at 4:15 PM, Sam William <sa...@stumbleupon.com> wrote:
>
> Hi,
> I have a data file that ...
... terminated by '\u0001' stored as textfile location '/tmp/myloc';
did not work.
Thanks
Sam William
sa...@stumbleupon.com
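For reference, a sketch of the statement that ended up working once the '\001' escape suggested above is used; the table name and column list are placeholders, while the delimiter and location come from the original question:

  CREATE EXTERNAL TABLE my_table (col1 STRING, col2 STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
  STORED AS TEXTFILE
  LOCATION '/tmp/myloc';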
>> ... query like hive -f hive.hql -hiveconf parameter1=`echo $env-value`
>> and then in the hive.hql file you can access this command-line parameter
>> with '${hiveconf:parameter1}'
>>
>> ~Nitin
>>
>> On Thu, May 3, 2012 at 11:53 AM, Sam William wrote:
...est option.
Sam William
sa...@stumbleupon.com
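A small sketch of the -hiveconf pattern Nitin describes; the table and column below are placeholders, and $ENV_VALUE stands in for whatever shell value is being passed through:

  hive -f hive.hql -hiveconf parameter1="$ENV_VALUE"

and inside hive.hql the value is substituted wherever it is referenced:

  SELECT * FROM some_table WHERE dt = '${hiveconf:parameter1}';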
... general-purpose functions
on top of the pure JDK API, e.g. string/date/math functions. I'm hoping this is
doable with the HIVE-2655 patch?
Sam William
sa...@stumbleupon.com
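Depending on the Hive version in use, part of this is already covered by the built-in reflect() UDF, which calls a static JDK method by class and method name; a sketch (dual is just a one-row helper table):

  SELECT reflect('java.lang.Math', 'max', 2, 3),
         reflect('java.lang.String', 'valueOf', 1)
  FROM dual;

Whether that overlaps with what the HIVE-2655 patch enables is a separate question.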
Oops, sorry ... found multiple repos with hive jars. Thanks.
Sam
On Apr 10, 2012, at 12:37 PM, Edward Capriolo wrote:
> Yes hive is in maven.
> ... is a great site with a search form:
> http://mvnrepository.com/artifact/org.apache.hive/hive-common
>
> On Tue, Apr 10, 2012 at 3:34
Are hive jars available on any public maven repos? If not, is there a way to
ask ant to install the built jars into my local ~/.m2/repository?
Sam William
sa...@stumbleupon.com
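Two routes, for the record: depend on the published artifacts (e.g. org.apache.hive:hive-common from the link above), or push a locally built jar into ~/.m2 by hand. A sketch of the latter, with placeholder path and version, assuming Maven is installed:

  mvn install:install-file \
    -Dfile=build/dist/lib/hive-exec-0.9.0.jar \
    -DgroupId=org.apache.hive \
    -DartifactId=hive-exec \
    -Dversion=0.9.0 \
    -Dpackaging=jar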
> then alias the hive command to hive -i /etc/hiverc
>
> On Fri, Apr 6, 2012 at 1:05 AM, Sam William wrote:
> Hi,
> I have this external jar with UDFs. I do not want everyone in the
> company who uses these functions to have to run add jar blah.jar; create temporary
> function ...
... What are my options?
Sam William
sa...@stumbleupon.com
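A sketch of what the /etc/hiverc approach above ends up containing; the jar path, function name, and UDF class are placeholders (blah.jar is from the original mail):

  ADD JAR /usr/lib/hive/aux/blah.jar;
  CREATE TEMPORARY FUNCTION my_udf AS 'com.example.hive.udf.MyUdf';

With the CLI aliased to hive -i /etc/hiverc as suggested, the functions are registered on every session without anyone typing the statements by hand.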
... install?
>
> On Thu, Feb 16, 2012 at 11:58 AM, Sam William wrote:
> We recently upgraded to Hive-0.8.0 and noticed that all queries fail when
> run as hive -e, with the error:
>
>
> sampd@face:~$ hive -e "show tables;"
>
> FAILED: Parse Error: line ...
The -f option works, though. Has anyone else faced this?
Sam William
sa...@stumbleupon.com
I was able to get past this.
The solution is to use thrift-0.6.0 with the following patch:
https://issues.apache.org/jira/browse/THRIFT-1060
Sam
On Feb 8, 2012, at 5:42 PM, Sam William wrote:
> Hi,
> I'm trying to build the HiveODBC driver. The Hive source code base I' ...
...3: error: reference to ‘eventHandler_’ is ambiguous
...
I tried with a couple of versions of Thrift, 0.9.0-dev and 0.5.0. Neither
of them proved to be good. Has it got to do with the Thrift library version?
What's the fix? Any help is appreciated.
Thanks,
Sam W
...,
Sam
On Jan 31, 2012, at 11:50 AM, Sam William wrote:
>
> I have a new Hive installation. I'm able to create tables and do select *
> queries from them. But as soon as I try to execute a query that would
> involve a Hadoop M/R job, I get this exception:
>
... at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
The table is pretty simple. It is an external table on HDFS and does
not have any partitions. Any idea why this could be happening?
Thanks,
Sam William
sa...@stumbleupon.com
... get this error:
Failed with exception java.io.IOException:java.io.IOException: No LZO codec
found, cannot run.
What am I missing? Any help is appreciated.
Thanks,
Sam William
sa...@stumbleupon.com
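This error usually means no LZO codec class is registered with Hadoop on the nodes running the job. A sketch of the core-site.xml entries, assuming the hadoop-lzo package (com.hadoop.compression.lzo) and its native libraries are installed:

  <property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.GzipCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
  </property>
  <property>
    <name>io.compression.codec.lzo.class</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
  </property>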
...the inbuilt functions. What options do I
have other than modifying FunctionRegistry and recompiling?
Sam William
sa...@stumbleupon.com
... ~Abhishek
>
> On Thu, Dec 1, 2011 at 10:21 AM, sonia gehlot <sonia.geh...@gmail.com> wrote:
>
> Hi All,
>
> I have Unix timestamps in my table, in UTC. Is there any inbuilt
> function to convert them into PST or PDT in YYY...
I have all my slave nodes in the PDT timezone. However, when I run this query:
select from_unixtime(unix_timestamp()) from dual; (dual is a one-row table
that I created),
I get the date/time in UTC. What do I do to get the PDT time (I don't want to
write a UDF for this)?
Sam William
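If the cluster is on Hive 0.8.0 or later, from_utc_timestamp can do the shift without a custom UDF. A sketch that works because from_unixtime above is rendering the time in UTC (dual being the same one-row table):

  SELECT from_utc_timestamp(from_unixtime(unix_timestamp()), 'PST') FROM dual;

The other common route is to set the JVM default timezone for the CLI and task JVMs (e.g. -Duser.timezone=America/Los_Angeles), since from_unixtime formats using the JVM's default zone.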
> ... now I want to load all files
> under /app into one table. Any ideas?
>
> R
Sam William
sa...@stumbleupon.com
> Vaibhav
>
> -Original Message-
> From: Sam William [mailto:sa...@stumbleupon.com]
> Sent: Monday, August 29, 2011 3:56 PM
> To: user@hive.apache.org
> Subject: HIVE_AUX_JARS_PATH
>
> I assume you need to set HIVE_AUX_JARS_PATH to the jars that contain custom
classes. I get a class not found error. What am I
missing here?
Sam William
sa...@stumbleupon.com
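For what it's worth, HIVE_AUX_JARS_PATH in releases of this era is a comma-separated list of jar files (not a directory), exported before the CLI starts or set in hive-env.sh. A sketch with placeholder paths:

  export HIVE_AUX_JARS_PATH=/usr/lib/hive/aux/my-serde.jar,/usr/lib/hive/aux/my-udfs.jar
  hive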
> ... when trying to do a SELECT on the table:
>
> Failed with exception java.io.IOException:java.io.IOException: Not a file:
> hdfs://path/to/partition/path/to/subdir
>
> Also, it seems to ignore directories prefixed by an underscore (_directory).
>
> I am using hive 0.7.1 on Hadoop 0.20.2.
>
> Is there a way to force Hive to ignore all subdirectories in external tables
> and only look at files?
>
> Thanks in advance,
> -Dave
>
Sam William
sa...@stumbleupon.com
... query on the table, I get an exception:
Failed with exception java.io.IOException:java.io.IOException: Not a file: ...
This is in spite of setting mapred.input.dir.recursive=true;
Is this a supported feature in Hive? Any alternatives?
Sam William
sa...@stumbleupon.com
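For later readers: reading files out of subdirectories of a table or partition location also needs support on the Hive side, which arrived in releases after 0.7.1 as far as I can tell. A sketch of the relevant settings (the table name is a placeholder), with no guarantee they exist on older versions:

  SET mapred.input.dir.recursive=true;
  SET hive.mapred.supports.subdirectories=true;
  SELECT count(*) FROM my_external_table;

On versions without that support, the usual alternatives are to flatten the files into the table or partition directory, or to map each subdirectory to its own partition.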