On Tue, May 28, 2013 at 7:59 AM, Peter Marron <
peter.mar...@trilliumsoftware.com> wrote:

>  Hi,
>
> Hive 0.10.0 over Hadoop 1.0.4.
>
> Further to my filtering questions from before.
>
> I would like to be able to access the table properties from inside my
> custom InputFormat.
>
> I’ve done searches and there seem to be some other people who have had a
> similar problem.
>
> The closest I can see to a solution is to use
>
>                 MapredWork mrwork = Utilities.getMapRedWork(configuration);
>
> but this fails for me with the error below.
>
> I’m not truly surprised because I am trying to make sure that my query
> runs without a map/reduce, and some of the e-mails suggest that in this
> case:
>
> “…no mapred job is
> run, so this trick doesn't work (and instead, the Configuration object
> can be used, since it's local).”
>
> Any pointers would be very much appreciated.
>

Yeah, as you discovered, that only works in the MapReduce case and breaks
on cases like "select count(*)" that don't run in MapReduce.
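
For reference, the approach in question looks roughly like the sketch below when a map/reduce job does run. This is only a sketch of a helper one might add to the custom InputFormat, and the MapredWork accessors used here are my reading of the Hive 0.10 code, so treat them as assumptions:

    import java.util.Map;
    import java.util.Properties;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hive.ql.exec.Utilities;
    import org.apache.hadoop.hive.ql.plan.MapredWork;
    import org.apache.hadoop.hive.ql.plan.PartitionDesc;

    // Only useful inside a real map task: for a fetch-only query no MapredWork
    // has been serialized into the job, so this approach falls over.
    Properties tablePropertiesFromPlan(Configuration configuration, String splitPath) {
        MapredWork mrwork = Utilities.getMapRedWork(configuration);
        for (Map.Entry<String, PartitionDesc> entry :
                mrwork.getPathToPartitionInfo().entrySet()) {
            // Keys are input paths; values carry the table/partition descriptors.
            if (splitPath.startsWith(entry.getKey())) {
                return entry.getValue().getTableDesc().getProperties();
            }
        }
        return null;
    }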

I haven't tried it, but it looks like the best you can do with the current
interface is to implement a SerDe, which is passed the table properties in
initialize. To get them over to the InputFormat, I'd try a thread-local
variable. It looks like getRecordReader is called soon after the SerDe's
initialize, although I didn't do a very deep search of the code.
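
A minimal sketch of that idea, assuming a SerDe built on LazySimpleSerDe and a TextInputFormat-based reader. The holder class and all names here are hypothetical, and the hand-off only works if initialize and getRecordReader happen to run on the same thread:

    import java.io.IOException;
    import java.util.Properties;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hive.serde2.SerDeException;
    import org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.InputSplit;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.RecordReader;
    import org.apache.hadoop.mapred.Reporter;
    import org.apache.hadoop.mapred.TextInputFormat;

    // Hypothetical holder that carries the table properties from the SerDe
    // to the InputFormat on the current thread.
    final class TablePropertiesHolder {
        private static final ThreadLocal<Properties> PROPS = new ThreadLocal<Properties>();
        static void set(Properties p) { PROPS.set(p); }
        static Properties get() { return PROPS.get(); }
    }

    // SerDe that stashes the table properties it is handed in initialize().
    public class PropertyCapturingSerDe extends LazySimpleSerDe {
        @Override
        public void initialize(Configuration conf, Properties tbl) throws SerDeException {
            super.initialize(conf, tbl);
            TablePropertiesHolder.set(tbl);
        }
    }

    // InputFormat that picks the properties back up in getRecordReader().
    class PropertyAwareInputFormat extends TextInputFormat {
        @Override
        public RecordReader<LongWritable, Text> getRecordReader(
                InputSplit split, JobConf job, Reporter reporter) throws IOException {
            // May be null if initialize never ran on this thread.
            Properties tblProps = TablePropertiesHolder.get();
            // ... use tblProps to drive any filtering before delegating ...
            return super.getRecordReader(split, job, reporter);
        }
    }

The table would then point at the custom SerDe and InputFormat with the usual ROW FORMAT SERDE ... STORED AS INPUTFORMAT ... clauses in its CREATE TABLE statement.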

-- Owen
