Have you tried adding
-d arg=-P
before
-d arg=/tmp/....properties ?
WebHCat passes each "arg" value through to the jar's main() as a separate argument, in the order given, so the -P flag needs its own arg entry ahead of the properties path.
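For reference, a sketch of the full corrected submission, reusing the endpoint and HDFS paths from your message — the only change from your curl command is the extra arg field and its position:

```shell
# Sketch of the corrected WebHCat submission. The two 'arg' fields are kept
# in order so the jar's main() sees: -P /tmp/camus_non_avro.properties
fields=(
  -d user.name=hadoop
  -d jar=/tmp/camus-non-avro-consumer-1.0-SNAPSHOT-jar-with-dependencies.jar
  -d class=com.linkedin.camus.etl.kafka.CamusJob
  -d arg=-P
  -d arg=/tmp/camus_non_avro.properties
)
# Submit (commented out here; it needs the live cluster):
# curl -s "${fields[@]}" \
#   'http://internal-daalt-hcatalog-1507773817.us-east-1.elb.amazonaws.com/templeton/v1/mapreduce/jar'
printf '%s\n' "${fields[@]}"
```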


On Mon, Dec 30, 2013 at 11:14 AM, Jonathan Hodges <hodg...@gmail.com> wrote:

> Sorry, I accidentally hit send before adding the lines from webhcat.log
>
> DEBUG | 30 Dec 2013 19:08:01,042 | org.apache.hcatalog.templeton.Server |
> queued job job_201312212124_0161 in 267 ms
>
> DEBUG | 30 Dec 2013 19:08:38,880 |
> org.apache.hcatalog.templeton.tool.HDFSStorage | Couldn't find
> /templeton-hadoop/jobs/job_201312212124_0161/notified: File does not exist:
> /templeton-hadoop/jobs/job_201312212124_0161/notified
>
> DEBUG | 30 Dec 2013 19:08:38,881 |
> org.apache.hcatalog.templeton.tool.HDFSStorage | Couldn't find
> /templeton-hadoop/jobs/job_201312212124_0161/callback: File does not exist:
> /templeton-hadoop/jobs/job_201312212124_0161/callback
>
>
> Any ideas?
>
>
> On Mon, Dec 30, 2013 at 12:13 PM, Jonathan Hodges <hodg...@gmail.com> wrote:
>
>> Hi,
>>
>> I am trying to kick off a mapreduce job via WebHCat.  The following is
>> the hadoop jar command.
>>
>> hadoop jar
>> /home/hadoop/camus-non-avro-consumer-1.0-SNAPSHOT-jar-with-dependencies.jar
>> com.linkedin.camus.etl.kafka.CamusJob -P
>> /home/hadoop/camus_non_avro.properties
>>
>> As you can see, there is an application-specific parameter, '-P', which
>> designates the properties file location.  How do I pass this to WebHCat?
>>
>> Referring to the docs (
>> https://cwiki.apache.org/confluence/display/Hive/WebHCat+Reference+MapReduceJar)
>> I came up with the following.
>>
>> curl -s -d user.name=hadoop \
>>        -d jar=/tmp/camus-non-avro-consumer-1.0-SNAPSHOT-jar-with-dependencies.jar \
>>        -d class=com.linkedin.camus.etl.kafka.CamusJob \
>>        -d arg=/tmp/camus_non_avro.properties \
>>        'http://internal-daalt-hcatalog-1507773817.us-east-1.elb.amazonaws.com/templeton/v1/mapreduce/jar'
>>
>> This command gets the following response from WebHCat
>> {"id":"job_201312212124_0161"}
>>
>> However, I only see TempletonControllerJob in the JobTracker UI.  I don't
>> see the Camus jobs that show up when the job is run at the command line.
>>
>> The following are the only things showing in webhcat.log
>>
>>
>> The jar and properties files are in the /tmp directory on HDFS.
>>
>> hadoop fs -ls /tmp
>> -rw-r--r--   2 hadoop supergroup   41456481 2013-12-27 17:45
>> /tmp/camus-non-avro-consumer-1.0-SNAPSHOT-jar-with-dependencies.jar
>> -rw-r--r--   2 hadoop supergroup       2605 2013-12-27 17:45
>> /tmp/camus_non_avro.properties
>>
>
>
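Once the job is submitted you can also poll its state through WebHCat itself rather than the JobTracker UI. A sketch, assuming the same endpoint and the "queue/:jobid" status resource of this Hive release (later releases rename it to "jobs/:jobid"):

```shell
# Hypothetical status check for the job id WebHCat returned on submission.
jobid=job_201312212124_0161
base='http://internal-daalt-hcatalog-1507773817.us-east-1.elb.amazonaws.com/templeton/v1'
status_url="$base/queue/$jobid?user.name=hadoop"
# curl -s "$status_url"   # returns the job's JSON status
echo "$status_url"
```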
