You're the man! When I included the 'statusdir' param, I got the following
output in stderr.
Exception in thread "main" java.io.FileNotFoundException: /tmp/camus_non_avro.properties (No such file or directory)
        at java.io.FileInputStream.open(Native Method)
        at java.io.FileInputStrea
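That FileNotFoundException looks like the launcher task is trying to read
/tmp/camus_non_avro.properties on whichever cluster node it lands on, where the
file doesn't exist. A rough sketch of one workaround, assuming WebHCat's
'files' parameter localizes the file into the launcher task's working directory
(worth verifying for your version; the HDFS path here is just a placeholder):

hadoop fs -put /home/hadoop/camus_non_avro.properties /user/hadoop/camus_non_avro.properties

and then in the REST call, something like:

     -d files=/user/hadoop/camus_non_avro.properties \
     -d arg=-P \
     -d arg=camus_non_avro.properties \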
It looks like in 0.11 it writes to stderr (limited logging anyway).
Perhaps you can try adding the 'statusdir' param to your REST call and see if
anything useful is written to that directory.
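For reference, a rough sketch of the full call with statusdir added; the
endpoint URL, port, and HDFS status path are placeholders rather than values
from this thread:

curl -s -d user.name=hadoop \
     -d jar=/tmp/camus-non-avro-consumer-1.0-SNAPSHOT-jar-with-dependencies.jar \
     -d class=com.linkedin.camus.etl.kafka.CamusJob \
     -d arg=-P \
     -d arg=/tmp/camus_non_avro.properties \
     -d statusdir=/user/hadoop/camus_status \
     'http://<webhcat-host>:50111/templeton/v1/mapreduce/jar'

Once the controller job finishes, the captured output should be readable with:

hadoop fs -cat /user/hadoop/camus_status/stderr
hadoop fs -cat /user/hadoop/camus_status/stdout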
On Mon, Dec 30, 2013 at 2:22 PM, Jonathan Hodges wrote:
> I don't see 'TrivialExecService' output in the
I don't see 'TrivialExecService' output in the JobTracker or TaskTracker
logs. We are using Hive 0.11 though, so maybe it's not set to DEBUG?
On Mon, Dec 30, 2013 at 2:11 PM, Eugene Koifman wrote:
> Is there any output from TrivialExecService class in any hadoop logs?
> (it's DEBUG level log4j outpu
Is there any output from the TrivialExecService class in any of the Hadoop logs?
(It's DEBUG-level log4j output in Hive 0.12.)
It should print the command that TempletonControllerJob's launcher task
(LaunchMapper) is trying to launch.
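If nothing shows up, one option (a sketch, assuming you can adjust the log4j
config that the launcher task picks up, which varies by cluster setup) is to
raise that logger to DEBUG explicitly. Note the package name differs by
release:

# Hive 0.12 package layout; Hive 0.11 used org.apache.hcatalog.templeton.tool instead
log4j.logger.org.apache.hive.hcatalog.templeton.tool.TrivialExecService=DEBUG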
On Mon, Dec 30, 2013 at 12:55 PM, Jonathan Hodges wrote:
> I didn't try
I didn't try that before, but I just did.
curl -s -d user.name=hadoop \
     -d jar=/tmp/camus-non-avro-consumer-1.0-SNAPSHOT-jar-with-dependencies.jar \
     -d class=com.linkedin.camus.etl.kafka.CamusJob \
     -d arg=-P \
     -d arg=/tmp/camus_non_avro.properties \
     ...
Have you tried adding
-d arg=-P
before
-d arg=/tmp/properties
On Mon, Dec 30, 2013 at 11:14 AM, Jonathan Hodges wrote:
> Sorry accidentally hit send before adding the lines from webhcat.log
>
> DEBUG | 30 Dec 2013 19:08:01,042 | org.apache.hcatalog.templeton.Server |
> queued job job_20131
Sorry, accidentally hit send before adding the lines from webhcat.log.
DEBUG | 30 Dec 2013 19:08:01,042 | org.apache.hcatalog.templeton.Server | queued job job_201312212124_0161 in 267 ms
DEBUG | 30 Dec 2013 19:08:38,880 | org.apache.hcatalog.templeton.tool.HDFSStorage | Couldn't find /templeton-ha
Hi,
I am trying to kick off a MapReduce job via WebHCat. The following is the
hadoop jar command:
hadoop jar /home/hadoop/camus-non-avro-consumer-1.0-SNAPSHOT-jar-with-dependencies.jar \
    com.linkedin.camus.etl.kafka.CamusJob \
    -P /home/hadoop/camus_non_avro.properties
As you can see there is an app