Thanks for the replies,
-Steve:
I know that I can use the JobClient to run or submit jobs; however, for the
time being I need to exec the job as a separate process.
-Edward:
The forked job is not executed from within a map or reduce task, so I don't
need to do data collection.
It seems that the output of the reduce tasks is not written to stdout: when I
tried to redirect the output to a temp file with (hadoop jar GraphClean args >
tmp), nothing was written to the file and the output still appeared on the
screen, which suggests it is going to stderr instead.
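For what it's worth, Hadoop's console logging (the job progress lines) is most likely written to stderr rather than stdout, which would explain why `> tmp` captured nothing while the text still reached the screen; redirecting with `2> tmp` on the shell side, or merging the streams on the Java side, should capture it. Here is a minimal sketch of the Java side using `ProcessBuilder.redirectErrorStream(true)`; it uses `sh -c` writing to stderr to stand in for the real child process (in the real case the command would be your `hadoop jar GraphClean args` invocation):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ForkJob {
    // Runs a child process and returns its combined stdout + stderr output.
    public static String runAndCapture(String... cmd) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectErrorStream(true);      // merge the child's stderr into stdout
        Process p = pb.start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            // readLine() returns null once the child exits and closes the stream,
            // so this loop terminates on its own -- no busy-wait needed.
            while ((line = reader.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Simulate a child that writes only to stderr, as Hadoop's logger does.
        // Real usage: runAndCapture("hadoop", "jar", "GraphClean", "args")
        String captured = runAndCapture("sh", "-c", "echo progress-line 1>&2");
        System.out.print(captured);
    }
}
```

With `redirectErrorStream(true)`, a single reader on `p.getInputStream()` sees both streams; without it you would need a second thread draining `p.getErrorStream()`, or the child can block once its stderr pipe buffer fills up.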
Regards,
Razen
Razen Alharbi wrote:
>
> Hi all,
>
> I am writing an application in which I create a forked process to execute
> a specific Map/Reduce job. The problem is that when I try to read the
> output stream of the forked process I get nothing and when I execute the
> same job manually it starts printing the output I am expecting. For
> clarification I will go through the simple code snippet:
>
>
> Runtime rt = Runtime.getRuntime();
> Process p = rt.exec("hadoop jar GraphClean args");
> BufferedReader reader = new BufferedReader(new
>     InputStreamReader(p.getInputStream()));
> String line = null;
> boolean check = true;
> while (check) {
>     line = reader.readLine();
>     if (line != null) { // I know this will not finish; it's only for testing.
>         System.out.println(line);
>     }
> }
>
> If I run this code, nothing shows up. But if I execute the command (hadoop
> jar GraphClean args) from the command line, it works fine. I am using
> Hadoop 0.19.0.
>
> Thanks,
>
> Razen
>
>
>
>
--
View this message in context:
http://www.nabble.com/I-need-help-tp23273273p23284528.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.