Thanks everybody,
The issue was that Hadoop writes all of its output, including the usual job
progress messages, to stderr instead of stdout. I would really love to know
why it does that.
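For anyone who runs into the same thing: as far as I can tell, Hadoop's
console logging goes through log4j, whose default console appender targets
System.err, which is why nothing ever reaches the child's stdout. Below is a
minimal sketch of the fix, assuming the same GraphClean jar and arguments as
in my original message below (the class name is just a placeholder). It
merges the child's stderr into its stdout with ProcessBuilder so a single
reader sees everything:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunHadoopJob {
    public static void main(String[] args) throws Exception {
        // Hadoop prints job progress on stderr, so merge stderr
        // into stdout before starting the child process.
        ProcessBuilder pb =
            new ProcessBuilder("hadoop", "jar", "GraphClean", "args");
        pb.redirectErrorStream(true);
        Process p = pb.start();

        BufferedReader reader = new BufferedReader(
            new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line); // progress lines appear here now
        }
        System.out.println("hadoop exited with " + p.waitFor());
    }
}

If you need the two streams separately, read p.getErrorStream() on its own
thread instead; draining only one pipe can deadlock once the other pipe's
buffer fills up.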
Thanks again.
Razen
Razen Alharbi wrote:
>
> Hi all,
>
> I am writing an application in which I fork a process to execute a
> specific Map/Reduce job. The problem is that when I try to read the
> output stream of the forked process I get nothing, while executing the
> same job manually prints the output I am expecting. For clarification,
> here is the code snippet:
>
>
> import java.io.BufferedReader;
> import java.io.InputStreamReader;
>
> Runtime rt = Runtime.getRuntime();
> Process p = rt.exec("hadoop jar GraphClean args");
> BufferedReader reader = new BufferedReader(new
>         InputStreamReader(p.getInputStream()));
> String line;
> // readLine() returns null at end of stream, so this loop terminates;
> // my original busy-wait version was only for testing.
> while ((line = reader.readLine()) != null) {
>     System.out.println(line);
> }
>
> If I run this code, nothing shows up, but if I execute the same command
> (hadoop jar GraphClean args) from the command line, it works fine. I am
> using Hadoop 0.19.0.
>
> Thanks,
>
> Razen