Aishwarya,

Are you running in local mode? If not, you probably want to run
hadoop jar ../contrib/streaming/hadoop-0.20.2-streaming.jar -file ~/mapper.sh -mapper ./mapper.sh -input ../foo.txt -output output

You may also want to run hadoop fs -ls output/* to see what files were produced. If your mappers failed for some reason, there will be no files in the output directory. You may also want to look at the stderr logs for your processes through the web UI.

--Bobby Evans

On 10/6/11 3:30 PM, "Aishwarya Venkataraman" <avenk...@cs.ucsd.edu> wrote:

I ran the following (I am using IdentityReducer):

./hadoop jar ../contrib/streaming/hadoop-0.20.2-streaming.jar -file ~/mapper.sh -mapper ~/mapper.sh -input ../foo.txt -output output

When I do ./hadoop dfs -cat output/* I do not see any output on screen. Is this how I view the output of the mapper?

Thanks,
Aishwarya

On Thu, Oct 6, 2011 at 12:37 PM, Robert Evans <ev...@yahoo-inc.com> wrote:
> A streaming job's stderr is logged for the task, but its stdout is what is
> sent to the reducer. The simplest way to get it is to turn off the
> reducers and then look at the output in HDFS.
>
> --Bobby Evans
>
> On 10/6/11 1:16 PM, "Aishwarya Venkataraman" <avenk...@cs.ucsd.edu> wrote:
>
> Hello,
>
> I want to view the mapper output for a given Hadoop streaming job (one that
> runs a shell script). However, I am not able to find this in any log files.
> Where should I look for this?
>
> Thanks,
> Aishwarya

--
Thanks,
Aishwarya Venkataraman
avenk...@cs.ucsd.edu
Graduate Student | Department of Computer Science
University of California, San Diego
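
For reference, a rough sketch of the map-only approach Bobby describes, using the same mapper.sh and foo.txt from this thread (the -D mapred.reduce.tasks=0 setting is assumed to be the way to disable the reduce phase in hadoop-0.20.2 streaming, so the mapper's stdout is written straight to HDFS):

    # remove any previous output directory; the job fails if it already exists
    hadoop fs -rmr output

    # run the streaming job with zero reducers; -file ships mapper.sh to the
    # task's working directory, so it is invoked as ./mapper.sh
    hadoop jar ../contrib/streaming/hadoop-0.20.2-streaming.jar \
        -D mapred.reduce.tasks=0 \
        -file ~/mapper.sh -mapper ./mapper.sh \
        -input ../foo.txt -output output

    # list the part files the map tasks produced (one part-* file per mapper)
    hadoop fs -ls output

    # print the raw mapper output
    hadoop fs -cat 'output/part-*'

If hadoop fs -ls output shows no part files, check the per-task stderr logs in the JobTracker web UI, since that usually means the mapper script itself failed.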