Hello everyone, I am trying to set up a YARN cluster with three nodes (one master and two workers). I followed this tutorial: https://linode.com/docs/databases/hadoop/how-to-install-and-set-up-hadoop-cluster/
I also tried to run the YARN example at the end of that tutorial, the wordcount job. After executing hadoop-mapreduce-examples-2.8.5.jar, I get Status : FAILED for several task attempts, even though the example finished without reporting any error. Does the FAILED status mean that an error occurred during the YARN job execution? If so, could you explain what exactly this error is? Below is the output I get on my screen. Thank you in advance, Dimitris Plakas
19/06/06 23:46:20 INFO client.RMProxy: Connecting to ResourceManager at node-master/192.168.0.1:8032
19/06/06 23:46:22 INFO input.FileInputFormat: Total input files to process : 3
19/06/06 23:46:23 INFO mapreduce.JobSubmitter: number of splits:3
19/06/06 23:46:23 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1559847675487_0010
19/06/06 23:46:24 INFO impl.YarnClientImpl: Submitted application application_1559847675487_0010
19/06/06 23:46:24 INFO mapreduce.Job: The url to track the job: http://node-master:8088/proxy/application_1559847675487_0010/
19/06/06 23:46:24 INFO mapreduce.Job: Running job: job_1559847675487_0010
19/06/06 23:46:38 INFO mapreduce.Job: Job job_1559847675487_0010 running in uber mode : false
19/06/06 23:46:38 INFO mapreduce.Job:  map 0% reduce 0%
19/06/06 23:46:48 INFO mapreduce.Job:  map 33% reduce 0%
19/06/06 23:46:55 INFO mapreduce.Job: Task Id : attempt_1559847675487_0010_m_000002_0, Status : FAILED
19/06/06 23:46:55 INFO mapreduce.Job: Task Id : attempt_1559847675487_0010_m_000001_0, Status : FAILED
19/06/06 23:47:04 INFO mapreduce.Job: Task Id : attempt_1559847675487_0010_r_000000_0, Status : FAILED
19/06/06 23:47:05 INFO mapreduce.Job:  map 67% reduce 0%
19/06/06 23:47:11 INFO mapreduce.Job: Task Id : attempt_1559847675487_0010_m_000001_1, Status : FAILED
19/06/06 23:47:21 INFO mapreduce.Job: Task Id : attempt_1559847675487_0010_r_000000_1, Status : FAILED
19/06/06 23:47:25 INFO mapreduce.Job: Task Id : attempt_1559847675487_0010_m_000001_2, Status : FAILED
19/06/06 23:47:44 INFO mapreduce.Job:  map 67% reduce 22%
19/06/06 23:47:45 INFO mapreduce.Job:  map 100% reduce 100%
19/06/06 23:47:46 INFO mapreduce.Job: Job job_1559847675487_0010 failed with state FAILED due to: Task failed task_1559847675487_0010_m_000001
Job failed as tasks failed.
failedMaps:1 failedReduces:0
19/06/06 23:47:46 INFO mapreduce.Job: Counters: 42
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=1078223
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=394942
		HDFS: Number of bytes written=0
		HDFS: Number of read operations=6
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=0
	Job Counters
		Failed map tasks=5
		Failed reduce tasks=2
		Killed map tasks=1
		Killed reduce tasks=1
		Launched map tasks=7
		Launched reduce tasks=3
		Other local map tasks=4
		Data-local map tasks=3
		Total time spent by all maps in occupied slots (ms)=359388
		Total time spent by all reduces in occupied slots (ms)=191720
		Total time spent by all map tasks (ms)=89847
		Total time spent by all reduce tasks (ms)=47930
		Total vcore-milliseconds taken by all map tasks=89847
		Total vcore-milliseconds taken by all reduce tasks=47930
		Total megabyte-milliseconds taken by all map tasks=46001664
		Total megabyte-milliseconds taken by all reduce tasks=24540160
	Map-Reduce Framework
		Map input records=2989
		Map output records=7432
		Map output bytes=746802
		Map output materialized bytes=762467
		Input split bytes=240
		Combine input records=7432
		Combine output records=7283
		Spilled Records=7283
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=400
		CPU time spent (ms)=5430
		Physical memory (bytes) snapshot=554012672
		Virtual memory (bytes) snapshot=3930853376
		Total committed heap usage (bytes)=408944640
	File Input Format Counters
		Bytes Read=394702
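Note that the client-side output above only reports Status : FAILED and does not include the task-level stack traces. Assuming log aggregation is enabled on the cluster (yarn.log-aggregation-enable=true; this may not be the case with the tutorial's default config), the per-container logs for the failed attempts should be retrievable with the standard yarn logs CLI, using the application id from my output:

```shell
# Fetch the aggregated container logs for the failed application.
# Run as the user that submitted the job, after the application has finished.
# Requires yarn.log-aggregation-enable=true in yarn-site.xml.
yarn logs -applicationId application_1559847675487_0010
```

If log aggregation is off, the same stack traces are visible per attempt through the ResourceManager web UI linked in the output above (http://node-master:8088/...), under the failed attempts of job_1559847675487_0010.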
--------------------------------------------------------------------- To unsubscribe e-mail: user-unsubscr...@spark.apache.org