[ https://issues.apache.org/jira/browse/HDFS-1488?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Allen Wittenauer resolved HDFS-1488.
------------------------------------
    Resolution: Not a Problem

> Hadoop terminates the web service process when a MapReduce task finishes
> -------------------------------------------------------------------------
>
>                 Key: HDFS-1488
>                 URL: https://issues.apache.org/jira/browse/HDFS-1488
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: namenode
>    Affects Versions: 0.20.2
>         Environment: OS: Windows XP + Cygwin + Hadoop 0.20.2 + MyEclipse 8.5
>            Reporter: oliverboss
>
> 1. In the MyEclipse 8.5 environment, I created a new Map/Reduce project named 
> "wordcount project".
> 2. I created a class named WordCount containing "public static void main(String[] args)".
> 3. I copied the Hadoop WordCount example code from the Hadoop folder into 
> "wordcount project".
> 4. In the main() method, I added a Jetty server and started it. The code is 
> shown below.
> 5. When I build and run it, I find that the Jetty server is terminated after 
> the Hadoop job finishes.
> 6. I checked the Hadoop JobTracker logs, shown below.
> ======================================================
>  logs
> 2010-11-05 16:47:41,968 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9001: readAndProcess threw exception java.io.IOException: connection was forcibly closed. Count of bytes read: 0
> java.io.IOException: connection was forcibly closed
>       at sun.nio.ch.SocketDispatcher.read0(Native Method)
>       at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:25)
>       at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:233)
>       at sun.nio.ch.IOUtil.read(IOUtil.java:206)
>       at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:236)
>       at org.apache.hadoop.ipc.Server.channelRead(Server.java:1214)
>       at org.apache.hadoop.ipc.Server.access$16(Server.java:1210)
>       at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:801)
>       at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:419)
>       at org.apache.hadoop.ipc.Server$Listener.run(Server.java:328)
> ==============================================
> code:
>       public static void main(String[] args) throws Exception {
>               // Handler that writes a simple HTML page for every request
>               Handler handler = new AbstractHandler() {
>                       @Override
>                       public void handle(String target, HttpServletRequest request,
>                                       HttpServletResponse response, int dispatch)
>                                       throws IOException, ServletException {
>                               response.setContentType("text/html");
>                               response.setStatus(HttpServletResponse.SC_OK);
>                               response.getWriter().println("<h1>------start-------</h1>");
>                               // ---------------------------------------
>                               // ---------------------------------------
>                               response.getWriter().println("<h1>------end1-------</h1>");
>                               ((Request) request).setHandled(true);
>                               // request.getRequestDispatcher("/WebRoot/result.jsp").forward(request, response);
>                       }
>               };
>               // Start the Jetty server
>               Server server = new Server(8086);
>               server.setHandler(handler);
>               server.start();
>               // server.join();
>
>               // Build a timestamped output directory and run the WordCount job
>               SimpleDateFormat tempDate = new SimpleDateFormat("yyyy_MM_dd" + "_hh_mm_ss");
>               String datetime = tempDate.format(new java.util.Date());
>               String out4 = "out" + datetime;
>               args = new String[] { "in", out4 };
>               Configuration conf = new Configuration();
>               String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
>               if (otherArgs.length != 2) {
>                       System.err.println("Usage: wordcount <in> <out>");
>                       System.exit(2);
>               }
>               Job job = new Job(conf, "word count");
>               job.setJarByClass(WordCount.class);
>               job.setMapperClass(TokenizerMapper.class);
>               job.setCombinerClass(IntSumReducer.class);
>               job.setReducerClass(IntSumReducer.class);
>               job.setOutputKeyClass(Text.class);
>               job.setOutputValueClass(IntWritable.class);
>               FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
>               FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
>               // System.exit() terminates the whole JVM, including the embedded Jetty server
>               System.exit(job.waitForCompletion(true) ? 0 : 1);
>       }
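What stops the Jetty server is the final System.exit(job.waitForCompletion(true) ? 0 : 1):
it shuts down the entire JVM as soon as the job finishes, and the embedded Jetty server
runs inside that same JVM, which is consistent with the "Not a Problem" resolution.
Below is a minimal sketch of one way to keep the HTTP service alive after the job
completes: report the job result without calling System.exit() and then block on
server.join(). It assumes Jetty 6 (org.mortbay.jetty, matching the handle(..., int dispatch)
signature in the reporter's code) and that the TokenizerMapper and IntSumReducer classes
from the copied WordCount example live in the same package; the class name
WordCountWithJetty and the input/output paths are made up for illustration.

======================================================
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.mortbay.jetty.Handler;
import org.mortbay.jetty.Request;
import org.mortbay.jetty.Server;
import org.mortbay.jetty.handler.AbstractHandler;

public class WordCountWithJetty {

    public static void main(String[] args) throws Exception {
        // Minimal handler: answer every request with a static HTML page.
        Handler handler = new AbstractHandler() {
            public void handle(String target, HttpServletRequest request,
                    HttpServletResponse response, int dispatch)
                    throws IOException, ServletException {
                response.setContentType("text/html");
                response.setStatus(HttpServletResponse.SC_OK);
                response.getWriter().println("<h1>wordcount service</h1>");
                ((Request) request).setHandled(true);
            }
        };

        // Start the embedded Jetty server on port 8086.
        Server server = new Server(8086);
        server.setHandler(handler);
        server.start();

        // Configure and run the WordCount job; the mapper/reducer classes are
        // the ones already present in the reporter's project.
        Configuration conf = new Configuration();
        Job job = new Job(conf, "word count");
        job.setJarByClass(WordCountWithJetty.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("in"));
        FileOutputFormat.setOutputPath(job, new Path("out_" + System.currentTimeMillis()));

        boolean success = job.waitForCompletion(true);
        System.out.println("word count finished, success = " + success);

        // Do not call System.exit() here: it would shut down the whole JVM and
        // the embedded Jetty server with it. Block on the server instead so the
        // process keeps serving HTTP requests after the job is done.
        server.join();
    }
}
======================================================

With this layout the process stays up after the MapReduce job ends, and
http://localhost:8086/ keeps responding until the server is stopped.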



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
