Re: HDFS Clustering

2015-02-24 Thread Giacomo Licari
Thanks a lot Marton and Max, it worked perfectly. Regards from Italy :)

On Tue, Feb 24, 2015 at 11:31 AM, Max Michels wrote:
> Hi Giacomo,
>
> Congratulations on setting up a Flink cluster with HDFS :) To run the
> WordCount example provided with Flink, you should first upload your
> input file

Re: HDFS Clustering

2015-02-24 Thread Max Michels
Hi Giacomo,

Congratulations on setting up a Flink cluster with HDFS :) To run the WordCount example provided with Flink, you should first upload your input file to HDFS. If you have not done so, please run

    hdfs dfs -put -p file:///home/user/yourinputfile hdfs:///wc_input

Then, you can use the
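Max's upload step can be sketched as a standalone sequence. This assumes a running HDFS cluster; the local path and the `wc_input` target are the placeholder names from his example, not values verified in this thread:

```shell
# Copy the local input file into HDFS; -p preserves timestamps,
# ownership, and mode of the source file.
hdfs dfs -put -p file:///home/user/yourinputfile hdfs:///wc_input

# Confirm the file landed where expected before submitting the job.
hdfs dfs -ls hdfs:///wc_input
```

The triple-slash form `hdfs:///wc_input` resolves against the default filesystem configured in `core-site.xml` (`fs.defaultFS`), so no explicit NameNode host/port is needed here.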

Re: HDFS Clustering

2015-02-24 Thread Márton Balassi
Hey,

Just add the right prefix pointing to your HDFS filepath:

    bin/flink run -v flink-java-examples-*-WordCount.jar hdfs://hostname:port/PATH/TO/INPUT hdfs://hostname:port/PATH/TO/OUTPUT

Best,
Marton

On Tue, Feb 24, 2015 at 11:13 AM, Giacomo Licari wrote:
> Hi guys,
> I'm Giacomo from Italy
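Filling in Marton's command template with a concrete (hypothetical) NameNode address, a full run might look like the sketch below; `namenode:9000` and the path names are placeholder assumptions, not values from the thread:

```shell
# Submit the bundled WordCount example, reading from and writing to HDFS.
# "namenode:9000" is a placeholder for your actual NameNode host and RPC port.
bin/flink run -v flink-java-examples-*-WordCount.jar \
    hdfs://namenode:9000/wc_input \
    hdfs://namenode:9000/wc_output

# Once the job finishes, inspect the result files under the output path.
hdfs dfs -cat hdfs://namenode:9000/wc_output/*
```

Note the output path must not already exist, or the job will fail to write its result; remove a stale run with `hdfs dfs -rm -r` first if needed.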

HDFS Clustering

2015-02-24 Thread Giacomo Licari
Hi guys,

I'm Giacomo from Italy, and I'm a newbie with Flink. I set up a cluster with Hadoop 1.2 and Flink. I would like to ask you how to run the WordCount example, taking the input file from HDFS (example: myuser/testWordCount/hamlet.txt) and putting the output also inside HDFS (example: myuser/testWordCount