You need to give:
1) Map/Reduce Master Host: the host where start-mapred.sh is running.
2) Map/Reduce Master Port: 19001 (see the hadoop-site.xml file)
3) DFS Master: the host where your start-dfs.sh is running.
4) DFS Master Port: 19000
These parameters should be sufficient to access HDFS. You may also need to
set some advanced parameters to grant permissions to the Windows user on
the hosts where Hadoop is running.
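For reference, a minimal hadoop-site.xml sketch matching the ports above. The property names (fs.default.name, mapred.job.tracker) are those used by Hadoop releases of this era; "master-host" is a placeholder for your actual master hostname, and the plugin's host/port fields should match these values:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- DFS master: the host running start-dfs.sh, port 19000 -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://master-host:19000</value>
  </property>
  <!-- Map/Reduce master: the host running start-mapred.sh, port 19001 -->
  <property>
    <name>mapred.job.tracker</name>
    <value>master-host:19001</value>
  </property>
</configuration>
```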
Thanks and regards.
-Rajeev Gupta
From: Praveen Yarlagadda <praveen.yarlagad [email protected]>
To: [email protected]
Date: 06/18/2009 08:39 AM
Subject: Hadoop Eclipse Plugin
Reply-To: core-u...@hadoop.apache.org
Hi,
I have a problem configuring the Hadoop Map/Reduce plugin with Eclipse.

Setup details:
I have a namenode, a jobtracker, and two datanodes, all running on Ubuntu.
The setup works fine with the example programs. I want to connect to it
from Eclipse.

namenode - 10.20.104.62 - port 54310
jobtracker - 10.20.104.53 - port 54311
I run Eclipse on a different Windows machine, and I want to configure the
Map/Reduce plugin with Eclipse so that I can access HDFS from Windows.

Map/Reduce master
Host - with the jobtracker IP, it did not work
Port - with the jobtracker port, it did not work

DFS master
Host - with the namenode IP, it did not work
Port - with the namenode port, it did not work
I also tried the other combination, giving the namenode details for the
Map/Reduce master and the jobtracker details for the DFS master. That did
not work either.

If anyone has configured the plugin with Eclipse, please let me know. Even
pointers on how to configure it would be highly appreciated.
Thanks,
Praveen