You need hadoop-common-x.x.x.jar and hadoop-hdfs-x.x.x.jar on your flume-ng 
classpath, and the Hadoop jar versions must match your Hadoop cluster.

If sinking to hadoop-2.0.0, you should use "protobuf-java-2.4.1.jar" (by 
default, flume-1.5.0 ships "protobuf-java-2.5.0.jar" in its lib directory), 
because the protobuf interface of hdfs-2.0 was compiled with protobuf-2.4; 
with protobuf-2.5 on the classpath, flume-ng will fail to start.
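The jar shuffle above can be sketched as a short script. This is a self-contained demonstration using a mock directory layout under mktemp; the real paths and exact jar file names depend on your own Hadoop and Flume installs, so treat every path here as an assumption to adjust:

```shell
set -e

# Mock install layout standing in for real hadoop-2.0.0 / flume-1.5.0 trees.
WORK=$(mktemp -d)
FLUME_LIB="$WORK/flume/lib"
HADOOP_LIBS="$WORK/hadoop/share/hadoop"
mkdir -p "$FLUME_LIB" "$HADOOP_LIBS/common/lib" "$HADOOP_LIBS/hdfs"

# Placeholder jars (assumed names; substitute your real versions).
touch "$HADOOP_LIBS/common/hadoop-common-2.0.0.jar"
touch "$HADOOP_LIBS/hdfs/hadoop-hdfs-2.0.0.jar"
touch "$HADOOP_LIBS/common/lib/protobuf-java-2.4.1.jar"
touch "$FLUME_LIB/protobuf-java-2.5.0.jar"   # shipped with flume-1.5.0

# 1. Put the matching Hadoop client jars on the Flume classpath.
cp "$HADOOP_LIBS/common/hadoop-common-2.0.0.jar" "$FLUME_LIB/"
cp "$HADOOP_LIBS/hdfs/hadoop-hdfs-2.0.0.jar"     "$FLUME_LIB/"

# 2. Swap protobuf 2.5 for 2.4.1, since hdfs-2.0's protobuf stubs were
#    built against protobuf-2.4 and flume-ng fails to start with 2.5.
rm "$FLUME_LIB/protobuf-java-2.5.0.jar"
cp "$HADOOP_LIBS/common/lib/protobuf-java-2.4.1.jar" "$FLUME_LIB/"

ls "$FLUME_LIB"
```

On a real system you would point FLUME_LIB and HADOOP_LIBS at the actual installation directories instead of the mock tree.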




2014-09-30



shengyi.pan



From: Ed Judge <ejud...@gmail.com>
Sent: 2014-09-29 22:38
Subject: HDFS sink to a remote HDFS node
To: "user@flume.apache.org" <user@flume.apache.org>
Cc:

I am trying to run the flume-ng agent on one node with an HDFS sink pointing to 
an HDFS filesystem on another node.
Is this possible?  What packages/jar files are needed on the flume agent node 
for this to work?  A secondary goal is to install only what is needed on the 
flume-ng node.


# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://<remote IP address>/tmp/
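For context, the two lines above sit inside a fuller single-agent configuration. A minimal sketch follows; the component names (r1, c1), the netcat test source, and the :8020 NameNode RPC port are assumptions, so substitute your own values:

```properties
# Name the components on this agent (names are illustrative)
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# A simple source for testing (assumption: netcat on localhost)
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# HDFS sink pointing at the remote NameNode (8020 is the usual RPC port)
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://<remote IP address>:8020/tmp/
a1.sinks.k1.hdfs.fileType = DataStream

# Buffer events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```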




Thanks,
Ed
