Hi Alex,

Thanks for your reply. I changed the hostname to the actual IP of the machine, but I still don't know how to resolve this problem.
Thanks much!!
Sincerely,
Jayashree

-------------------------------------

# first sink - avro
agent1.sinks.avroSink.type = avro
agent1.sinks.avroSink.hostname = x.yy.194.180
agent1.sinks.avroSink.port = 41415
agent1.sinks.avroSink.channel = ch1

# second source - avro
agent2.sources.avroSource2.type = avro
agent2.sources.avroSource2.bind = x.yy.194.180
(not sure if I am allowed to give our IP addresses... so writing x.yy.)
agent2.sources.avroSource2.port = 41415
agent2.sources.avroSource2.channel = ch2

And I get this error:

64/bin/../lib/hadoop-1.1.1/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
13/03/29 07:56:23 ERROR avro.AvroCLIClient: Unable to open connection to Flume. Exception follows.
org.apache.flume.FlumeException: NettyAvroRpcClient { host: node1, port: 41415 }: RPC connection error
        at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:117)
        at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:93)
        at org.apache.flume.api.NettyAvroRpcClient.configure(NettyAvroRpcClient.java:507)
        at org.apache.flume.api.RpcClientFactory.getDefaultInstance(RpcClientFactory.java:169)
        at org.apache.flume.client.avro.AvroCLIClient.run(AvroCLIClient.java:180)
        at org.apache.flume.client.avro.AvroCLIClient.main(AvroCLIClient.java:71)
Caused by: java.io.IOException: Error connecting to node1/x.yy.194.180:41415
        at org.apache.avro.ipc.NettyTransceiver.getChannel(NettyTransceiver.java:261)
        at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:203)
        at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:152)
        at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:106)
        ... 5 more
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)

On Mon, Mar 25, 2013 at 3:28 AM, Alexander Alten-Lorenz <[email protected]> wrote:

> agent1.sinks.avroSink.hostname = 0.0.0.0
>
> => Avro sink avroSink: Building RpcClient with hostname: 0.0.0.0, port: 41415
>
> Hostname is the FQHN of the system, not a wildcard ip.
>
> - Alex
>
> On Mar 24, 2013, at 1:41 PM, JR <[email protected]> wrote:
>
> > Hello,
> >
> > I am trying to configure a two node work flow.
> >
> > Avro source ---> mem channel ---> Avro sink --> (next node) avro source --> mem channel ---> hdfs sink
> >
> > #agent1 on node1
> > agent1.sources = avroSource
> > agent1.channels = ch1
> > agent1.sinks = avroSink
> >
> > #agent2 on node2
> > agent2.sources = avroSource2
> > agent2.channels = ch2
> > agent2.sinks = hdfsSink
> >
> > # first source - avro
> > agent1.sources.avroSource.type = avro
> > agent1.sources.avroSource.bind = 0.0.0.0
> > agent1.sources.avroSource.port = 41414
> > agent1.sources.avroSource.channels = ch1
> >
> > # first sink - avro
> > agent1.sinks.avroSink.type = avro
> > agent1.sinks.avroSink.hostname = 0.0.0.0
> > agent1.sinks.avroSink.port = 41415
> > agent1.sinks.avroSink.channel = ch1
> >
> > # second source - avro
> > agent2.sources.avroSource2.type = avro
> > agent2.sources.avroSource2.bind = node2 ip
> > agent2.sources.avroSource2.port = 41415
> > agent2.sources.avroSource2.channel = ch2
> >
> > # second sink - hdfs
> > agent2.sinks.hdfsSink.type = hdfs
> > agent2.sinks.hdfsSink.channel = ch2
> > agent2.sinks.hdfsSink.hdfs.writeFormat = Text
> > agent2.sinks.hdfsSink.hdfs.filePrefix = testing
> > agent2.sinks.hdfsSink.hdfs.path = hdfs://node2:9000/flume/
> >
> > # channels
> > agent1.channels.ch1.type = memory
> > agent1.channels.ch1.capacity = 1000
> > agent2.channels.ch2.type = memory
> > agent2.channels.ch2.capacity = 1000
> >
> > Am getting errors with the ports.
> > Could someone please check if I have connected the sink in node1 to the source in node2 properly?
> >
> > 13/03/24 04:32:16 INFO source.AvroSource: Starting Avro source avroSource: { bindAddress: 0.0.0.0, port: 41414 }...
> > 13/03/24 04:32:16 INFO instrumentation.MonitoredCounterGroup: Monitoried counter group for type: SINK, name: avroSink, registered successfully.
> > 13/03/24 04:32:16 INFO instrumentation.MonitoredCounterGroup: Component type: SINK, name: avroSink started
> > 13/03/24 04:32:16 INFO sink.AvroSink: Avro sink avroSink: Building RpcClient with hostname: 0.0.0.0, port: 41415
> > 13/03/24 04:32:16 WARN sink.AvroSink: Unable to create avro client using hostname: 0.0.0.0, port: 41415
> > org.apache.flume.FlumeException: NettyAvroRpcClient { host: 0.0.0.0, port: 41415 }: RPC connection error
> >         at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:117)
> >         at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:93)
> >         at org.apache.flume.api.NettyAvroRpcClient.configure(NettyAvroRpcClient.java:507)
> >         at org.apache.flume.api.RpcClientFactory.getInstance(RpcClientFactory.java:88)
> >         at org.apache.flume.sink.AvroSink.createConnection(AvroSink.java:182)
> >         at org.apache.flume.sink.AvroSink.start(AvroSink.java:242)
> >         at org.apache.flume.sink.DefaultSinkProcessor.start(DefaultSinkProcessor.java:46)
> >         at org.apache.flume.SinkRunner.start(SinkRunner.java:79)
> >         at org.apache.flume.lifecycle.LifecycleSupervisor$MonitorRunnable.run(LifecycleSupervisor.java:236)
> >         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:452)
> >         at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:328)
> >         at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:161)
> >         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:109)
> >
>
> --
> Alexander Alten-Lorenz
> http://mapredit.blogspot.com
> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
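Putting Alex's advice together, a minimal sketch of the node1-to-node2 hop, assuming node2 is reachable from node1 as x.yy.194.180 (the placeholder address used above) and that agent2 is started first so its Avro source is already listening when agent1's sink connects. Note also that Flume sources take a plural `channels` property (the `agent2.sources.avroSource2.channel = ch2` line above uses the singular form meant for sinks):

```properties
# node1 (agent1): the Avro sink's hostname must be node2's real, routable
# address -- 0.0.0.0 is only meaningful as a source's bind wildcard.
agent1.sinks.avroSink.type = avro
agent1.sinks.avroSink.hostname = x.yy.194.180
agent1.sinks.avroSink.port = 41415
agent1.sinks.avroSink.channel = ch1

# node2 (agent2): the Avro source binds locally on the matching port;
# sources use "channels" (plural), sinks use "channel" (singular).
agent2.sources.avroSource2.type = avro
agent2.sources.avroSource2.bind = 0.0.0.0
agent2.sources.avroSource2.port = 41415
agent2.sources.avroSource2.channels = ch2
```

With agent2 up, node2 should show a listener on 41415 before agent1 is started; if not, the sink (or the avro-client CLI) will get exactly the `ConnectException: Connection refused` seen in the trace.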

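The `Connection refused` can also be reproduced or ruled out independently of Flume: a short, generic TCP check (the hostname and port in the comment are the thread's placeholders, not real endpoints) that attempts a plain socket connect the same way the Netty client ultimately does:

```python
import socket


def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connect to host:port succeeds,
    i.e. something is actually listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # ConnectionRefusedError, timeouts, and DNS failures all land here
        return False


# Run from node1 before starting agent1's sink, e.g.:
# port_open("x.yy.194.180", 41415)
```

If this returns False from node1, the problem is the listener or the network path (agent2 not running, bound to the wrong interface, or a firewall), not the Flume configuration syntax.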