Hi Rohan,

Thanks for the reply. However, I don't think the Failover Sink Processor solves this issue. My problem is how to manage failover for the Source: in this case, if the server hosting the Flume Source crashes, the router does not know where to send its syslogs. I was thinking something like Red Hat Clustering could be used. What do you think?
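One common way to implement the clustering/virtual-IP idea is VRRP via keepalived: the router sends syslog to a floating IP, which moves to a standby Flume host if the primary fails. A minimal sketch, with hypothetical IPs, interface names, and priorities (not taken from the thread):

```
# /etc/keepalived/keepalived.conf on the primary Flume host
# (the standby host uses state BACKUP and a lower priority, e.g. 90)
vrrp_instance flume_vip {
    state MASTER
    interface eth0
    virtual_router_id 51
    priority 100
    advert_int 1
    virtual_ipaddress {
        10.0.0.100    # router sends syslog UDP to this floating IP
    }
}
```

Both hosts would run identical Flume agents bound to 0.0.0.0, so whichever one holds the virtual IP receives the traffic.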
Regards,
Abhijeet

On Wed, Aug 28, 2013 at 12:06 AM, Roshan Naik <[email protected]> wrote:

> Take a look at the 'Flume Sink Processor' section in the User Guide. The
> subsection 'Failover Sink Processor' would interest you.
> -roshan
>
> On Tue, Aug 27, 2013 at 2:59 AM, Abhijeet Shipure
> <[email protected]> wrote:
>
>> Hi,
>>
>> I am working on a Hadoop-based log archival solution where logs are
>> generated by telecom routers.
>> I have very limited knowledge of the routers, but I know that they will
>> send syslog messages to a given IP address and port.
>> If we want to use Flume to collect these syslogs, we will run a Flume
>> agent with Syslog UDP as the source on the same server the router sends
>> logs to.
>> Assuming the router can send log messages to only a single IP, how can
>> we achieve failover for the Flume agent in such cases?
>>
>> This is something urgent and any help would be greatly appreciated.
>>
>> Regards,
>> Abhijeet
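For reference, the Syslog UDP setup described in the original question would look roughly like this in a Flume agent properties file (agent, channel, and sink names are illustrative, and the HDFS path is a placeholder):

```
# flume-agent.properties (illustrative sketch, not from the thread)
agent.sources = syslog-in
agent.channels = mem-ch
agent.sinks = hdfs-out

# Syslog UDP source; bind to 0.0.0.0 so the same config works on
# whichever host currently owns the IP the router targets
agent.sources.syslog-in.type = syslogudp
agent.sources.syslog-in.host = 0.0.0.0
agent.sources.syslog-in.port = 5140
agent.sources.syslog-in.channels = mem-ch

agent.channels.mem-ch.type = memory
agent.channels.mem-ch.capacity = 10000

agent.sinks.hdfs-out.type = hdfs
agent.sinks.hdfs-out.hdfs.path = hdfs://namenode:8020/logs/%Y-%m-%d
agent.sinks.hdfs-out.channel = mem-ch
```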
