Thanks Venkatesh. I already have this config in place. Flume is writing data
to HDFS up to some point.
After some random point it stops writing; I don't see any *.tmp file
created in HDFS, but the Flume agent is still running.
I am not sure why it stops writing data, but the log file still produces data
co…
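For anyone checking the same symptom, a quick way to see whether the HDFS sink still has a file open is to list the sink's output directory and to look at the agent's own log around the time writes stop. The paths below are only placeholders for your actual hdfs.path and Flume log location:

# An open file that is still being written normally shows a .tmp suffix
hadoop fs -ls /flume/events

# Look for recent HDFS sink activity or errors in the agent log
grep -i hdfs flume.log | tail -n 20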
Hey Sarvana, it sounds like you are at a very early stage of setup.
Just try the following configuration:
channel_capacity=1000
sink_rollInterval=300
sink_rollCount=0
sink_rollSize=0
sink_batchSize=100
channel_transactionCapacity=1000
And the file which you are tailing should keep getting new data c…
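For reference, a minimal sketch of how those settings map onto an actual flume-conf.properties, assuming an exec source feeding an HDFS sink through a memory channel. The agent name a1 matches the -n a1 flag used elsewhere in this thread; the component names (r1, c1, k1), the tailed file, and the HDFS path are placeholders:

a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Exec source tailing the application log (placeholder path)
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/app/app.log
a1.sources.r1.channels = c1

# Memory channel sized as suggested above
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 1000

# HDFS sink: roll by time only (every 300 s), never by size or event count
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = /flume/events
a1.sinks.k1.hdfs.rollInterval = 300
a1.sinks.k1.hdfs.rollCount = 0
a1.sinks.k1.hdfs.rollSize = 0
a1.sinks.k1.hdfs.batchSize = 100

Setting rollCount and rollSize to 0 disables rolling by event count and by file size, so files are closed purely on the 300-second interval.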
No, I am using the Flume-provided exec source, just to tail -F the file.
I start Flume as below:
/bin/flume-ng agent -c /d0/flume/conf -f /d0/flume/conf/flume-conf.properties -n a1 -Dflume.root.logger=DEBUG,LOGFILE &
I get logs in the flume.log file, and I can see the Flume agent running with ps.
I use a file/memory cha…
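One thing worth checking with the exec source: if the tail -F process exits (for example after log rotation or an I/O error), the source stops delivering events even though the agent JVM keeps running, which matches the symptom described here. A hedged sketch of the source properties that help with this, using placeholder component names and file path; restart, restartThrottle and logStdErr are documented in the Flume user guide, so confirm your 1.3 build supports them:

a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/app/app.log
a1.sources.r1.channels = c1
# Restart the tail command if it ever exits, with a 10 s throttle,
# and copy its stderr into the agent log so failures are visible
a1.sources.r1.restart = true
a1.sources.r1.restartThrottle = 10000
a1.sources.r1.logStdErr = true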
Hi Sarvana,
My Flume agent is definitely running; when I look at the running processes with ps -ef,
the Flume agent process shows up. But my custom source is not running.
I have implemented logging: if the Flume custom source is running, it will
write to the log files. I have written many log statements I…
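To do the same check yourself: ps only proves the agent JVM is alive, while the agent's own log shows whether a particular source started and is still producing events. A rough sketch, assuming the log file is flume.log and MyCustomSource is a placeholder for your source class name:

# The JVM being up only proves the agent process exists
ps -ef | grep [f]lume

# Look for the source's startup and per-event log statements
grep -i MyCustomSource flume.log | tail -n 20

# Watch live to see whether the source keeps logging over time
tail -F flume.log | grep -i MyCustomSource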
Hi,
My issue looks a little similar to yours. My exec source stops collecting data
after some time.
But I would like to know how you can say Flume is running while your custom
source is not running. Let me know how you identify that,
because in the running processes ("ps") I can only see a single process, i.e. flume.
Hello,
I'm running Flume on CentOS 6.3.
Apache Flume version 1.3.
I have written many Flume custom sources.
All sources are working fine except two.
When I start Flume with the flume-ng .. command from the shell, it works
fine; Flume collects data continuously for 6 hours I…