Are the permissions on the files the same? Does the user running the Flume
agent have read permission?
Are the files still being written to or locked open by another process?
Is the Flume agent generating any logs?
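A quick way to check all three from the shell (the spool path below is taken from the config quoted later in the thread, and the use of lsof is an assumption; adjust both for your deployment):

```shell
# Quick diagnostics for a stalled spooling-directory source.
# SPOOL_DIR is an assumption based on the config quoted below;
# substitute your own path.
SPOOL_DIR=${SPOOL_DIR:-/opt/flume/spoolDir}

# 1. Ownership and permission bits on the spooled files
ls -l "$SPOOL_DIR" 2>/dev/null || echo "spool dir not found: $SPOOL_DIR"

# 2. Processes still holding files in the directory open
#    (requires lsof; no output means nothing has them open)
command -v lsof >/dev/null 2>&1 && lsof +D "$SPOOL_DIR" 2>/dev/null

# 3. Open-file limit for this shell (run it as the user that starts
#    the agent, to see the limit the agent inherits)
echo "open-file limit: $(ulimit -n)"
```

Note that if the agent is started by an init script or systemd rather than an interactive shell, its effective limit may differ from what you see here.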

-- 
Chris Horrocks 

On 20 April 2016 at 08:00:14, Saurabh Sharma (saurabh.sha...@nviz.com) wrote:

> Hi Ron,
> 
> The maximum number of open files in our OS is 1024.
> 
> Thanks
> 
> From: Ronald Van De Kuil [mailto:ronald.van.de.k...@gmail.com]
> Sent: 20 April 2016 12:24
> To: user@flume.apache.org
> Subject: Re: Flume not marking log files as completed and do not process file further
> 
> Not sure whether this helps, but have you checked your operating system's 
> limit on the maximum number of open files? 
> 
> Kind regards, 
> Ronald van de Kuil 
> 
> On 20 Apr 2016 at 08:48, Saurabh Sharma <saurabh.sha...@nviz.com> wrote:
> 
> > Hi,
> > 
> > We are ingesting log files (around 80 MB each) into Flume. Flume 
> > processes the first few files and marks them as completed, but then it 
> > stops processing any further files and no longer marks the current log 
> > file as completed.
> > 
> > We are using the spooling directory source.
> > 
> > When this happens, the Flume log continuously shows the following line:
> > 
> > DEBUG [conf-file-poller-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run:126) - Checking file:../conf/commerce-sense.conf for changes
> > 
> > We have the following configuration:
> > 
> > agent.sources = spoolDir
> > agent.channels = memoryChannel
> > agent.sinks = sink
> > agent.sources.spoolDir.interceptors = i1
> > 
> > #Channel Configuration
> > agent.channels.memoryChannel.type = memory
> > 
> > #Source configuration
> > agent.sources.spoolDir.type = spooldir
> > agent.sources.spoolDir.spoolDir = /opt/flume/spoolDir
> > agent.sources.spoolDir.fileHeader = true
> > agent.sources.spoolDir.basenameHeader = true
> > agent.sources.spoolDir.deserializer = LINE
> > agent.sources.spoolDir.inputCharset = ISO8859-1
> > agent.sources.spoolDir.deserializer.maxLineLength = 10000
> > agent.sources.spoolDir.interceptors.i1.type = org.apache.flume.sink.solr.morphline.UUIDInterceptor$Builder
> > agent.sources.spoolDir.interceptors.i1.preserveExisting = true
> > agent.sources.spoolDir.interceptors.i1.prefix = test
> > agent.sources.spoolDir.channels = memoryChannel
> > 
> > #Sink Configuration
> > agent.sinks.sink.type = com. flume.sink.ExtendedKafkaSink
> > agent.sinks.sink.topic = cdnLogsTopic
> > agent.sinks.sink.brokerList = localhost:9092
> > agent.sinks.sink.batchSize = 100
> > agent.sinks.sink.sink.serializer = com. flume.serializer.ExtendedSerializer$Builder
> > agent.sinks.sink.channel = memoryChannel
> > 
> > Thanks,
> > Saurabh