Hi,

I am running into a problem where messages are getting stuck on my topic,
leading to KahaDB log files not being deleted and the disk running out of space.

The setup seems to work fine for a week or so. When my durable subscribers go
offline, their messages are removed from the db-*.log files after the
configured 5-minute TTL. After a week or so I start getting errors similar to
the one below:
2015-10-26 09:12:59,090 | TRACE | not removing data file: 573 as contained ack(s) refer to referenced file: [572, 573] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker

Eventually ActiveMQ runs out of disk space, since these journal files are never
removed. To get ActiveMQ started again I have to delete all the KahaDB data
files (e.g. db-*.log, db.data, etc.) and restart the broker.


My setup is as follows:
(1) activemq.xml (v5.12.0) with KahaDB:

<persistenceAdapter> 
            <kahaDB directory="${activemq.base}/data/kahadb" 
                    ignoreMissingJournalfiles="true" 
                    checkForCorruptJournalFiles="true" 
                    checksumJournalFiles="true" />
</persistenceAdapter>
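(A sketch of the same adapter with the standard journal-GC attributes
journalMaxFileLength and cleanupInterval added; the values below are
illustrative only, not my current settings. Smaller journal files would at
least mean a single stuck ack pins less disk, and cleanupInterval is the
period in ms at which the checkpoint worker attempts journal cleanup:)

```xml
<!-- Illustrative variant, not my running config: journalMaxFileLength and
     cleanupInterval are standard kahaDB attributes; values are examples. -->
<persistenceAdapter>
            <kahaDB directory="${activemq.base}/data/kahadb"
                    journalMaxFileLength="16mb"
                    cleanupInterval="30000"
                    ignoreMissingJournalfiles="true"
                    checkForCorruptJournalFiles="true"
                    checksumJournalFiles="true" />
</persistenceAdapter>
```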
                
...
<destinationPolicy>
        <policyMap>
          <policyEntries>
                <policyEntry topic="data" expireMessagesPeriod="30000">
                         <deadLetterStrategy>
                           <sharedDeadLetterStrategy processExpired="false" />
                         </deadLetterStrategy>
                </policyEntry>
          </policyEntries>
        </policyMap>
</destinationPolicy>
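(If the cause turns out to be an abandoned durable subscription whose pending
acks pin the journal files, these broker-level attributes exist to expire
durable subscribers that stay offline; this is a sketch with illustrative
values, not something I currently have configured:)

```xml
<!-- Sketch, illustrative values: remove durable subscriptions offline for
     more than 24h, checking once per hour. Standard <broker> attributes. -->
<broker xmlns="http://activemq.apache.org/schema/core"
        offlineDurableSubscriberTimeout="86400000"
        offlineDurableSubscriberTaskSchedule="3600000">
    ...
</broker>
```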

(2) Durable subscriber configuration in the Spring web application
...
    <bean id="amqConnectionFactory"
          class="org.apache.activemq.ActiveMQConnectionFactory">
        <property name="transportListener" ref="myActiveMQExceptionListener"/>
        <property name="brokerURL"
                  value="failover:(tcp://localhost:61616)?maxReconnectDelay=1000&amp;startupMaxReconnectAttempts=1&amp;jms.prefetchPolicy.all=100&amp;jms.useAsyncSend=true" />
        <property name="clientID" value="myClient" />
    </bean>
...

(3) camel.xml - Camel routes to connect and consume messages from a remote
topic

...
  <camelContext xmlns="http://camel.apache.org/schema/spring" id="camel">
  
        <endpoint id="endpoint" uri="dev:topic:local?timeToLive=300000"/>

        <route>
            <from uri="remote:topic:remote_data"/>
            <to ref="endpoint"/>
        </route>
        
  </camelContext>
  

  <bean id="remote"
        class="org.apache.activemq.camel.component.ActiveMQComponent">
    <property name="brokerURL" value="failover:tcp://XX.XX.XX.XX:61616"/>
  </bean>
  
  
Has anyone experienced a similar problem? Any help would really be
appreciated.

Thanks in advance,
Steve




--
Sent from the ActiveMQ - User mailing list archive at Nabble.com.
