Subject: Re: [2.3.8] possible replication issue
Hello Timo,
upgrading both replicators did the job! Both replicators now run v2.3.9
and replication works fine; all sync jobs that queued up during the
upgrade have been processed successfully.
Thanks for the reassurance and all your great work with dovecot,
Andreas
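A minimal sketch of how the same outcome can be double-checked on both ends, using standard dovecot/doveadm commands (host-specific details omitted):

    dovecot --version              # should report 2.3.9 on each replicator
    doveadm replicator status      # queue counters; queued sync requests should drain
    doveadm replicator status '*'  # optional: per-user sync state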
On 05.12.19 at 13:1
I think there's a good chance that upgrading both will fix it. The bug already
existed in older versions; it just wasn't normally triggered. Since v2.3.8 this
situation is triggered on one dsync side, so the v2.3.9 fix needs to be on the
other side.
> On 5. Dec 2019, at 8.34, Piper Andreas via dovecot wrote:
Hello,
upgrading to 2.3.9 unfortunately does *not* solve this issue:
I upgraded one of my replicators from 2.3.7.2 to 2.3.9, and after a few
seconds replication stopped. The other replicator remained on 2.3.7.2.
After downgrading to 2.3.7.2, replication is working fine again.
I did not try to upgrade both replicators to 2.3.9.
Hi,
For a few days I have been dealing with replication issues between two
containerized dovecot instances in a Kubernetes environment.
After some searching, I found this thread, which reports exactly the same
symptoms I encountered: https://dovecot.org/pipermail/dovecot/2019-October/117353.html
I can confirm t
I have the same problem here.
All systems are running Debian 9 amd64.
My dovecot director servers are running 2.3.8, but the mailbox servers have
sync/replication problems with 2.3.8. So I downgraded the mailbox servers
to 2.3.7, and now everything works fine again...
On 18 October 201
Hi,
some of our customers have discovered a replication issue after
upgrading from 2.3.7.2 to 2.3.8.
With 2.3.8, several replication connections hang until the configured
timeout, so after a few seconds there are $replication_max_conns hanging
connections.
Other replications are running fast and
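For reference, a minimal sketch of where the settings involved here live in a stock two-way dsync replication setup (the replica address is illustrative and the values are examples, not recommendations):

    # conf.d excerpt (illustrative)
    mail_plugins = $mail_plugins notify replication
    replication_max_conns = 10          # cap on parallel dsync connections
    service replicator {
      process_min_avail = 1
    }
    plugin {
      mail_replica = tcp:replica.example.com   # the other replication side
    }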