[ceph-users] Re: Per-Client Quality of Service settings

2025-01-10 Thread Olaf Seibert
…separate bucket, with separate reservation/limitation/weight values. But as far as I understand the text, it doesn't work like that. I would love to be proven wrong here :-) -- Olaf Seibert Site Reliability Engineer SysEleven GmbH Boxhagener Straße 80 10245 Berlin T +49 30 233 2012 0 F +49…
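For context, current Ceph releases expose the dmclock-style reservation/weight/limit knobs only per client *class* (client ops, background recovery, best-effort), not per individual client, which matches the limitation discussed above. A minimal sketch, assuming a Quincy-or-later cluster running the mclock scheduler; the numeric values are illustrative only:

```shell
# Switch OSDs to the "custom" mclock profile so the individual
# reservation/weight/limit options become tunable.
ceph config set osd osd_mclock_profile custom

# These options apply to the whole "client" class, not to a single
# client -- the point raised in this thread.
ceph config set osd osd_mclock_scheduler_client_res 0.2
ceph config set osd osd_mclock_scheduler_client_wgt 2
ceph config set osd osd_mclock_scheduler_client_lim 0.8
```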

[ceph-users] Per-Client Quality of Service settings

2025-01-10 Thread Olaf Seibert
…but which is about balancing individual clients across all services?

[ceph-users] Re: RGW sync gets stuck every day

2024-09-11 Thread Olaf Seibert
…latency=1309.589843750s
Sep 09 10:24:43 ham1-000215 radosgw[2104175]: RGW-SYNC:meta: ERROR: failed to fetch all metadata keys
Sep 09 10:24:43 ham1-000215 radosgw[2104175]: rgw rados thread: ERROR: failed to run sync
Sep 09 10:24:43 ham1-000215 radosgw[2104175]: final shutdown
On 08.08.24 16:2…

[ceph-users] Re: RGW sync gets stuck every day

2024-08-08 Thread Olaf Seibert
…backup) syncing
  full sync: 0/128 shards
  incremental sync: 128/128 shards
  3 shards are recovering
  recovering shards: [30,36,39]
After that it took another hour or so until the rec…
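The shard counts above look like `radosgw-admin` sync status output. A sketch of the commands for inspecting that state; the zone name "backup" is taken from the thread, so substitute your own source zone:

```shell
# Overall multisite sync state as seen from this zone
radosgw-admin sync status

# Per-shard detail for data sync from the other zone
radosgw-admin data sync status --source-zone=backup

# Recent per-shard sync errors, useful when shards stay "recovering"
radosgw-admin sync error list
```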

[ceph-users] RGW sync gets stuck every day

2024-08-06 Thread Olaf Seibert
…s about how we can find out what is causing this? It may be that some customer has some job running every 24 hours, but that shouldn't cause the replication to get stuck. Thanks in advance, -- Olaf Seibert
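To correlate the stall with a suspected 24-hour job, one option is simply to record sync state on a timer and look at when the shards stop progressing. A minimal sketch, assuming `radosgw-admin` is available on the host and the log path is arbitrary:

```shell
# Append a timestamped sync status snapshot once a minute; later,
# grep the log to see at what time of day sync stops advancing.
while true; do
    date -u +'%Y-%m-%dT%H:%M:%SZ'
    radosgw-admin sync status
    sleep 60
done >> /var/log/rgw-sync-status.log 2>&1
```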
