https://bugs.kde.org/show_bug.cgi?id=479204

--- Comment #2 from pallaswept <pallasw...@proton.me> ---
(In reply to Martin Koller from comment #1)
> Yes, using just one thread for these huge files is the bottleneck.
> In my local test, backing up a 65GB file took more than 5 hours. As a
> cross-check, the "xz" tool alone would also take that long (when not
> explicitly told to use multiple threads).
> 
> The current implementation uses the KDE class KCompressionDevice, which
> does not seem to leverage multiple threads,
> so this needs to be implemented either in kbackup or in some way in the KDE
> classes used.
> 
> I suggest you don't use compression when dealing with huge files.

Hi Martin,

Thanks so much for looking into this for me! I hope you are having a very Happy
New Year celebration at the moment :)

Thank you also for putting in such an effort; spending so much time compressing
that file is very kind of you.

Yes, for the time being I think I will disable compression for this profile
with extremely large files, and afterwards manually compress the resulting file
using some other tool that takes a multi-threaded approach.
Thanks for that advice!
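For anyone following along, a minimal sketch of that workaround (the archive path is hypothetical, and this assumes xz is installed; `-T0` tells xz to use one thread per available CPU core):

```shell
# Back up without compression in KBackup, then compress the resulting
# archive manually with multi-threaded xz.
# -T0: use all available cores; -k: keep the original uncompressed file.
xz -T0 -k /path/to/backup.tar
```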

In the longer term, should I be logging a different case against the
KCompressionDevice, or should I leave this case open to track it, or perhaps
something else? Let me know what you would like me to do going forward.

-- 
You are receiving this mail because:
You are watching all bug changes.
