On Thu, 1 Sep 2016 09:04:53 +0300, gevisz <gev...@gmail.com> wrote:

> I have bought an external 5TB Western Digital hard drive
> that I am going to use mainly for backing up some files
> in my home directory and for carrying very big files, for
> example a virtual machine image file, from one computer
> to another. This hard drive is preformatted with NTFS.
> Now, I am going to format it with ext4, which will
> probably take a lot of time, given that it is going to
> be done via a USB connection. So, before formatting
> this hard drive, I would like to know if it is still
> advisable to partition big hard drives into smaller
> logical ones.
> 
> For about the last 20 years, following the advice of an
> older colleague, I have always partitioned my hard drives
> into smaller logical ones, and I know very well all the
> disadvantages of doing so. :)

This has been bad advice for at least the last 15 years, ever since
the last DOS-based machines died.

The reasoning behind this:

Hard drives perform really badly beyond the first third of their
storage space: the outer tracks hold more sectors per revolution, so
the beginning of the disk is the fastest part (do a benchmark,
transfer speed will almost halve towards the end).
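
Just to illustrate, here is a rough benchmark sketch in Python (run as
root; /dev/sdb is a made-up device node, point it at the drive you
actually want to test). It times raw sequential reads at the start of
the disk and two thirds in:

    #!/usr/bin/env python3
    # Rough sequential-read benchmark at two offsets of a raw disk.
    # /dev/sdb is a placeholder - adjust it, and run as root.
    # Each region is read only once, so the page cache does not skew
    # the numbers; drop caches first if you re-run it.
    import os, time

    DEV = "/dev/sdb"          # placeholder device node
    CHUNK = 1024 * 1024       # 1 MiB per read
    TOTAL = 256 * CHUNK       # read 256 MiB per test region

    def read_at(offset_bytes):
        fd = os.open(DEV, os.O_RDONLY)
        try:
            os.lseek(fd, offset_bytes, os.SEEK_SET)
            start = time.monotonic()
            remaining = TOTAL
            while remaining > 0:
                data = os.read(fd, min(CHUNK, remaining))
                if not data:          # hit end of device
                    break
                remaining -= len(data)
            elapsed = time.monotonic() - start
            return (TOTAL - remaining) / elapsed / 1e6   # MB/s
        finally:
            os.close(fd)

    fd = os.open(DEV, os.O_RDONLY)
    size = os.lseek(fd, 0, os.SEEK_END)   # device size in bytes
    os.close(fd)

    print("start of disk : %6.1f MB/s" % read_at(0))
    print("two thirds in : %6.1f MB/s" % read_at(size * 2 // 3))

If I remember correctly, "hdparm -t --offset <GiB> /dev/sdb" gives you
roughly the same numbers as a one-liner.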

Next, how do you decide up front how big a partition should be? Your
OS partition will become too small one day or another, so you are
going to put big files (swap files, program files) onto the other
partition. See the previous point: that is the slow one.

By this process, you artificially put a big gap between OS-related
files - which clearly defeats your original intention of keeping OS
files close together.

Most current OSes are good at keeping related files close together
(except maybe Windows after a few Windows Update runs, but there's
software like MyDefrag to fix this and restore the original
performance), or there's technology to mitigate the issue (like bcache
in Linux). I think even Windows optimizes swap allocation to be near
the head's current position, so swap fragmentation isn't really an
issue.
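
Since I mentioned bcache: a minimal sketch of what that looks like,
purely illustrative - the device names and mount point below are made
up, and make-bcache wipes whatever is on those partitions. The idea is
to pair the slow HDD with a small SSD so frequently used blocks get
served from the SSD:

    #!/usr/bin/env python3
    # Illustrative bcache setup: slow HDD partition + SSD cache partition.
    # Device names are placeholders and the data on them is DESTROYED.
    import subprocess

    BACKING = "/dev/sdb1"   # big, slow HDD partition (placeholder)
    CACHE   = "/dev/sdc1"   # small, fast SSD partition (placeholder)

    # Create backing and cache device in one go; make-bcache attaches
    # them to each other and udev brings up /dev/bcache0.
    subprocess.run(["make-bcache", "-B", BACKING, "-C", CACHE], check=True)

    # Put a filesystem on the combined device and mount it as usual
    # (assuming /mnt/data already exists).
    subprocess.run(["mkfs.ext4", "/dev/bcache0"], check=True)
    subprocess.run(["mount", "/dev/bcache0", "/mnt/data"], check=True)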

The advice which I was always given, and have refused for more than 15
years, is:

        "But if you reinstall, you then don't have to restore all your
        data, and settings, and you can even install your programs to
        the other partition to not loose data and programs..."

Sorry, bu****it. If you expect this to work (at least on Windows, but
that's where the example is from), you will be really disappointed
when you rely on it in case of a disaster: Windows simply stored all
your settings and hidden program data files on its C drive - which is
gone. The installed programs are not there, or do not work, because
after you reinstall, Windows simply has no knowledge of them on the
other partition; and even when you manage to start or reinstall them,
their state is unknown or reset because "ProgramData" is missing. So
this setup is a complete waste of performance and time, and there's no
easy way to fix it. Tricks like symlinking C:\Users to another drive
or using a submount are unsupported, and updates will eventually fail
on such a setup.

I selected Windows here as the example because I expect the advice you
mentioned comes from Windows installations.

Linux, by design, works a lot better here. But my advice is still:
never ever partition for this reason - even less so if performance is
your concern.

> But what are disadvantages of not partitioning a big
> hard drive into smaller logical ones?

Usually none, at least for ordinary usage. Performance-wise, it is
always a better choice to use multiple physical disks if you need
different partitions. A valid reason for separate partitions (read:
physical drives) is special-purpose software, like a doctor's office
application, which puts all its data and shares below a single
directory structure.

> Is it still advisable to partition a big hard drive
> into smaller logical ones and why?

No. If you want it for logical management, there are much better ways of
achieving this (like fs-integrated pooling, LVM, separate physical
drives selected for their special purpose).
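
For example, if you want to carve up that 5TB drive but keep the
option of resizing later, LVM could look like this. A sketch only:
/dev/sdb, the volume group name "ext5tb" and the LV names are all
placeholders, and pvcreate destroys whatever is on the disk:

    #!/usr/bin/env python3
    # Sketch: flexible logical volumes instead of fixed partitions.
    import subprocess

    def run(*cmd):
        subprocess.run(cmd, check=True)

    DISK = "/dev/sdb"                  # placeholder device node

    run("pvcreate", DISK)              # whole disk as a physical volume
    run("vgcreate", "ext5tb", DISK)    # one volume group (name made up)

    # Logical volumes can start small and grow later without
    # repartitioning or moving data around.
    run("lvcreate", "-L", "2T", "-n", "backup", "ext5tb")
    run("lvcreate", "-L", "1T", "-n", "images", "ext5tb")
    run("mkfs.ext4", "/dev/ext5tb/backup")
    run("mkfs.ext4", "/dev/ext5tb/images")

    # Later, when "images" gets tight:
    #   lvextend -L +500G /dev/ext5tb/images
    #   resize2fs /dev/ext5tb/images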

Regarding performance:

I wish Linux had options to relocate files (not just defragment them)
back into logical groups for nearby access. Fragmentation is less of a
problem; the bigger problem is that data blocks drift apart over time
due to updates. On Windows, there's the wonderful tool MyDefrag, which
works magic and puts your aging Windows installation back into the
state of an almost fresh installation by relocating files to sane
positions.
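
For measuring the problem, at least, here is a rough sketch using
filefrag from e2fsprogs: it walks a directory tree and lists the most
fragmented files. It does not relocate anything, which is exactly the
part I am missing:

    #!/usr/bin/env python3
    # Sketch: report the most fragmented files under a directory,
    # using filefrag from e2fsprogs (measures only, moves nothing).
    import os, subprocess, sys

    def extent_count(path):
        # "filefrag FILE" prints e.g. "FILE: 7 extents found"
        out = subprocess.run(["filefrag", path],
                             capture_output=True, text=True).stdout
        try:
            return int(out.rsplit(":", 1)[1].split()[0])
        except (IndexError, ValueError):
            return 0

    root = sys.argv[1] if len(sys.argv) > 1 else "."
    results = []
    for dirpath, _, files in os.walk(root):
        for name in files:
            p = os.path.join(dirpath, name)
            if os.path.isfile(p) and not os.path.islink(p):
                results.append((extent_count(p), p))

    # Print the 20 worst offenders.
    for extents, path in sorted(results, reverse=True)[:20]:
        print("%6d extents  %s" % (extents, path))

e4defrag can at least defragment single files on ext4, but as far as I
know it doesn't group related files back together the way MyDefrag
does.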

Is there anything similar for Linux?


-- 
Regards,
Kai

Replies to list-only preferred.

