When I see MiB, I think million bytes. Is this wrong?

One of the disk manufacturers was taken to court over advertising a device with 
n gigabytes of storage, meaning n*1,000,000,000 bytes. The buyer assumed that a 
GB was 1K*1K*1K bytes, where 1K is 1024. The court agreed with the plaintiff. 
Now on all disks, AFAIAA, the size of a GB is spelled out.
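
To see how far apart the two conventions drift at disk sizes, here is a quick
Python sketch (the helper is my own, illustrative only, not from any library)
that prints the same capacity both ways: the base-10 "GB" printed on the box
versus the base-2 "GiB" that tools like du report, per the Finder/du split
quoted below. Michael's floppy and kilobit oddities are worked out in numbers
after the quote.

def human_size(n_bytes, binary=False):
    """Format a byte count with decimal (kB/MB/GB, powers of 1000)
    or binary (KiB/MiB/GiB, powers of 1024) prefixes."""
    base = 1024 if binary else 1000
    units = ["B", "KiB", "MiB", "GiB", "TiB"] if binary else ["B", "kB", "MB", "GB", "TB"]
    size = float(n_bytes)
    for unit in units:
        if size < base or unit == units[-1]:
            return "%.2f %s" % (size, unit)
        size /= base

n = 500 * 1000**3                     # a "500 GB" drive as advertised
print(human_size(n))                  # 500.00 GB  (what the box says)
print(human_size(n, binary=True))     # 465.66 GiB (what base-2 tools show)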
--
Peter West
p...@pbw.id.au
“My soul magnifies the Lord…”

> On 15 Aug 2017, at 2:30 am, Michael <keybou...@gmail.com> wrote:
> 
>> On Mon, 14 Aug 2017, Rainer Müller wrote:
>> 
>>> Finder on macOS uses base 10, so "GB" stands for 1000*1000*1000 bytes. 
>>> du(1) uses base 2, so "G" means 1024*1024*1024 bytes.
>> 
>> It's for this reason that I've always referred to the base-10 usage as 
>> "marketing MB", because the numbers are bigger.  There is a trend to use 
>> e.g. "MiB" and "GiB" for the real number (amongst us computer freaks who use 
>> base-2).
> 
> It's not marketing. It's very much a real issue.
> 
> Is one computer MB 1000 * 1024? Before you think you know, have you checked 
> floppies? (A "1.44 MB" floppy holds 1.44 * 1000 * 1024 = 1,474,560 bytes: a 
> megabyte that is neither decimal nor binary.)
> When you are dealing with network speeds and communications, how many bits 
> are in a kilobit? Exactly 1,000; that convention dates back to telephone 
> signaling, not to any recent hard drive marketing.
> 
> Yes, it's much easier for computer hardware to use 2^10 instead of 10^3. But 
> as soon as you move away from "I have N wires that I'm pulsing twice for a 
> row and column select", or away from "There are this many bits in a 
> register", and ask yourself "Why do we use these oddities and call them 
> standard prefixes?", can you come up with any answer other than "Because 
> other people who came before me and did not understand the problem used those 
> terms"?
> 
> We now understand the confusion and problem of having two different meanings 
> for the same prefix. So the newer, inaccurate one (the base-2 usage) got 
> renamed with an "i": KiB, MiB, GiB.
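
PS: Michael's two oddities, worked out in Python for the curious. The floppy
geometry and modem rate are the standard published figures; the arithmetic is
the point:

# The "1.44 MB" 3.5-inch floppy: a decimal-binary hybrid megabyte.
floppy = 2 * 80 * 18 * 512       # sides * tracks * sectors * bytes/sector
print(floppy)                    # 1474560
print(floppy / (1000 * 1024))    # 1.44     (the advertised "MB")
print(floppy / (1000 * 1000))    # 1.47456  (decimal MB)
print(floppy / (1024 * 1024))    # 1.40625  (binary MiB)

# Network kilobits are decimal, as telephone signaling always was:
# a 56 kbit/s modem moves 56,000 bits per second, not 57,344.
print(56 * 1000)                 # 56000
print(56 * 1024)                 # 57344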


