Using standardized SI prefixes

Bjørn Ingmar Berg ingmar at bergcube.net
Wed Jun 13 13:19:21 UTC 2007


On 13/06/07, Christof Krüger <ubuntu at pop2wap.net> wrote:
> I'd really like to hear some real arguments against SI prefixes, besides
> being ugly or funny to pronounce or just because "it has always been
> like that". Advantages of using SI prefixes has been mentioned in this
> thread. Please tell me the disadvantages so there can actually be a
> constructive discussion.

So far in this discussion I honestly thought that the arguments
against SI prefixes were too obvious to bother mentioning.

Let me start with a dumb example:
For a child or an uninterested commoner, that flying critter is
simply "a birdie".  For those in the know, exactly the same entity is
a "Falco peregrinus".
Even if simply calling it "birdie", or perhaps "falcon", would be
easier, more "user friendly", and more "understandable for everyone",
it simply would not be /correct/.
Therefore it must stay "Falco peregrinus" in all contexts where
conveying information really matters.

Computers deal with numbers in base 2.  Humans deal with numbers in
base 10.  When computers and humans interact (on a technical level),
humans must adapt to the computer, because computers cannot adapt to
us.
Dealing with chunks of data, addresses, registers, etc. has to be done
in base 2.  Even if 1024 is "close enough" to 10^3 for a PHB or
marketing humanoid, that will never make those two numbers equal.  And
it must never be allowed to.  Computers, computer designers, computer
technicians and most computer programmers will always deal with the
_real_ base 2 numbers like 1024.
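
To put a number on why sizes land on powers of two in the first
place, here is a quick Python sketch (nothing in it beyond the
arithmetic above): an n-bit address register can reach exactly 2^n
bytes, no more and no less.

    # An n-bit address register spans exactly 2**n byte addresses,
    # so hardware sizes fall on powers of two, not powers of ten.
    for bits in (10, 20, 30):
        print(f"{bits}-bit addresses cover {1 << bits} bytes")
    # 10-bit addresses cover 1024 bytes
    # 20-bit addresses cover 1048576 bytes
    # 30-bit addresses cover 1073741824 bytes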

Another example.  Pi is an irrational number starting with 3.14....
Sure, it would be easier to "standardize" it to 3.00.  Done deal.  It
would be easier to remember and more marketable.  It would also be
totally useless AND completely wrong.  AFAIK some lawmakers once came
very close to decreeing a "simpler" value of pi by statute.  Sanity
prevailed, and the attempt was dropped.

In the same way as with pi, redefining or "standardizing" kilobytes
and megabytes would be totally useless AND completely wrong.
Computers always have dealt, do deal, and will continue to deal with
their numbers along the progression of 2, 4, 8, 16, 32, 64, 128, 256,
512, 1024, and so on.  So must we, when dealing with computers.
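
And the two progressions do not merely differ; they drift further
apart at every step.  A quick sketch of that drift (plain arithmetic,
no assumptions beyond it):

    # How far 2**(10*n) drifts from 10**(3*n) for each prefix.
    for n, prefix in enumerate(("kilo", "mega", "giga", "tera"), 1):
        binary, decimal = 2 ** (10 * n), 10 ** (3 * n)
        print(f"{prefix}: {binary} vs {decimal} "
              f"({(binary - decimal) / decimal:+.1%})")
    # kilo: 1024 vs 1000 (+2.4%)
    # mega: 1048576 vs 1000000 (+4.9%)
    # giga: 1073741824 vs 1000000000 (+7.4%)
    # tera: 1099511627776 vs 1000000000000 (+10.0%)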

One does not redefine "Falco peregrinus" to "birdie" because "that
would make it more understandable for the commoner".  Ornithologists
need it to stay "Falco peregrinus" in the future.
One does not redefine pi to a value of 3 because "that would make it
more understandable for the commoner".  Mathematicians, architects
(and basically everyone else) need it to stay ~3.1415926535 in the
future.
One does not redefine "kilobyte" to mean 1000 bytes because "that
would make it more understandable for the commoner".  Real computer
people need it to stay 1024 bytes.
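
That mismatch is exactly why a disk marketed in decimal units looks
"smaller" once an operating system counts it in powers of two.  A
concrete sketch (the 500 GB figure is just a made-up marketing
number, not a real product):

    # A drive sold as "500 GB" (500 * 10**9 bytes), counted in
    # binary gigabytes, the way the computer actually addresses it.
    marketed = 500 * 10 ** 9
    print(f"{marketed / 2 ** 30:.2f} binary gigabytes")  # 465.66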

A well-known and very common trait of language is that one given word
can often have more than one specific meaning.  When this is the case
you need context to be sure which meaning is intended.  This is
considered normal, and never a real problem.  The same should hold
true for computers and counting.

Finally, a personal and subjective thought.  At times one has to
choose whether to oversimplify facts and information to the point
where "everyone" understands them (if this happens they DO NOT
understand; they are given the illusion of understanding), or whether
to educate the public.  I am firmly convinced the correct solution is
always to educate the public.  The world is not flat.  The earth is
not the center of the universe.  Pi is not 3.  A kilobyte is not 1000
bytes; it is 1024 bytes, because that is the way computers work.


Regards,
Bjørn Ingmar Berg

-- 

blog.bergcube.net/

