Ubuntu Data Storage Linux News

Ubuntu Will Switch To Base-10 File Size Units In Future Release

Posted by Soulskill
from the stay-above-the-belt dept.
CyberDragon777 writes "Ubuntu's future 10.10 operating system is going to make a small but contentious change to how file sizes are represented. Like most other operating systems, Ubuntu currently uses binary prefixes, representing 1 kB (kilobyte) as 1024 bytes (base-2). But starting with 10.10, a switch to SI prefixes (base-10) will denote 1 kB as 1000 bytes, 1 MB as 1000 kB, 1 GB as 1000 MB, and so on."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Thing is (Score:4, Insightful)

    by davidjgraph (1713990) * on Saturday March 27, 2010 @11:57AM (#31639948)
    Anyone who's too stupid to understand the difference isn't going to care. Someone, somewhere, has too much time on their hands...
  • by the unbeliever (201915) <chris+slashdot&atlgeek,com> on Saturday March 27, 2010 @11:58AM (#31639954) Homepage

    Apple did this with Snow Leopard, which makes me a cranky geek.

    Why can't the OS manufacturers pressure the hard drive companies to market their sizes correctly? =(

  • Annoying... (Score:2, Insightful)

    by anss123 (985305) on Saturday March 27, 2010 @12:00PM (#31639970)
    Human language is context-based, meaning the exact meaning of a word depends on the context in which it is used. Why should it be different for prefixes? Just so a few morons won't be confused? Pah... morons being morons will just find something else to be confused about.
  • by Shinobi (19308) on Saturday March 27, 2010 @12:02PM (#31639992)

    HD manufacturers are presenting the sizes correctly. SI prefixes are hard-defined as base-10; it's computer engineering and computer science that broke the established standard.

  • by bigtomrodney (993427) * on Saturday March 27, 2010 @12:03PM (#31640010)
    I think you've misunderstood the issue. The problem is that kilo, mega, giga, etc. are base-10 orders of magnitude that were used incorrectly for base-2 numbers in computers. It never should have been the case that 1 kilobyte means 1024 bytes. This is just the move to fix a long-standing problem.
  • Re:Thing is (Score:3, Insightful)

    by bunratty (545641) on Saturday March 27, 2010 @12:11PM (#31640074)
    I care because if a sector is 4096 bytes, I can easily tell how many sectors a 4 MiB file takes (1024). Let's say someone says a file is 4 MB. How many 4 KiB sectors is that?
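    The mismatch bunratty describes is easy to see in code; a quick Python sketch (the file sizes are illustrative):

```python
import math

SECTOR = 4096  # a 4 KiB sector, as in the example above

# A 4 MiB file divides evenly into 4 KiB sectors.
binary_file = 4 * 1024 ** 2              # 4,194,304 bytes
print(binary_file // SECTOR)             # 1024 sectors, no remainder

# A 4 MB (decimal) file does not.
decimal_file = 4 * 1000 ** 2             # 4,000,000 bytes
print(math.ceil(decimal_file / SECTOR))  # 977 sectors, the last one partial
```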
  • by polar red (215081) on Saturday March 27, 2010 @12:11PM (#31640078)

    you don't need to second-guess.

    a few years ago you didn't need to: 1kb was 1024 byte. it was defined like that. why don't we define 2 as 1 and 1 as 2 next ?

  • by mmontour (2208) <mail@mmontour.net> on Saturday March 27, 2010 @12:13PM (#31640096)

    As long as they use the correct prefix, I don't really mind whether they use base 2 or 10 to display the numbers.

    RAM sizes are naturally powers of 2 due to how the individual memory cells are addressed, so it makes sense for RAM capacity to always be listed in GiB.

    Hard drives, on the other hand, have nothing that is fundamentally based on a power of 2. They arbitrarily use a sector size of 512 (or 4096) bytes, but everything else (number of heads, number of tracks, average number of sectors per track) has no power-of-2 connection. Therefore there's nothing wrong with reporting their size in SI notation.

    The original shorthand of calling 1024 bytes a "K" was not too bad because it's only a 2.4% error. However the error gets worse as you go up each level, and by the time you're talking about a TB/TiB it's something that people actually care about.
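    The growing error mmontour mentions can be tabulated directly; a small Python sketch:

```python
# How far each binary prefix drifts from its decimal (SI) meaning.
for power, prefix in enumerate(["K", "M", "G", "T"], start=1):
    binary = 1024 ** power
    decimal = 1000 ** power
    drift = (binary - decimal) / decimal * 100
    print(f"{prefix}B: binary is {drift:.1f}% larger")
# KB 2.4%, MB 4.9%, GB 7.4%, TB 10.0% -- the gap people notice on drives
```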

  • by Anonymous Coward on Saturday March 27, 2010 @12:21PM (#31640182)

    It was never defined that way!

    "kilo" has always meant "1000". That is the way that IT is DEFINED.

  • by zippthorne (748122) on Saturday March 27, 2010 @12:25PM (#31640226) Journal

    There has never been a point since the introduction of the 1024 "binary k" prefix that you didn't have to second-guess. RAM was different from disk, before that communications was already using SI kilo (or I should say, what would become SI kilo, since they predated the codification of SI).

    The "binary" prefixes have always been problematic and don't help new people entering the field to understand anything, so they ought to go, or at least be segregated out so that there can be no confusion.

  • Re:Really annoying (Score:4, Insightful)

    by dingen (958134) on Saturday March 27, 2010 @12:28PM (#31640258)

    I've been using computers for 20+ years and I do _not_ want to change how I think file sizes, especially since I feel that base 10 is the wrong way to count.

    How is it possible you survived working in IT for over 20 years without being able to adapt to radical changes? These sorts of things happen all the time. One moment you're working from LSB upward, then you're suddenly working from MSB downward. 8 bit changed into 16, into 32 and now in 64. Filenames can't be longer than 8 characters and now they can. A file can't be larger than 4 GB and now it can. And now finally, operating systems are beginning to understand SI units (which we've been using for all sorts of applications for hundreds of years) and *THAT* is a problem?

    What's next? Imperial units for us Europeans?

    A better comparison would be using metric units in the US, because metric is based on SI, and imperial units are more like the weird way bits and bytes are counted into kilobytes, megabytes etc.

    Saying that 1024 is a kilo never made any sense to anyone. I'm really glad we're finally entering an age where computers represent datasizes in units people can understand.

  • by darkpixel2k (623900) <aaron@heyaaron.com> on Saturday March 27, 2010 @12:29PM (#31640272) Homepage

    Apple started using SI prefixes half a year ago with Mac OS X Snow Leopard.

    ...and look how well that worked out for them. I'm still not an Apple user...

  • by Espectr0 (577637) on Saturday March 27, 2010 @12:32PM (#31640304) Journal

    a few years ago you didn't need to: 1kb was 1024 byte. it was defined like that. why don't we define 2 as 1 and 1 as 2 next ?

    Because it was wrong to do so. Kilo is an SI prefix and it denotes one thousand. It should mean that everywhere. This is a good decision.

  • Re:Really annoying (Score:3, Insightful)

    by Gorath99 (746654) on Saturday March 27, 2010 @12:33PM (#31640318)

    What's next? Imperial units for us Europeans?

    Quite the opposite. The imperial units are the base 2 ones. After all, kilo means 1000, not 1024, both in the original Greek and in the SI system that most of the world uses.

    The HDD manufacturers were right (albeit for all the wrong reasons, of course). Good for Apple and Canonical for recognizing this. I hope the rest of the world follows suit and becomes (SI, IEEE, ISO/IEC [wikipedia.org]) standards compliant.

  • by Reemi (142518) on Saturday March 27, 2010 @12:34PM (#31640338)

    I am more confused by people mixing b (bit) and B (Byte).

  • by Culture20 (968837) on Saturday March 27, 2010 @12:37PM (#31640366)
    I've used Ubuntu exclusively on my desktops for several years now. It's nice to know that I can always switch to another distro when they do something BAT SHIT INSANE like this: https://wiki.ubuntu.com/UnitsPolicy [ubuntu.com]

    Change the GUI window buttons from right to left? Meh. Change the way file sizes are read so that User X and User Y see different file sizes using the same filesystem, even potentially the same remotely mounted disk?

    Now I have to draft a letter to our research department telling them to stay the hell away from Ubuntu because their data will potentially be wrong (unless they take pains to remember the kilo=/=kibi switch).
  • by polar red (215081) on Saturday March 27, 2010 @12:39PM (#31640392)

    when the C64 came out with 64K No-ONE doubted it had 65536 Bytes of RAM. if it came out now, there would be confusion, so the kibi-business introduced confusion. people who don't understand the difference between binary and decimal have no place in IT

  • Good move (Score:5, Insightful)

    by the_other_chewey (1119125) on Saturday March 27, 2010 @12:40PM (#31640396)
    I'm surprised by the majority here that is against this. What kind of nerds exactly are you?
    SI prefixes are defined as base-10, period. Every other use is simply wrong.
    Being consistently wrong for a very long time doesn't make it better, it is just proof of
    an unwillingness to admit to a stupid initial mistake you didn't even make yourself.
    As nerds, you're supposed to be better than that.

    How can you be all for standards-compliance with browsers and rail against a much
    stronger, decades-old ISO standard (which is based on a centuries old definition from the
    beginning of the metric system - "kilo" has been 1000 for over 200 years)?

    On the other hand, you are the same crowd regularly writing about "mbit/s" while meaning "Mbit/s",
    thereby being off by just a tiny, unimportant, paltry factor of a billion.
    Seriously, what's wrong with you?

    -- an annoyed scientist
  • Re:Really annoying (Score:3, Insightful)

    by TheVelvetFlamebait (986083) on Saturday March 27, 2010 @12:47PM (#31640468) Journal

    Actually, it's more like changing from the imperial system to the metric system. Sure, the imperial system's conversions made sense in the context that they were created, but for a sense of consistency and predictability, you can't beat base 10 metric.

  • by darkpixel2k (623900) <aaron@heyaaron.com> on Saturday March 27, 2010 @12:51PM (#31640528) Homepage

    a few years ago you didn't need to: 1kb was 1024 byte. it was defined like that. why don't we define 2 as 1 and 1 as 2 next ?

    Not really [wikipedia.org].

    Looking at the Wikipedia article you linked, the only differences between the two columns in the table on the right are:
    1. Using 1024 versus 1000
    2. Using gay names like mebi instead of mega

    Wouldn't it be about a billion times easier to leave it as 'mega' and just remember that when you are dealing with base-2 methods of storage, it's 1024 (a power of two) rather than 1000 (a power of ten)?

    In other words:
    * leave everything in the IT industry the way it is
    * tell the HD makers that they are wrong to measure in base ten (since the US Gov already requires them to put that on their packaging, no big deal)
    * No one has to sound retarded when talking to the 99% of the population who has no clue about this stupid base2/10 war with hard drive marketing droids by saying 'mebi' or 'gibi'.

  • Re:Good move (Score:4, Insightful)

    by presidenteloco (659168) on Saturday March 27, 2010 @12:52PM (#31640540)

    Before, the situation was simple.

    Everything not binary-represented-information related used base-10.

    Everything binary-represented-information related (computing related, bandwidth related etc) used base 2, because the
    most important thing is how much information is being passed around or stored, and base-2 is the natural unit for
    measuring information, which comes in bits, and whose complexity is related to powers of the number of bits.

  • by MooUK (905450) on Saturday March 27, 2010 @12:52PM (#31640542)

    Which, unless otherwise specified, we assume we are.

  • by gnasher719 (869701) on Saturday March 27, 2010 @12:58PM (#31640620)

    a few years ago you didn't need to: 1kb was 1024 byte. it was defined like that. why don't we define 2 as 1 and 1 as 2 next ?

    Who modded that as insightful? 1kb was never 1024 bytes. It varied between 1000 bits = 125 bytes and 1024 bits = 128 bytes, but it was never anywhere near 1000 bytes. As Shuttleworth said, Ubuntu is not controlled by democracy. I'll say it shouldn't be ruled by idiocracy.

    Reporting 1 KB = 1000 bytes also fixes the annoying fact that a line transferring 1 MB/sec (which _always_ meant 1 million bytes per second) supposedly doesn't manage to transmit 1 MB of data within one second. (Yes guys, bandwidth was _always_ reported using SI prefixes.)

  • by Mistlefoot (636417) on Saturday March 27, 2010 @01:02PM (#31640668)
    I have read most of the comments below, but I am replying to you.

    You have pretty much hit the nail on the head alluding to Mr Shuttleworth's "open source is not a democracy" comment, when something as simple as whether 1000 or 1024 should be used causes such dissension. In effect, this is meaningless. Pick one and life goes on. It becomes the norm. Simple. Done. Over. And everything works fine.

    But no. In such a technically meaningless semantic discussion there is still dissension how many years later? "Shit, or get off the pot" as the old saying goes. You can argue about something for how many years before you just have to do it. Dictate and get it done sometimes needs to be the answer.
  • by TheRaven64 (641858) on Saturday March 27, 2010 @01:07PM (#31640692) Journal

    Other posters have pointed out that bits and bytes are not SI units, but they've not pointed out that we use 1024 because it's more useful. We use base 10 for physical quantities because it means that you can very easily do base-10 logarithms and most arithmetic on physical quantities is easier if you can do logarithms on the base that you use in your head.

    Storage is always indexed by some binary quantity, so you need to do base-2 logarithms. You can trivially calculate how much space a 32-bit address space gives you: 2^32 bytes; divide the 32 by 10 and you get 2^22 KB, 2^12 MB, 2^2 GB, 4GB. Try doing that with 1KB = 1000B in your head. You can easily tell how much space your 32-bit filesystem can store if it is addressing 512B blocks (the size of most hard disk blocks). 512 is 2^9, so it's 2^9 x 2^32 bytes. Add the exponents and you get 2^41 bytes, or 2TB. What happens if we start using 4KB blocks instead? Well, 4 is 2^2, K means 2^10, so 2^12 x 2^32 = 2^44, or 16TB.

    Redefining KB makes these calculations harder. The only kind of calculations it makes easier are things that involve bytes and some other SI units that use the SI prefixes in the same equation. About the only other SI quantity that you ever see in an equation with bytes is seconds and you almost never talk about kiloseconds or megaseconds...
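    TheRaven64's exponent arithmetic checks out; a sketch in Python (block and address sizes as in the comment):

```python
# With binary prefixes, size math is just adding exponents.
GiB, TiB = 2 ** 30, 2 ** 40

# 32-bit byte-addressable address space: 2^32 bytes = 4 GiB.
assert 2 ** 32 == 4 * GiB

# 32-bit block numbers of 512-byte (2^9) blocks: 2^(9+32) = 2^41 = 2 TiB.
assert 2 ** 9 * 2 ** 32 == 2 * TiB

# Switch to 4 KiB (2^12) blocks: 2^(12+32) = 2^44 = 16 TiB.
assert 2 ** 12 * 2 ** 32 == 16 * TiB
print("all exponent sums check out")
```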

  • by OrangeCatholic (1495411) on Saturday March 27, 2010 @01:10PM (#31640716)

    When I was a junior in college, they gave us an exam where we had to do some arithmetic in MB. Pretty much all of us, including the teachers, did it as a factor of 1,000,000.

    When they handed the exam back, the teachers noted that a couple of students had done it as 1,048,576. The whole class was aghast that we had made such a simple mistake.

    So, is 1000 "right" because that's what a whole bunch of C.S. students did? Or is it wrong because we all agreed that we were wrong afterwards?

  • Re:Annoying... (Score:5, Insightful)

    by growse (928427) on Saturday March 27, 2010 @01:16PM (#31640754) Homepage
    Nothing? How many clock cycles per second does a 2GHz CPU run at?
  • Re:Really annoying (Score:2, Insightful)

    by NNKK (218503) <nknight@runawaynet.com> on Saturday March 27, 2010 @01:16PM (#31640760) Homepage

    How is it possible you survived working in IT for over 20 years without being able to adapt to radical changes? These sorts of things happen all the time. One moment you're working from LSB upward, then you're suddenly working from MSB downward.

    Most people working in IT in the last 20 years have dealt almost exclusively with x86 CPUs. More to the point, really, most IT people in the last 20 years didn't care about byte order in the first place -- by 1990, not everyone in computing was a programmer, and most of the ones who were didn't care about bitwise operations very often.

    8 bit changed into 16, into 32 and now in 64.

    So what? Lots of numbers increase, the units used with them rarely change, and all of those are bits.

    Filenames can't be longer than 8 characters and now they can.

    Just another size increase. Not to mention the fact that by 1990, such a limitation was effectively a DOSism. Anyone on a Unix box didn't give a shit.

    A file can't be larger than 4 GB and now it can.

    Another simple size increase that did not change the units in question. And look at that, you just referred to a base-2 gigabyte, since that was the limitation.

    Not to mention that on many filesystems, it was actually a 2GB limit, not 4GB.

    And now finally, operating systems are beginning to understand SI units (which we've been using for all sorts of applications for hundreds of years) and *THAT* is a problem?

    How many knuckles am I holding up?

    Saying that 1024 is a kilo never made any sense to anyone. I'm really glad we're finally entering an age where computers represent datasizes in units people can understand.

    Odd. The few times I've had to explain the concept to anyone, they understood it immediately. "Computers operate in base-2, and a funny result is that kilobytes, megabytes, etc. end up being 1024."

    What sub-human protoplasmic entities do YOU deal with?

  • by darkpixel2k (623900) <aaron@heyaaron.com> on Saturday March 27, 2010 @01:20PM (#31640794) Homepage

    Exactly. Don't give in to the mistakes of HDD manufacturers and legalize their wrong advertising.

    Don't pint/quart/gallon differ according to geography? Pint [wikipedia.org], Gallon [wikipedia.org], and so on.

    This article and this time of year piss me off.

    You're exactly right. We don't suddenly re-define an established standard. And when it comes to physics, we don't suddenly re-define time...like every year when the stupid US government decides that it's magically an hour earlier or an hour later.

    When I make a cake, I don't use 1 cup of flour and then, on deciding to make bread, redefine the size of 1 cup to make reading the recipe easier...

  • by Animaether (411575) on Saturday March 27, 2010 @01:20PM (#31640800) Journal

    Now I have to draft a letter to our research department telling them to stay the hell away from Ubuntu because their data will potentially be wrong (unless they take pains to remember the kilo=/=kibi switch).

    If your research department...
    1. Doesn't work in bytes
    2. Doesn't know how to tell the difference between kB/MB/GB/TB and KiB/MiB/GiB/TiB
    ...then maybe you (or rather the person in charge) need to educate your research department on these matters, rather than compiling a list every month of operating systems and products to avoid/pay special attention to, which is only bound to grow longer.

  • by Theovon (109752) on Saturday March 27, 2010 @01:21PM (#31640828)

    Many computer nerds like to tout themselves as geniuses who have flexible minds. But the truth is that we're all afraid of change. And this switch from KiB to KB is change. It's not what you're used to, so it's going to confuse you.

    But as a geek myself with an obsession for clear and precise terminology, I welcome the change. No longer will I wonder if someone's talking about KB vs. KiB, because it'll be consistent and explicit, at least on the computer systems developed by flexible-enough-minded people who are both willing to change and willing to correct a long-confusing problem.

    It's true that the HD makers have taken advantage of this confusion. Back in the day when people almost always said KB when they meant KiB, HD makers used KB. But the fact is, once we adapt our terminology to be less ambiguous, we really can't be misled by them anymore, and their deceptive marketing practices will be moot (at least when it comes to bytes of storage).

    So, to summarize, stop being a stick in the mud and learn to adapt to change. Computers are and always have been an aspect of change in our society. Get over it and get with the program.

  • by hanabal (717731) on Saturday March 27, 2010 @01:23PM (#31640842)

    Why does the IT industry get special treatment? I thought the IT industry was one of the industries that most wanted to use established standards, for interoperability. Then you say that the IT industry wants to go against the established standard for something that is really, really, really insignificant, and you would all get used to it in about 2 months after the switch.

  • by pandronic (1275276) on Saturday March 27, 2010 @01:35PM (#31640924)

    Don't you know? Apple doesn't like options

  • by ShinmaWa (449201) on Saturday March 27, 2010 @01:39PM (#31640964)

    If you went to the terminal and saw this file

    file.big 17,179,869,184

    I suspect that you would naturally say that that file is about 17 gigs. Actually, it is 16 GiB exactly.

    However, just looking at the file, no one would ever instinctively say that file.big is 16 GiB. The reality is that base-10 is what people naturally use and so it makes sense for the user interface to reflect that.
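    The two readings of that number differ exactly as ShinmaWa says; in Python:

```python
size = 17_179_869_184  # bytes, the file listing from the example

print(size / 1000 ** 3)  # about 17.18 -> "about 17 gigs" in base 10
print(size / 1024 ** 3)  # 16.0 -> exactly 16 GiB in base 2
```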

  • by CAIMLAS (41445) on Saturday March 27, 2010 @01:52PM (#31641094) Homepage

    Physical engineering (at least in many of its fields) still uses English units - inches, feet, yards, miles. Bridges, buildings, and most of what you see around you are designed and built in those units.

    EE and manufacturing seem to use some of both, largely SI in small electronics and the like. But that isn't "every other field".

    Honestly, it wouldn't be an issue either way: the problem is that the 'standard' isn't consistent across processing and storage (primary, secondary, etc.). Most architectures have used base-2 to represent these things because that is how the computer (binary) works, and using a base-10 method for representing it is a nonsensical abstraction as a result.

  • Re:Really annoying (Score:2, Insightful)

    by SvnLyrBrto (62138) on Saturday March 27, 2010 @01:54PM (#31641106)

    > Saying that 1024 is a kilo never made any sense to anyone. I'm really glad we're finally
    > entering an age where computers represent datasizes in units people can understand.

    Yes it does. It makes perfect sense to anyone who knows anything at all about computers, or at least didn't sleep through their entire degree from start to finish. Using base-2 isn't some silly and arbitrary thing someone made up... like using the size of some king's toenail as a unit of length... it's fundamental to the way computers work! (Unless YOU know of some way to make semiconductors work with ten states instead of two that I... and the entire rest of the industry... seem to have missed.)

    Base-10 only came about so some shady hardware component vendors could rip us off. And systems vendors and programmers should never have let the SOBs get away with it.

  • Re:Really annoying (Score:3, Insightful)

    by l3v1 (787564) on Saturday March 27, 2010 @02:00PM (#31641170)
    Saying that 1024 is a kilo never made any sense to anyone

    To anyone except the countless people who actually knew their way around computers and what they were doing. Flexibility to adapt to situational changes in a field has nothing to do with idiots changing the nomenclature of a well-established technical, engineering and scientific field because they can't fathom that words can mean different things in different fields and contexts.
  • Re:Thing is (Score:5, Insightful)

    by beelsebob (529313) on Saturday March 27, 2010 @02:00PM (#31641174)

    Actually, they are. This is most likely for the exact same reason Apple likely did it – reduced support costs. They don't need to deal with shit tons of people complaining that their 1000GB disk isn't 1000GB, it's only 931.3GB.

    Along with of course the most obvious reason – it's *correct* that way.

  • by The_Wilschon (782534) on Saturday March 27, 2010 @02:00PM (#31641176) Homepage
    OTOH, if the OS is reporting GiB, then it ought to say GiB, not GB. Reporting that a "10 GB" (written on the box) hard disk has "9.3 GB" of space is confusing and misleading. If your definition of correctness in notation is adherence to internationally accepted standards for notation, then it is also incorrect. If you RTFA, then you will find that Ubuntu 10.10 is requiring that all applications either report "10GB" or "9.3 GiB", but not "9.3 GB" or "10 GiB". This is, in fact, a switch to correct and less misleading behavior. Whether or not it is more or less confusing may be a different matter.
  • by Anonymous Coward on Saturday March 27, 2010 @02:03PM (#31641200)

    'kilo' is not a word. It's a prefix. It has a standard usage, but its usage is not enforced by law (at least outside France and its Language Police). When computer scientists needed names for large numbers of bytes, they used the same prefix, but they used it with a different usage, where 'kilobyte' referred to 1024 bytes. So that's how it was defined. And that actually made (and still makes) sense, since we're talking about bytes here, a base-2 unit (8 bits), not bits. Mixing bases in the same measurement, on the other hand, makes no sense whatsoever.

    Personally, I think we should all move towards talking about sizes in terms of bits, since that's the more human-friendly form of measurement.

  • by johny42 (1087173) on Saturday March 27, 2010 @02:14PM (#31641298)

    I'm afraid your post is a bit misleading too. They are also requiring developers to use base-10 in most places:

    base-10 should be used to represent network bandwidth and disk sizes while RAM sizes should use base-2. File sizes can either be shown in both base-10 and base-2, only base-10, or a user option to choose between the two (but with base-10 set as the default).

    The article also mentions that Mac OS X already does this since 10.6. Maybe there's a Slashdot article about that too?
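    For the curious, the policy quoted above is straightforward to implement; a minimal sketch in Python (the function name and formatting choices are mine, not Ubuntu's actual code):

```python
def format_size(n_bytes, base=10):
    """Format a byte count with SI (base-10) or IEC (base-2) prefixes."""
    step = 1000 if base == 10 else 1024
    prefixes = (["B", "kB", "MB", "GB", "TB"] if base == 10
                else ["B", "KiB", "MiB", "GiB", "TiB"])
    value = float(n_bytes)
    for prefix in prefixes:
        if value < step or prefix == prefixes[-1]:
            return f"{value:.1f} {prefix}"
        value /= step

# The same disk, shown both ways -- the "9.3 GB vs 10 GB" confusion:
print(format_size(10_000_000_000))          # 10.0 GB (SI, for disk sizes)
print(format_size(10_000_000_000, base=2))  # 9.3 GiB (IEC, same bytes)
```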

  • by swillden (191260) <shawn-ds@willden.org> on Saturday March 27, 2010 @03:02PM (#31641670) Homepage Journal

    when the C64 came out with 64K No-ONE doubted it had 65536 Bytes of RAM

    No kid playing with his first or second computer, anyway. Old hands used to dealing with memory measured in kilowords (with the standard SI meaning of "kilo") would have had to ask. They might have had to ask how big a byte was, too. There's a reason standards call them octets, you know.

    You just think this is some kind of carved-in-stone standard because it's what you were first exposed to.

  • Re:Why? (Score:3, Insightful)

    by prockcore (543967) on Saturday March 27, 2010 @03:17PM (#31641800)

    even if computers don't operate that way.

    Hard drives and floppy disks aren't bound by base-2 in the slightest.

    A floppy disk is a great example: a track can have any number of bits stored on it. The OS cuts that track into sectors of whichever size it wants; this is *arbitrary*. Because of sector encoding on floppies, it's impossible to read only part of a sector... you have to read the entire sector. Early computers made sectors the same size as a page of RAM in order to load an entire sector into a single page of RAM.

    These are all decisions made by programmers, nothing inherent in the computer relies on sector sizes to be base-2. Nothing inherent in the media requires anything to be base-2.

  • Mod parent up (Score:3, Insightful)

    by Kludge (13653) on Saturday March 27, 2010 @03:29PM (#31641876)

    1kB was never defined as 1024 bytes. People just started calling 1024 bytes 1kB because it was close enough, and no one cared about being 2.4% off. Unfortunately, every time we leap another 10^3 we're off by another 2.4%, and by the time we get to 10^12 we're off by 10%.

  • Maybe for all the physicists, chemists, and engineers; but kilo has never meant 10^3 for computer programmers, computer engineers or computer scientists. Same with mega-, giga- and so on. They have each had a very specific meaning in the base-2 number system, which is ultimately the most important base system for people working with computers.

    We don't have 10 hours in a day or 10 days in a week. We don't have 10 bits in a byte or 100 degrees in a circle. I'm a huge proponent of the SI system, but only in areas where it is appropriate to apply it. Lengths, weights, magnetic flux density, all fine. But there are many applications and areas which are not appropriate to shoehorn into the decimal system. Binary computer memory sizes are one such application. It is not appropriate to group base-2 numbers using base-10 units.

  • by dynamo (6127) on Saturday March 27, 2010 @03:51PM (#31642056) Journal

    Hear, hear!

  • by MarsCtrl (255543) on Saturday March 27, 2010 @03:53PM (#31642074) Homepage Journal

    Within computing, "kilobyte" has always been an ambiguous term - not only was the meaning of "kilo" ambiguous, but "byte" could refer to anywhere between six and nine bits. This wasn't cause for concern as long as systems were internally consistent, so engineers continued to use the term due to the utility it offered. This consistency is no longer possible since computers are now key components in communication systems which have always interpreted "kilo" as a SI unit.

    There's a strong parallel here to the "nautical mile", which was developed because of its tremendous utility in navigation, but which is confusing to those who don't realize that "mile" means something different on a boat. If you transfer your GPS unit from your car to your boat, which type of "mile" should the device use? If you copy a 10 GB file over a 8 Mb/s data link, how long is the transfer going to take?

    Computer specialists can be expected to understand the special meaning of "kilo" in certain contexts, but what of those who work outside the field of computer engineering? The modern computing experience is built on tiers of abstractions that allow the "experience" of using a computer to differ greatly from how the computer is actually designed (e.g, file sizes are already given as "quantity of information stored" instead of "disk capacity consumed"), so it's reasonable to use the word "kilo" the way 95% of the population already understands it.
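    MarsCtrl's transfer-time question is a nice concrete test of why mixed conventions hurt; a Python sketch (link speed and file size from the comment):

```python
LINK = 8 * 1000 ** 2  # 8 Mb/s -- network rates have always been SI (base 10)

# If "10 GB" means 10 * 10^9 bytes, as on the drive's box:
si_seconds = 10 * 1000 ** 3 * 8 / LINK
print(si_seconds / 3600)   # about 2.78 hours

# If the OS displayed that same file as "10 GB" but meant 10 GiB:
iec_seconds = 10 * 1024 ** 3 * 8 / LINK
print(iec_seconds / 3600)  # about 2.98 hours -- 7.4% longer than expected
```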

  • by Hurricane78 (562437) <deleted.slashdot@org> on Saturday March 27, 2010 @04:01PM (#31642132)

    It’s not about being flexible. It’s about not being retarded.
    We are the experts on the subject. We know better. Period.
    So for someone to tell us how our computer should work, it must be someone who is even more of an expert.
    A standards committee is not someone like that.

    The simple fact is that all computers nowadays are base 2. So for the numbers to be useful, they must also be base 2. Half your RAM, hard disk, cache, display resolution, data rate, etc., will always be a nice round number in base 2, 8 or 16, and a very hard-to-remember number in base 10.

    I think anyone who uses a computer (as in, uses it for what it is there for: to automate things) will continue to use base-2-based number systems. And base 10 will be for the appliance fiddlers who only play with colorful clickables, and who should in fact not be called users at all.

  • Re:Thing is (Score:5, Insightful)

    by beelsebob (529313) on Saturday March 27, 2010 @04:10PM (#31642180)

    And they accomplish this by measuring wrong? Great effing job!

    Wrong? By whose standard? The SI standard, the ISO standard and the IEEE standard all agree on this point.

  • by Cochonou (576531) on Saturday March 27, 2010 @04:21PM (#31642284) Homepage
    Well, it depends on what you are talking about. The situation is not as clear-cut as you depict it.
    1 kb on your disk is usually defined as 1024 bits... but 1 kb/s is usually defined as 1000 bits/second. As an example, a 1.5 Gb/s SATA interface runs with a 1.5 GHz clock, so it will transfer 1,500,000,000 bits per second (actually, the number of effective data bits will be lower, as it uses 8b/10b encoding).
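    Cochonou's SATA example, worked out; a quick Python sketch (SATA I figures; 8b/10b means 10 line bits carry 8 data bits):

```python
line_rate = 1.5e9               # SATA I: 1.5 Gb/s on the wire (SI, base 10)
data_rate = line_rate * 8 / 10  # remove the 8b/10b encoding overhead
print(data_rate)                # 1.2 billion data bits per second
print(data_rate / 8 / 1e6)      # 150.0 -- MB/s of payload bytes
```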
  • by TClevenger (252206) on Saturday March 27, 2010 @04:29PM (#31642360)

    What do you mean never? "Kilo" has always meant 10^3 for HDDs, likewise for mega, giga, etc.

    Sorry, you're wrong; disks used base-two definitions, too. A 360K floppy is 362,496 bytes formatted, and a Seagate ST-225 20 megabyte hard drive had a little over 21,000,000 bytes formatted. It wasn't until some hard drive manufacturer couldn't quite hit a gigabyte that they redefined "gigabyte" so that they could call their 976MB drive "1 gigabyte."
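
    The parent's figures are consistent with the base-2 reading of the prefixes; a minimal sketch of the arithmetic (the lookup-table names are mine, not from any standard library):

```python
# Bytes per unit under the decimal (SI) vs binary reading of each prefix.
SI  = {"k": 10**3, "M": 10**6, "G": 10**9}
BIN = {"k": 2**10, "M": 2**20, "G": 2**30}

# A "20 MB" ST-225 holds about 21 million bytes -- consistent with the
# binary megabyte, not the decimal one.
print(20 * BIN["M"])   # 20,971,520
print(20 * SI["M"])    # 20,000,000

# 976 binary megabytes already exceed 10**9 bytes, so such a drive can
# be marketed as "1 gigabyte" under the decimal definition.
print(976 * BIN["M"])  # 1,023,410,176 > 1,000,000,000
```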

  • Re:Annoying... (Score:2, Insightful)

    by dissy (172727) on Saturday March 27, 2010 @04:34PM (#31642388)

    Base-10 has no place with computers because nothing with computers is calculated/measured in base-10

    Except when that computer is on a network, where data rates are all quoted in base 10. Or when the computer's CPU is running, whose clock rate is likewise measured in base 10.

    Arguably, 100% of the beings computers exist for, humans, calculate and measure in base 10.

  • by AaxelB (1034884) on Saturday March 27, 2010 @04:46PM (#31642458)

    But there are many applications and areas which are not appropriate to shoehorn into the decimal system. Binary computer memory sizes are one such application. It is not appropriate to group base-2 numbers using base-10 units.

    I agree entirely. However, SI prefixes *are* in base 10, and just redefining them in specific contexts to mean something in base 2 is unnecessarily confusing. Kilo is accepted to mean thousand, and redefining it in specific contexts to mean 2^10 is just unreasonable. To use your phrase, it's not appropriate to shoehorn this system of decimal prefixes into describing a naturally binary system (which is precisely what happened in CS).

    I understand it's how we've been doing things for decades, but why on earth are so many CS people arguing *against* decreasing ambiguity? I find the whole KiB thing to be a relatively elegant solution, which maintains the familiar letters so there's nothing new to learn, but makes it clear what units you're using. The only reason to resist it that I can see is just blind and unthinking resistance to change -- the exact same reason so many people resist the metric system and SI at all.

    You seem to be arguing "if it ain't broke, don't fix it", but I think it is a little broke and we should fix it.

  • by Anonymous Coward on Saturday March 27, 2010 @05:11PM (#31642664)

    We don't have 10 hours a day, 10 days a week. We don't have 10 bits in a byte or 100 degrees in a circle.

    We also use words like "day" and "hour" instead of deca-hour or deci-day.

  • Computer memory, in an abstract sense, tends to be looked at in a hierarchical way:
    * Registers
    * Caches
    * RAM
    * Secondary storage (swap)

    A filesystem is a data structure, arguably just nominally imposed on a dedicated swap space of sorts.
    When you buy a gig of RAM, you expect 2^30 bytes, not 10^9 bytes. I've never understood why HD makers think that their "secondary storage" does not belong under the paradigm of "computer memory" when talking about sizing, despite the fact that all modern OSes use swap space, and filesystems are all data structures whose constituents tend to fall on word boundaries.

  • by alonsoac (180192) on Saturday March 27, 2010 @05:26PM (#31642792) Homepage Journal

    I disagree. It is not a matter of knowing about math. I should be able to interpret any measurement just by knowing what each unit stands for, I shouldn't need any deeper knowledge about math or history of computing. Because then I will be lost trying to interpret numbers from other fields, from other countries, etc and the whole point of the SI is to have a global standard.
    Someone somewhere just ignored the proper definition of kilo and redefined it as 1024, and then most of us have continued that mistake. I see no problem in correcting this once and for all. It seems silly that trying to correct the problem would upset people.

  • by alonsoac (180192) on Saturday March 27, 2010 @05:32PM (#31642814) Homepage Journal

    The end user and even most professionals in the field will never need to do this kind of math. This might be an issue for people who work with storage and address space but after you are done with that there is no need to worry about binary for the rest of your day and no need to force the rest of the world to redefine units just because it makes your calculations easier.

  • by mdwh2 (535323) on Saturday March 27, 2010 @05:42PM (#31642886) Journal

    However, SI prefixes *are* in base 10

    Bytes (or bits) are not SI units.

    but why on earth are so many CS people arguing *against* decreasing ambiguity?

    The ambiguity is introduced by those people saying 1 kB should instead be 1000 bytes. The "KiB" thing doesn't really help, because now you have no idea who's using the new system or not. It would have been more sensible to use the alternative letters for the 1000 system. And whilst I sometimes see people write "KiB" etc., I've never once heard anyone say "kibibyte" or anything like that...

    The only reason to resist it that I can see is just blind and unthinking resistance to change -- the exact same reason so many people resist the metric system and SI at all.

    I'll say it again: Bytes (or bits) are not SI units. And if they were, the bit would be the far more sensible choice as a base unit.

    (Even the SI unit has its issues - that the kilogram is the base unit for mass. A gram should really be called a millikilogram - but thankfully people aren't pedantic when it comes to mass.)

  • by AaxelB (1034884) on Saturday March 27, 2010 @06:09PM (#31643056)

    Bytes (or bits) are not SI units.

    Then why use SI prefixes? More generally, why use prefixes with established and well-known definitions to mean something else? It's confusing. (And it's entirely irrelevant that bytes aren't an SI unit.)

    The ambiguity is introduced by those people saying it should instead be 1000 kb.

    Make no mistake, the ambiguity was introduced by the people who used "kilo" to mean 1024. Blaming the people who hear "kilo" and think "thousand" is just silly, because that's what the prefix actually means. My whole point is that using kilo (and the rest) to mean something special in specific contexts is ambiguous, because the listener must either ask or guess to know which you mean. You're saying that everyone should know the special case that kilobyte means 1024 bytes, but that's already a lost cause; disk makers have been using kilo to mean 1000 for years, and it'll probably never be really sorted out.

    The "KiB" thing doesn't really help, because now you have no idea who's using the new system or not. It would have been more sensible to use the alternative letters for the 1000 system. And whilst I sometimes see people write "KiB" etc, I've never once heard people say "Kibibyte" or anything like that...

    The point of using the kibi-style prefixes is to make it clear what you're talking about. "Kilobyte" is already ambiguous, and using kibibyte more doesn't change that. It just provides an unambiguous option.
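
    A small sketch of how the two conventions diverge in practice (the function names and unit tables are my own, not from any particular toolkit):

```python
# Render a byte count with SI (base-10) vs IEC (base-2) prefixes.
def human(n, base, units):
    n = float(n)
    for unit in units[:-1]:
        if n < base:
            return f"{n:.1f} {unit}"
        n /= base
    return f"{n:.1f} {units[-1]}"

def si(n):  return human(n, 1000, ["B", "kB", "MB", "GB", "TB"])
def iec(n): return human(n, 1024, ["B", "KiB", "MiB", "GiB", "TiB"])

n = 1536 * 1024   # a 1.5 MiB file
print(si(n))      # 1.6 MB   -- decimal reading
print(iec(n))     # 1.5 MiB  -- binary reading, unambiguous
```

The same byte count prints differently under the two conventions, which is exactly the ambiguity the IEC prefixes were introduced to remove.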

  • by Anonymous Coward on Saturday March 27, 2010 @07:22PM (#31643472)

    "kilo" is not a binary prefix. And while it is possible to "represent" 1 kB (1000 bytes) as 1024 bytes, that representation would require 24 redundant bytes. That's not what Ubuntu was doing previously. They were using the units wrong. Now they've switched not from, but to binary prefixes: kibi-, mebi- etc. -- apps have the choice to use these or use the decimal prefixes as they have always been defined: to denote decimal factors. Everything as should be. I just wonder what took them so long to acknowledge the standard, and why Slashdot editors are so dense.

  • by stonewolf (234392) on Saturday March 27, 2010 @07:27PM (#31643494) Homepage

    The process you just went through is well known to psychologists. It is called "objectification". That is the process by which you mentally reduce the person you disagree with to a non-human status so that you can disregard them. If done well you can kill them without guilt because they are, in your mind, not a real human. Some good examples of objectification can be found in the history of WWII. The Nazi propaganda concerning Jews, and American propaganda about the Japanese are great sources.

    BTW, you might want to look up the origin of the "N" word. It is not what you think it is. It is just one of many words with the same origin that all mean "black". It became a racial slur by being used as one. Hate speech becomes hate speech by being used by haters. That is the same way your favorite words have come to be hate speech.

    I guess I should mention that there is nothing in the definition of socialism that fits your description of it.

    I have noticed that in all of your replies you have tried to defend your use of the words. But you have never actually denied that they are hateful. That is a good sign. It is likely that you are reacting the way you are just because you feel embarrassed about being called on it. I would be lying if I claimed I have never done that. It really is easier to just admit the error, or at least not defend it, than to let someone like me push you further and further into a corner.

    Stonewolf

  • by jensend (71114) on Saturday March 27, 2010 @07:42PM (#31643582)

    No, the concern is because the new binary prefixes are awful. "Kibibytes"? [kibblesnbits.com] You gotta be kidding me. I am not using that word in public. If they'd come up with some non-ridiculous-sounding names people would fall in line.

  • by swilver (617741) on Saturday March 27, 2010 @08:29PM (#31643838)

    There's no need. We can just use our own OS that will work as we want it to.

  • by Anonymous Coward on Saturday March 27, 2010 @08:57PM (#31644004)

    The simple fact is that all computers nowadays are base 2. So for the numbers to be useful, they must also be base 2. Half your RAM, hard disk, cache, display resolution, data rate, etc., will always be a nice round number in base 2, 8, or 16, and a very hard to remember number in base 10.

    This is nonsense. Sizes of files and free space are almost never nice round numbers in either base 2 or base 10. I don't know anybody who has a problem comparing quantities of ANYTHING when using decimal.

    Most of the time when we talk about disk space and memory space, we use decimal representations anyway. The 1024-based quantities are really only useful for programmers who are dealing with addresses, and even then they usually use hex.

  • by shaitand (626655) on Sunday March 28, 2010 @01:34AM (#31645372) Journal

    As a geek, I am in favor of change... for the better. Geek and pedantic troll are not synonymous. I do not care about grammar distinctions that do not add a useful and functional clarity just for their own sake. I do not support changes to existing well defined and well understood prefixes unless there is a purpose.

    Satisfying the anal few by implementing technical correctness of a prefix is NOT a fair trade for breaking every spec of existing computer literature and the majority of software. It is not a valid justification for breaking the math used and making it more difficult. Your pedantic correctness will decrease the efficiency of IT as a whole.

    Sorry if you want to pretend people are somehow clinging to old ways. I guess some of us fuddy duddies don't want to get behind the idea of breaking the existing functionality with zero functional gain.

  • by i.of.the.storm (907783) on Sunday March 28, 2010 @03:09AM (#31645714) Homepage
    Oh right, within tolerances. But you know what I mean. When you ask for a 1k resistor within 5%, you don't want something within 5% of 1024, you want it within 5% of 1000.
