Linux Software

The Evolution of Linux 356

Taiko writes: "Kerneltrap.org has posted some of the more interesting messages from a recent kernel mailing list discussion. It started with a post on proper indentation, but turned into something a bit more. There are some posts by Linus and Alan Cox about the nature of design, computer science, Linux development, evolution, and more. Quite interesting and funny."
  • The role of Linux in the history of computer science will turn out to be that Linux kept the Open Source model _open_ on the inevitable pathway to Technological Singularity. [caltech.edu]

    Take for example the latest hot Linux gadget, the Sharp SL-5000D Zaurus PDA for Developers [sharpplace.com] which runs both Linux and Java, and is therefore an appealing platform for the further development of Mind.JAVA Artificial Intelligence [angelfire.com] in the Linux environment -- everyman's last great hope of avoiding a catastrophic Microsoft take-over of the 'Net.

    The world owes a lot to Linus Torvalds, Richard Stallman, Eric S. Raymond, Tim Berners-Lee and the countless other heroes of the Open Source futurity either posting here on SlashDot or toiling messianically away in obscurity.

  • by JCCyC ( 179760 ) on Sunday December 02, 2001 @09:13AM (#2643513) Journal
    Will use Outlook only -- "Prayer will defend us from viruses", says school principal.

    This will do no good for the acceptance of Linux in the Bible Belt [chick.com] -- Linux evolved through natural selection [literature.org], while Windows was created by God [microsoft.com].
    • Most people seem to have this misunderstanding. Natural selection IS NOT EVOLUTION! Most people who study natural selection agree that it is more or less right.

      The big difference between natural selection and evolution is that in natural selection you lose information (genes that aren't needed in that environment become the minority), whereas evolution says that we make information from nothing!

      If you look at linux you can see that PEOPLE are making it and then natural selection is killing off the parts that aren't perfect. If you take out the people that make it then linux doesn't exist.

      A better example would be that you bought a brand new (clean) computer and turned it on and off 1 billion times and expected it to boot into linux/dos/windows because the bits might randomly produce an operating system. When you look at it this way it sounds crazy.

      Basically the point I'm trying to say is that every operating system was made by a creator(s) and then refined by natural selection. I don't mind when people have opposing views as long as they are informed views.

      Evolution
      NOTHING -> SOMETHING -> NATURAL SELECTION + A LARGE NUMBER -> US

      Linux
      LINUS + PREVIOUS CODE -> NATURAL SELECTION + CODERS -> LINUX TODAY

      There is a big difference.
    • [off-topic] Hey, I thought slashdotters were up-to-date. The creationists lost badly in last year's elections, so evolution is back in the syllabus. Actually it was never out; it was a local option, and most schools continued to teach evolution.

      [back on topic] What a wonderful exchange of ideas! Linus has me convinced. OSS/free software is about chaos (as in math). It's about an infinite number of monkeys on an infinite number of keyboards. Sure, if you've got a small team and a tight time-frame you'll need to have tight control over the project. But linux is great because of the chaotic (as in math) processes which produce it.
  • Great stuff! (Score:5, Insightful)

    by Prop ( 4645 ) on Sunday December 02, 2001 @09:25AM (#2643530) Homepage

    I enjoy reading Linus' thoughts so much.

    All around him, people try to make him or Linux more than it really is - and invariably, Linus brings it down a notch and puts it in perspective.

    It's amazing that this guy gets constantly hero-worshiped, his baby created billions of dollars of wealth (at one point, at least), and yet he just keeps his feet firmly planted.

    Compare that to the clowns that get high and mighty because they rUleZ at Quake, or on some IRC channel ... The geek community could learn a LOT from trying to emulate Linus' behaviour.

    • The geek community could learn a LOT from trying to emulate Linus' behaviour.

      I'm trying, but when I tell people I don't care about Microsoft they call me an elitist. Well, actually I am an elitist, but what's that got to do with it?

      Unrelated observation: stupid reporters ought to stop churning the same stereotypical questions and start reading the kernel mailing list if they want insight on Linus. His posts on the list are SO much more meaningful than what you can read in your standard Linus interview in Wired magazine.

    • Re:Great stuff! (Score:4, Interesting)

      by shlong ( 121504 ) on Sunday December 02, 2001 @10:48AM (#2643638) Homepage
      Wow, a +5 Troll. Ok, I'll bite

      Let's go through Linus' claims:
      • Linus claims that Linux has no guided direction and develops purely through evolution and luck. First off, I would view this as highly insulting if I were a major player like IBM, or even someone who has only minimally contributed. He is basically saying that these people are as useful as a random code generator. Even more importantly, his statement is not true. Linus guides the development of Linux through his decision to accept and integrate patches. If it were truly evolutionary, Linus would set up a SourceForge project for Linux where anyone could check in changes. That still wouldn't totally eliminate direction, because someone would have to make the decision of when to cut new releases.
      • The analogy to selective breeding is wrong. Yes, we can speed up evolution through selective breeding, but we are only changing minor traits. Sure, you can breed a dog with long hair, a short snout, and good temperament, but what if you want to breed a dog with feathers, or a fifth leg? At the very best, that would take an incredible amount of time. The better solution is to research and apply genetics. Let's apply that to the kernel... we can either let the scsi mid layer slowly evolve into something useful, or we can sit down and give it a good design phase and have something that works in a much shorter period of time.
      • Windows does not succeed because of evolution and a deep gene pool. Windows succeeds because of 1) marketing, 2) aggressive business tactics against competitors, and 3) it's not so buggy that it's totally unusable.
      • Re:Great stuff! (Score:4, Interesting)

        by robinjo ( 15698 ) on Sunday December 02, 2001 @11:29AM (#2643743)

        My turn to bite :-)

        Linus is saying that Linux is evolving through countless small decisions. There is no One Big Plan. There are just ideas that are thought of. There's lots of code that is written. Some of it gets into the kernel, some doesn't but gives new ideas for better design.

        If you'd read through the whole discussion, you'd notice how computer science was compared to alchemy. It's a young science that has years to go before genetics can be applied. And we still have years and years of research that has to be done before we understand genetics. There's lots of trial and error being done there too.

        Actually Windows has evolved a lot. Just look how much it has changed since Windows 3.1. It sure succeeds because of marketing and aggressive business tactics, but that's only helping. MS wouldn't be able to compete against Linux with only Windows 3.1 no matter how aggressive they'd be. So don't underestimate the effort behind developing Windows.

        Actually Windows evolved towards what people wanted in the nineties. But since Windows 98 it has also had a World Domination plan, which is not good for Windows in the long run. But as Microsoft has loads of cash, they can afford trial and error as long as they learn from their mistakes. And that's one thing they are good at.

        • Re:Great stuff! (Score:2, Insightful)

          by shlong ( 121504 )
          Yes, Windows changes. Just like most software (TeX being the exception), it must change. But it does not succeed because of evolution. Countless polls have been taken where people say, basically, "I don't give a rat's ass about all the new shiny gizmos in the latest version of Windows. I wish they would just fix the bugs." But you know what, people still buy it.

          If Linux wants to be like Windows, that's fine. Windows stands for 'good-enough' and mediocrity.
        • There is no One Big Plan

          I would say that the Unix skeleton that Linus so quickly dismisses counts as a One Big Plan. Look at all the horrible legacy that Linux carries from that: ports under 1024 require root privileges to bind, the entire ACL system, "everything is a file", mount points versus "drives", and a million other things.

          If Linus' view of history were actually true then we wouldn't be seeing any of those things. We would instead see something that was a hodgepodge of Windows, Plan 9, Unix, BeOS, and other operating systems' features. That is not what Linux is today. Instead it is something Linus drives forward with his One Big Plan of a Unix-a-like Operating System.

          His evolution-centric view also ignores the fact that society at large militates against natural selection. When was the last time you saw someone being selected against because they didn't have 20/20 vision? And the choice of an operating system takes place in exactly that same society.
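One of the legacy rules mentioned above, the privileged-port restriction, is easy to observe directly. A minimal sketch, assuming an ordinary Unix-like system and an unprivileged user (the addresses and messages are illustrative):

```python
import socket

# Binding a port below 1024 traditionally requires root on Unix;
# an unprivileged process gets PermissionError instead.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.bind(("127.0.0.1", 80))   # privileged port
    print("bound port 80 (running with privileges?)")
except PermissionError:
    print("permission denied: ports below 1024 need root")
finally:
    s.close()
```

Binding port 0 instead asks the kernel for any free ephemeral port, which never needs privileges.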
          • Re:Great stuff! (Score:2, Insightful)

            by Usquebaugh ( 230216 )
            Ummm....

            "Look at all the horrible legacy that Linux..."

            Yes there are problems with linux, but they do not constitute a plan, it's just they are better than anything else submitted. Just for the record mount points are so much better than drives, IMHO.

            "We would instead see something that was hodgepodge..."

            Linux is a hodge podge, just like Unix was. If some radical, and I mean radical, change in computing came along do you think Linux would not adopt it?

            "When was the last time you saw someone being selected against because they didn't have 20/20 vision?"

            Pilots for a start; then it depends on how severe the disability: blind people don't drive, colour blind people cannot become electricians, deaf people do not make very good opera singers, etc. etc. etc.
        • The question is whether Linux is not actually the same as Windows in this respect: that it does not succeed because of technical excellence, but because of politics and coincidence.

          Compared to most commercial Unixes, and even to most other free Unixes (*BSD), Linux technically lags behind, is less advanced. Linus may claim that Sun is dead (I'm not so sure), but compared to Solaris, Linux has a long way to go.

          However, Linux came at the right time, in the right place, and, among other things, politics helped it (IBM and others found it good to use as a weapon in the battle against Microsoft), and it had the luck that its competitors, the better-designed NetBSD and FreeBSD, were having licensing problems at a crucial period in time.
      • Re:Great stuff! (Score:3, Insightful)

        by akc ( 207721 )

        Linus claims that Linux has no guided direction and develops purely through evolution and luck. First off, I would view this as highly insulting if I were a major player like IBM, or even someone who has only minimally contributed. He is basically saying that these people are as useful as a random code generator. Even more importantly, his statement is not true.


        I don't think he was saying that at all. In IBM's case they have a direction they are pushing and develop code in that area, but at the same time there is a vast array of people taking it in different directions. The net result is unplanned (as opposed to random).


        The analogy to selective breeding is wrong. Yes, we can speed up evolution through selective breeding, but we are only changing minor traits. Sure, you can breed a dog with long hair, a short snout, and good temperament, but what if you want to breed a dog with feathers, or a fifth leg? At the very best, that would take an incredible amount of time. The better solution is to research and apply genetics. Let's apply that to the kernel... we can either let the scsi mid layer slowly evolve into something useful, or we can sit down and give it a good design phase and have something that works in a much shorter period of time.


        Firstly, the lifecycle time of the kernel is down to a few days instead of years - secondly things do evolve - just look at the progress of the VM (either one). First attempt didn't get it quite right, so then there are some patches and things get a bit better, but something else is bust (etc etc). This seems quite close to the breeding approach (but is only one of a number of parallel directions for the kernel).


        Windows does not succeed because of evolution and a deep gene pool. Windows succeeds because of 1) marketing, 2) aggressive business tactics against competitors, and 3) it's not so buggy that it's totally unusable.


        I don't think Windows succeeded because of evolution either - it was a major mutation which occurred at a certain lucky point in history and wiped out most of its competitors. [Sure marketing and business tactics helped - but the real winner was the GUI interface (against DOS) and the fact that apple didn't open up their hardware whereas IBM did]. Don't we have something like this in the natural world? [My brain is addled with the thought of the mutant in Asimov's Foundation series]. The big question though is - in the long term will it continue to evolve fast enough to keep up when pressed with alternative species (like linux and the speed with which it is evolving!)

        • > and the fact that apple didn't open up their hardware whereas IBM did

          Actually, IBM sued Compaq for reverse engineering the PC BIOS and creating clones. Compaq just happened to win. It's not so much that IBM opened their hardware as that it was opened in spite of them. If Compaq had decided to clone Apple machines instead, the story might have been far different.
      • Re:Great stuff! (Score:3, Insightful)

        by Mr Z ( 6791 )

        The main point you missed, though, is that Linus is right at the macro level -- there is no overall design process for Linux as a system or an overall direction for Linux.

        At the kernel subsystem level, there's plenty of design, and plenty of goals, and plenty of localized direction. In the filesystem space, there was a lot of buzz around journalling filesystems. In the MM department, we had something more akin to controlled chaos... :-) And yes, the SCSI layer could use some actual careful design work.

        There was no overarching goal "We must optimize for market X" that drove any of this. Sure, some people want to run Linux on huge machines, and so they want journalling. Other people want to shove Linux into wristwatches and PDAs, and so they instead want to focus on memory footprint. And still others care about interrupt latency over throughput. So, each little care-about niche has its own little projects that pull Linux in lots of different directions at the macro level. Each individual project is very directed, and some have significant design work. But none of it is directed from On High as part of the Grand Plan for The System.

        --Joe


      • I think you are missing the point of what he said.
        Here is some more from the discussion:

        "It's "directed mutation" on a microscopic level, but there is very little
        macroscopic direction."


        and

        "I'd much rather have "brownian motion", where a lot of microscopic
        directed improvements end up pushing the system slowly in a direction that
        none of the individual developers really had the vision to see on their
        own."


        Certainly the smaller "details" are directed, but I think the point Linus is trying to make is that, from the perspective of where the kernel was at version 1.0 and where it will be at v. 5.0, its macro direction is ludicrous to try to predict/design/direct. So yes, its path is directed, but much more so in the micro sense rather than the macro sense.

      • Your point 1
        Total bullshit. Linus said it is somewhat directed evolution, but evolution nonetheless, which is why Linux has turned into (and continues to turn into) something he never intended.

        Your point 2
        Totally ridiculous. Linus made a simple loose analogy, and you are taking it WAY too literally.

        Your point 3
        WTF? This has NOTHING TO DO WITH ANYTHING. The fact that MS succeeds through aggressive marketing and business tactics has nothing to do with anything, AT ALL, period! How is this relevant to anything at all in the discussion?

    • his baby created billion dollars of wealth (at one point, at least)

      That's an interesting statement. It seems to presuppose that the only real kind of "wealth" has a dollar figure and is based on stock prices. The thing about high-tech stock prices is that they are based largely on speculation, especially for emerging industries. High-tech stocks are usually valued at 20 to 50 times their company's annual profits, though, of course, most dot-coms never actually made a profit.

      Free and open-source software kind of short-circuits the conventional model of "wealth". For things that aren't given away for free, their real value is measured by the number of people who are willing to pay a certain amount for them, and after the exercise of selling something is carried out, you count up the dollars and see how much "wealth" was involved.

      But things that are given away still have value, and there is still a virtual amount of money that all of the Linux users and businesses around the world would be willing to pay for it. There's no good way to count this up, but I have little doubt that it would be in the billions of dollars.

      If a tree falls in the forest and nobody hears it, does it make a sound? Of course it does.

      Some people might think that Bill Gates is less evil because he has donated billions of dollars to charities. Linus has donated billions of dollars worth of "wealth" to the world also.

      But then, there's also more to life than just "wealth". Wealth is only a means to achieving a high standard of living. Has Linux improved your standard of living? Has free software? Is there anyone who has ever used the Internet who hasn't made use of free or open-source software?
  • by LordOfYourPants ( 145342 ) on Sunday December 02, 2001 @09:31AM (#2643537)
    In all seriousness, would this article have been given a second glance if Linus wasn't involved? If I were to post a message saying "Hey, my friends and I were discussing the meaning of life after arguing about pencils, check out the log," I doubt a single editor on slashdot would have given it a 2nd glance. What kind of sick twist on celebrity worship is this?
    • So, if you and your pals discuss the meaning of life shortly after covering pencils, that's about as interesting as if the lead developer and inventor of the Linux kernel discusses software design? Um. That's strange, you'd think an argument as powerful as that should convince easily, but still I find myself not quite over on your side yet. :)
    • You're right ... partly. I wouldn't have given it a second glance, but that doesn't mean that it wouldn't have been just as interesting (or "+1, Insightful").

      I might not have read it if Linus weren't involved (I might have nevertheless, 'cause Alan Cox is involved ;-), but that doesn't mean that it's not good. It would have been worth being posted even if Linus weren't involved. If your and your friends' discussion was equally interesting, I'd love to read the logs.

    • You are right, it is a bit of hero worship, but it is also true that Linus Torvalds and Alan Cox are among the greatest programmers in the world. Those of us with lesser skill can only gain from their insights. Although I would not be interested in listening to you and your friends talk about the meaning of life, I would very much like to listen to Emerson and Thoreau or Freud and Jung speak on the subject.

  • by reachinmark ( 536719 ) on Sunday December 02, 2001 @09:33AM (#2643540) Homepage
    I think I agree with Linus.

    Can anyone really say that computing as a field or science was designed? What we have today is the result of a form of evolution and a result of a market economy. Nobody knew where we were going, we just started going someplace.

    The company I work for has spent the past 4 years slowly evolving a fairly complex graphics and haptic (see: Intelligent Scalpels Through Touch Technology [slashdot.org] for more about haptics) API. At the start we had only a vague idea of what it should be like. We knew from our experiences in graphics that it should be scene-graph based -- so we borrowed the VRML design. We knew that we wanted to be able to do a few things with it. This gave us the basic framework to start with, much like Linus had with Linux.

    Then we basically evolved the product. Every time we worked on a project that used the API, we learnt more about what it was good at and what it lacked. We modified it, fixed things, extended it with new features. After 4 years we have something far better than we could ever have dreamed of designing.

    The most important reason for using this approach was not because we believed in an evolutionary approach to software engineering (I don't think that Linus' advice should be taken too literally). It was because we were dealing with making an API out of cutting-edge research - much of which hadn't been done when we started. We simply couldn't have designed it.

  • by cygnusx ( 193092 ) on Sunday December 02, 2001 @09:36AM (#2643547)
    Reading this lkml thread, I had the distinct feeling that you could replace Sun with Apple in Linus' posts and much of it would still be true.

    You heard them above. Sun is basically inbreeding. That tends to be good
    to bring out specific characteristics of a breed...


    Following that thread, can I now propose Linus' Law:

    Any software system with a large enough user base can rely on the accumulated experience of its users to add features, and can also pick up ideas from smaller systems now and then (at a very low incremental effort).

    Corollary. The onus is on the smaller players to come up with new features to distinguish themselves from the masses -- but ultimately it's no-win for them because their *really useful* ideas will be subsumed into more popular systems anyway ... only a matter of time.

    I need sleep and I'm quite possibly not thinking straight, but am I right in thinking this would create enormous pressures for specialized players like Sun and Apple (and Be, as they found out) in the long term?

    If that is the case, where does that leave the "small is beautiful" rule? Does it mutate to "small is beautiful, provided you are part of a *big* idea that has incredible amounts of 'traction'"?
    • by Ami Ganguli ( 921 ) on Sunday December 02, 2001 @10:33AM (#2643618) Homepage

      I'm not sure the number of users is as important as the number of developers/contributors. Or, if Linus is correct, the number of developers with different agendas.

      In fact the whole debate starts to sound a bit like ESR's Cathedral & Bazaar.

      • Assuming developers listen to their target user audience (i.e., they don't write *just* to satisfy their itch), would the number of developers with different agendas not be proportional to the number of *users* with different agendas?

        User group A could be CS types who'd see nothing wrong with compiling odd-numbered kernels for breakfast and who drool over things like CML2. User group B could be the Mandrake-using types (or even Mac-using types :)) for whom graphical installers exist. And so on.

        End result: a variety of users leads to a variety of solutions, which ultimately enriches the platform. One downside: there's (sometimes massive) duplication of effort (KDE/Gnome :))-- but hey, natural selection also works the same way. The only thing to guard against is: is one user group being a nuisance to the others?
        • would the number of developers with different agendas not be proportional to the number of *users* with different agendas?

          Not in general with proprietary software. A lot of people use Sun hardware for a lot of different things, but only one group of developers (or one 'agenda', assuming those developers are kept on a tight leash) actually gets direct input. So if Sun management decides that massively parallel SMP boxes are where the money is, users who want to cluster a few hundred 2-processor systems together get less attention.

    • I now propose Linus' Law:

      Any software system with a large enough user base can rely on the accumulated experience of its users to add features, and can also pick up ideas from smaller systems now and then (at a very low incremental effort).

      Corollary. The onus is on the smaller players to come up with new features to distinguish themselves from the masses -- but ultimately it's no-win for them because their *really useful* ideas will be subsumed into more popular systems anyway ... only a matter of time. This also reminds me of what Judge Jackson described in the section of his findings of fact [usdoj.gov] against Microsoft: "Barrier to Entry"

  • "Survival" is a very clear term in biology: it means being able to keep yourself alive.

    What does survival mean in software terms? Does it mean that you make the most money (Microsoft?), that you get to have the most users, that you endure in time and get written into textbooks, that you show clear technical superiority?

    I think that any of these can be taken as proof of "survival" of a software project, yet the fact that MS-DOS lasted extremely long and became extremely popular cannot possibly mean that it is something we want to copy or admire.

    An argument that I would happily accept is that evolution exists in linux-world as the result of survival of different linux ideas/implementations (e.g. new VM, new low-latency etc) in the linux user subspace.

    Now, the linux user space is a group of technically aware people (?!) and evolution of different linux variants in that space can be said to be constructive in a technical sense, thus producing real progress.

    This process cannot universally guarantee software quality (from a purely technical standpoint).

    P.
    • What does survival mean in software terms?
      In the case of Linux, it means a piece of code gets Linus' blessing to stay in the kernel.
    • what does survival mean in software???

      --exactly what it means in biology. things that survive from a biological point of view aren't necessarily good or better. sometimes they are. sometimes they're not. humans survived because they were able to overcome certain hardships created by the world. But i don't exactly admire humans; if you read Ishmael by good old mr. Quinn, he clearly (as do I) dislikes human nature, despite the fact that we can't avoid it. humans do the exact same thing that microsoft does: they kill everything around them, and take more than they need.

      There was one thing that Bill Gates did not foresee: the advent of a FREE os... something that he could not counter. the human race (analogous to M$) has killed everything, and eventually there will come a species that cannot be killed off (in my opinion this will be the sentient AI that I, err... i mean people will create). However, until there comes along something analogous to linux, humans will continue to dominate.

      QED
  • by nusuth ( 520833 ) <oooo_0000us.yahoo@com> on Sunday December 02, 2001 @09:44AM (#2643562) Homepage
    1- Get latest kernel source
    2- Open a random file, go to a random location in that file
    3- Roll a d100, use this table:
       0-10: insert a random C keyword
       11-20: delete nearest statement
       21-50: define a new variable
       51-80: delete a variable declaration
       81-85: change your keyboard layout to some language, switch off monitor and start typing headlines of slashdot. Stop when you feel like it
       86-90: delete rand(20) characters
       91: delete file
       92: merge file with another random file
       93: copy file with a new name
       94: move file to another location
       95+: merge file with a random C source from the net
    4- Try building source. If all goes well, submit a patch. Otherwise roll a d6; on 1-5 return to step 2, on a 6 return to step 1.
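A tongue-in-cheek sketch of that d100 table in Python (every name here is illustrative; this is not a real patch generator, and the keyword list is a stand-in):

```python
import random

# Illustrative stand-ins; the real table acts on kernel source files.
C_KEYWORDS = ["if", "while", "return", "static", "struct", "goto"]

def roll(sides):
    """Roll one die with the given number of sides (1..sides)."""
    return random.randint(1, sides)

def mutate(lines):
    """Apply one random mutation from the d100 table to a list of source lines."""
    r = roll(100)
    pos = random.randrange(len(lines)) if lines else 0
    if r <= 10:                        # 0-10: insert a random C keyword
        lines.insert(pos, random.choice(C_KEYWORDS))
    elif r <= 20:                      # 11-20: delete nearest statement
        if lines:
            del lines[pos]
    elif r <= 50:                      # 21-50: define a new variable
        lines.insert(pos, "int v%d;" % roll(1000))
    elif r <= 80:                      # 51-80: delete a variable declaration
        decls = [i for i, l in enumerate(lines) if l.lstrip().startswith("int ")]
        if decls:
            del lines[random.choice(decls)]
    else:                              # 81+: the file-level operations all
        lines.append("/* line noise */")  # amount to injecting noise here
    return lines

# Step 4 ("try building, else loop") would wrap mutate() in a retry loop.
```

Whether any such patch survives the maintainer's merge decision is, of course, the selection step the thread is arguing about.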
  • by shinji1911 ( 238955 ) on Sunday December 02, 2001 @09:59AM (#2643580)
    According to Rik:

    Biological selection does nothing except removing the weak ones, it cannot automatically create systems which work well.

    In short, I believe biological selection is just that: selection. The creation of stuff will need some direction.

    And I have to nod vigorously to that. Even taking the model of accelerated evolution through human breeding of species: you direct two animals together to breed. You don't just let the Ps, the F1s, the F2s, etc. just all wander around in a pen, have a sniper sitting on a post shooting the ones you don't want, and hoping the rest go at it...
    • crap!!! (Score:3, Insightful)

      by koekepeer ( 197127 )
      bollocks, no way this is insightful!!!

      you don't understand the concepts of evolution, and neither does mr. van riel.

      biological selection (actually, the terminology is "natural selection") does not work by weeding out the weak ones. natural selection favours the multiplication of successful ones (i.e. 'survival of the fittest').

      the argument you (and rik van riel) are using is essentially the same one most creationists use: mutation can only break down, not build up.

      this is wrong. read some darwin before you comment on this stuff please.
      regards,

      meneer de koekepeer
  • The people claiming evolution is a process too slow for software development seem to miss an important point. The speed of evolution cannot be measured in years; it must be measured in lifecycles. The number of lifecycles needed for a program/snippet to evolve is about 1-20 lifecycles (releases), and by multiplying this by the time it takes for one lifecycle to complete you've got an approximate value of how fast computer programs evolve.

    Another important point is that in this evolution - though on some level about "survival of the fittest" - there is a certain level of continuous "trial and error". This is in fact the way most programming - and learning - is done, and it is done through the lifecycle. In real life, DNA can't remember actions carried out by its owners.
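As a back-of-the-envelope illustration of that multiplication (the release cadence is an assumed figure, not from the post):

```python
# Evolution speed measured in lifecycles (releases) rather than years.
# Both figures below are illustrative assumptions.
lifecycles_needed = 20    # upper bound from the post: 1-20 releases
days_per_release = 14     # assumed length of one release cycle
days_to_evolve = lifecycles_needed * days_per_release
print(days_to_evolve)     # 280
```

So even at the slow end of the post's estimate, a snippet "evolves" in under a year, against millennia for biological traits.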
  • by seldolivaw ( 179178 ) <me@@@seldo...com> on Sunday December 02, 2001 @10:18AM (#2643604) Homepage
    Reading it, does anybody else get a strong sense of deja vu? It sounds like the two sides are arguing Evolution vs. Creationism -- well, they *are* -- but in this case they're arguing it over Linux instead of over human beings. Only in this case, we *know* there was a creator, and he says "I didn't create it, it evolved". Which makes me wonder: if we ever did find the "creator" of human beings, what would happen if he/she/it/they said the same thing about us? :-) Picture it (and pardon my Eurocentricity):

    Us: "God! At last we have found you! Now tell us, please... WHY ARE WE THE WAY WE ARE? WHY ARE WE HERE?"

    God: "I dunno. I created you to eat the lions, and you just kinda got out of hand"
    • I had Fractint in indefinite-precision mode and was chasing down an especially interesting whorl at higher and higher magnification when I saw this little guy on the screen. He waved at me and said, "Hey, God, Woo-Hoo, I finally found you! Why did you create all this?"

      I was about to give him a really witty answer but the power blinked, and that was that. Too bad I didn't bother to record the co-ordinates :-(

  • Linus to Larry McVoy: Are you indispensable for the continued well-being of humanity? I believe not, although you are of course free to disagree.

    Don't take it too hard, Larry. Stay with us!

    • From: Rik van Riel
      Subject: Re: Coding style - a non-issue

      [...]
      Biological selection does nothing except removing the weak ones, it cannot automatically create systems which work well.
      [...]

    Since Linus is comparing biological selection to the way things work in Linux, these are ironic words coming from Rik. :)
  • Oh well...

    the problem with that witty finsk is that he apparently was forced to endure a few really bad CS classes back in Helsinki.

    He's wrong, of course. Whatever works in Linux works because at some point somebody did some serious thinking before starting to spew out code. Planning data structures. Maybe even reading about how others tackled the problem.

    That's called design. In a few areas Linux seriously lacks design, and it shows.

    f.
    • Don't confuse strategic design with tactical. Oh dear, another bad analogy!

      I understand Linus to be saying that he didn't foresee things like the iPaq or the OS/390 port when he started his terminal program in '91. That's strategic. That doesn't preclude him from designing an API or a data structure. That's tactical.

  • OK, Linux is evolving (= changing incrementally), and not controlling tightly how it evolves is a nice idea, but this is how far the analogy goes. Linus is taking the analogy too far and using biological evolution out of its context. People do design the pieces of code they submit, and Linus does control which ones are released in the main tree. Both of these facts, especially the latter, make the evolution of Linux fundamentally different from natural evolution. If you agree with Linus, please carefully state what you agree with. Do you agree that no complicated engineering project can be designed in advance? Or that the fact that Linux is not directed toward a defined goal is a good thing? Or that natural evolution is proof that not designing Linux is a good idea? I agree with the first two, but the third one is plain wrong.
  • When discussing the need for proper 'scientific' design, Alan Cox said:
    "Engineering does not require science. Science helps a lot but people built perfectly good brick walls long before they knew why cement works."

    To me, this seems to be a very poor analogy. The fact is that before the widespread use of maths and materials science in structural engineering ('building a good wall'), structural engineering didn't really exist at all - there were just builders and designers. 'Engineering' only really began when the science was added; before that, it was an art or trade. As for building a 'perfectly good wall', yes, the walls did indeed usually stand up; but:

    a) Not always. Take the case of medieval cathedrals. In order to stop the weight of the roof pushing the walls apart, the walls had flying buttresses built for support; however, in some cases the buttresses were so big that they collapsed the wall in the other direction!
    b) Not very efficiently. Because the builders were unable to optimise their designs, buildings were often very wasteful of materials.
    c) How many medieval skyscrapers were there? You just can't build many of today's huge structures without 'sciencey' engineering.

    All in all, I think Alan would have been better advised not to compare it to building a wall; the problem is more that an operating system has such wide scope and enormous complexity (due to different areas of code affecting each other), as well as needing to be flexible enough to change over time, that it isn't feasible to design the whole system as you would a dam or skyscraper.

    Chris Cunningham
  • Just how many new SIGs are going to come from this one thread?

    -Spackler

    I'm not claiming to be deep, I'm claiming to do it for fun. -Linus
  • The success of a project is not the sum of design plus evolution, where enough of one can make up for too little of the other. It's more like the product of design multiplied by evolution: if either is too small, your project goes nowhere fast.
  • O.K. My $0.02 (Score:5, Interesting)

    by renehollan ( 138013 ) <rhollan AT clearwire DOT net> on Sunday December 02, 2001 @12:15PM (#2643842) Homepage Journal
    So, software evolves... it isn't designed?

    Sounds like a couple of harsh extremes to me.

    Of course software is designed. But this does not mean that the design is complete, correct, or optimal. And that's where evolution comes in.

    All these people who scoff at formal design do have a point: so many times, so-called formal designs end up being one-way paths to the wrong thing.

    The formal design advocates respond by saying, "well, you didn't have a correct design." A fat lot of good that does. I've been part of development teams where there is this mantra of design it, check it, double check it, let's not do anything until the design is complete, because failure is uncorrectable. And you end up progressing e v e r s o s l o w l y. This is design by perfection -- the idea is to be so careful about the design that it can't be flawed.

    Of course, this never works. Nobody can make anything non-trivial right the first time around. It requires some kind of step-wise refinement. Now, this does not mean the design should be abandoned, but one should design in anticipation of making mistakes. Then, the design permits the local correction of errors, without them becoming a global fiasco.

    Design for flexibility then: separate APIs from implementations. Version your APIs so that when they're lacking you can produce a new back-compatible version. Don't know all the details about every possible kind of device? Gee, throw an open-ended IOCTL into the device control API. Refine IOCTLs for similar devices later, when we figure out what they need besides the basics.

    The point is that it is possible to design adaptable and refinable systems in order to accommodate the inevitable "oops" with a fix that is local and not global in nature. Now, you can't be flexible in everything, and sometimes correcting things hurts: witness the Linux VM. It wasn't really planned to abstract its API away to allow for interchangeable plug-ins, was it? And the VM wars were somewhat painful precisely because one had to choose and couldn't punt.

    Nevertheless, experienced software designers try to provide an "out" whenever they can and whenever they think a particular course might require modification in the future.
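    The "design for flexibility" ideas above (versioned APIs, an open-ended IOCTL-style escape hatch) can be sketched in miniature. This is purely illustrative Python of my own -- DeviceAPI, the op names, and the "quirks" fallback are hypothetical, not any real kernel interface:

```python
# Illustrative sketch: a versioned device-control API with an open-ended
# escape hatch, so later corrections stay local instead of global.
# All names here (DeviceAPI, "reset", "quirks") are hypothetical.

class DeviceAPI:
    VERSION = 2  # bumped when the structured interface grows

    def __init__(self):
        # structured operations known at design time, keyed by (version, op)
        self._ops = {
            (1, "reset"): lambda dev, arg: dev.update(state="idle"),
            (2, "reset"): lambda dev, arg: dev.update(state="idle", errors=0),
        }

    def call(self, dev, op, arg=None, version=VERSION):
        # back-compatibility: fall back to the newest implementation
        # that is not newer than the caller's version
        for v in range(version, 0, -1):
            if (v, op) in self._ops:
                return self._ops[(v, op)](dev, arg)
        # the open-ended IOCTL-style hatch: ops nobody foresaw still land somewhere
        dev.setdefault("quirks", {})[op] = arg

api = DeviceAPI()
dev = {"state": "busy"}
api.call(dev, "reset")            # structured, versioned path
api.call(dev, "laser-align", 42)  # unforeseen op goes through the escape hatch
```

    The point of the sketch is only that the "oops" (a device needing "laser-align") is absorbed locally, without redesigning the structured API.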

  • The whole thread makes me ill. Many projects are designed up front -- the basic feature set, the UI, the object interfaces. It's a shame they did not put more emphasis on this reality.

    [Linus]

    > Quite frankly, Sun is doomed. And it has
    > nothing to do with their engineering practices
    > or their coding style.

    It may have everything to do with evolution, but only because the baby is growing up, and the engineers have little or nothing new to toss out of the womb.

    Sun made middleware happen; now MS is cloning it and taking their one big chance. Less evolution and more population.

    Sun and SGI could have made small, fairly inexpensive game cubes years ago -- cubes that could have doubled as engineering workstations or even clustered. They chose not to, going for the server market. Neither has seriously approached the Asian manufacturing giants. Poor thought processes up front!

    IMHO, of all the Unix giants, SGI and Irix had the best chance to make it big -- Irix, for all its flaws, did the best job ever of hiding Unix, and they did it many years ago. Too bad they dropped the ball and failed to make a consumer device with Asia's help.

    MS rises, and continues to rise. It may be evolution in the end, but it's the overwhelming size of their population, not the superiority of their product.

    Add in the failure of the USA to enforce its laws, add in the poor strategies of the big-iron Unix corps, and there you have it. Little evolution, since there was never a competitive population.
  • really? (Score:2, Interesting)

    by DrSkwid ( 118965 )

    If you want to see a system that was more thoroughly _designed_, you
    should probably point not to Dennis and Ken, but to systems like L4 and
    Plan-9, and people like Jochen Liedtk and Rob Pike.

    And notice how they aren't all that popular or well known? "Design" is
    like a religion - too much of it makes you inflexibly and unpopular.


    I hardly think that plan9's unpopularity is down to the fact that it's been well designed!

    working in it is a joy. It suffers from the lack of a good web browser (not exactly a small undertaking) and 23-char filenames (wave bye bye to those real soon now [tm])

    but I guess not everyone likes design. I'm sure most ppl reading this are in an untidy hell hole of a room. If you've not got some dirty crockery within reaching distance of you then I doff my hat to you.

    but good design brings pleasure, and working with plan9 brings more joy than frustration.

    linux is winning not because it's a great piece of software but rather because of one of those historical flukes of being in the right place at the right time and capturing people's imagination. Feeding my pc with my first slackware floppy disk set was liberating, and discovering the joy of hitting co-operate rather than default has justly brought its reward.

    but hey, come on, keep your mind open. there's always a spare pc lying around, spend an evening with something else for a change.

    http://plan9.bell-labs.com/plan9
  • I have tremendous respect for Linus & co. with regard to software development, and it is always nice to see people who are not philosophers (or biologists) discussing philosophy (and biology).

    However, with respect to their opinions on philosophy (and biology), they are, as a previous poster commented, quite undergraduate. Actually, I might be inclined to say worse about them, as I am self-educated beyond high school, and I am aware of a much broader world of philosophy (and biology) than they seem to be.

    Actually, it reminds me of nothing so much as Alan Cox's posturing on the DMCA, where my opinion was that people who do not understand such issues at all should refrain from making lawyerly or political comments in a broader public forum where they command respect that is not merited for the comments they are making.

    While it is nice to see these people expressing interest in broader topics, I feel that they should keep their public discussions to the issues of which they have some understanding, namely software development. All that can come of their ponderings otherwise is to spread their ignorance further than they already have.
    • I am aware of a much broader world of philosophy (and biology) than they seem to be

      I don't disagree, but I'd be interested to hear where you think they are being short-sighted.

      I feel that they should keep their public discussions to the issues of which they have some understanding

      Is it really that public? I doubt the linux-kernel ML is that widely read, and it's not like these guys *asked* for their conversation to be posted on slashdot.

      I'd say that if there are philosophical and biological issues that are relevant to kernel development then the kernel hackers have a right to do their best to hack through them with their limited knowledge. It's not like a Ph.D. in this subject is going to post in lkml about this stuff.

      If I only spoke about what I was an expert on, I'd never speak, and I'd learn much more slowly as a result. Sometimes it is best to hold your tongue. Other times you need to put your half-finished thought out there because the community needs it to be finished and you can't do it yourself. And sometimes even the experts get so comfortable in their field that they don't see a good, new idea until an "ignorant" suggests it.

      -Erik
  • Design vs. Evolution (Score:5, Interesting)

    by Fnkmaster ( 89084 ) on Sunday December 02, 2001 @01:29PM (#2644019)
    Okay, perhaps I'm stepping out on a limb, but this thread is already jammed so nobody is likely to read my post anyway.


    I didn't read the whole rant of a thread with Linus et al. - but from my own experience and observation, EVERY successful project mixes both initial design and evolution, in design AND implementation. If you fix the design absolutely up front, at both the macro level AND that of every sub-system in a large project, you will invariably run into huge roadblocks at some point. Something will not work as planned. As I see it, the Linux Bazaar process reaps benefits when this happens - some person or organization stumbles into a roadblock with poor networking code, poor SCSI subsystem behavior at high loads, or an unreliable VM. These emergent behaviors may only affect some small portion of the user base - but the subsystems then enter an evolutionary phase where people variously fix what's there or design something new, and some design ends up surviving based on what the most people seem to like and want and, in the end, if all else fails, what Linus dictates.


    So no, this isn't strict "evolution" after the style of Darwin. If we let purely random decisions drive software and forked every few minutes, the analogy would be pretty complete. It would also take as long to write good software as it does to evolve a well adapted creature. An eternity.


    I see where the idea of selective breeding comes in - Linus sees himself and the kernel leading guns as picking and choosing the best patches and suggestions. Up to a point, this means they are exercising design and discretion, but they generally don't "assign" work from their central database of TODO tasks to IBM, Red Hat, and other individuals or organizations participating in kernel development - those organizations and individuals scratch their own itches and their work usually finds its way back into the kernel. Other posters accurately said that a more random evolution could be effected by letting people check in free-for-all into CVS. This is true, but I don't think that would necessarily improve the results and timeline of kernel development.


    You have to realize that the comparison here is, as others pointed out, to a monolithic software development process - in the Cathedral, a centralized decision is made - "we are going to make Windows NT better able to support large enterprise database deployments" - and a team is assigned to break it down and work through all the implications, then implement. In Linux-land, the interested parties don't call to schmooze with MS biz dev people who pass info down to technical guiding councils; they pony up and write their own patches to the subsystems they see need improvement. If there are enough interested parties, presumably enough patches will get submitted that the best from all get incorporated into the set of relevant subsystems that affect large enterprise database deployment, and we end up with a Linux kernel that supports exactly that. Of course, the primary difference is that at the same time, somebody else may have made complementary and/or conflicting changes to make Linux a better desktop OS. Chaos ensues and flames erupt on kernel-dev and wondrously, eventually, something better for everyone results after compromises are made.

    • By the way, I failed to note a fairly significant point - I've worked on projects that decided to design monolithically, with the design dictated by a single megalomaniac who insisted on dictating the whole design even though he obviously didn't understand the consequences of every piece of the system. The result? A product that doesn't really work, does a poor job for the original primary target market, and is not flexible enough to be retargeted at the new primary market (and the funny part is that most of the over-engineering was specifically aimed at making it 100% flexible). Overengineering is even worse than underengineering - underengineering will probably get you to a mediocre solution, but at least you didn't waste lots of time getting there and you can throw away the pieces that suck and rewrite them.
  • Software evolves, as Linus says, but many experienced software developers seem to struggle to understand this. I suspect that part of the problem is with the term "maintenance" as it applies to software. With software, maintenance means something different from what it means in engineering.


    In engineering, maintenance is performed for one purpose: to achieve homeostasis. For example, a building is maintained so that it remains standing, etc. With software, maintenance consists of homeostatic things (bug fixes), but also of things to enhance, or change, functionality. You would never add new storeys in the middle of a high-rise building, or modify a jet fighter to carry large amounts of freight. Yet changes like this do occur with software.


    And they always will occur. Or at least they should: software that is not receiving change requests is software that is dying. Be glad for those requests. And don't complain when users change their minds, or don't really know what they want. Users are people. In geek-speak, this means that they are not reprogrammable: you must deal with them as they are.


    When developers really accept this, they tend to accept that the correct paradigm really is evolution. I dream that more advocates of the engineering approach to software will someday be among those people.

  • by jonabbey ( 2498 ) <jonabbey@ganymeta.org> on Sunday December 02, 2001 @02:39PM (#2644162) Homepage

    I agree with Linus.. projects that I've spent several years on came out at the end with features and design elements I could never have predicted going in. I've spent 6 months doing design work on pen and paper at the start of a project, and during the years of implementation thereafter, far more 'design' was done by reacting to the state of the code in any given moment and the problems it was having both internally and with regard to the userbase. My biggest project has evolved tremendously, even though I was essentially the only coder working on it for most of its existence. I can't imagine, then, how much less 'designed' by any individual the linux kernel must be, with the hundreds or thousands of developers contributing to it.

    On the topic of Sun's doom, I understand why he says that. Sun's software is co-evolved with their hardware, but neither changes very quickly. Linux has to cope with a much more wild, much more genetically diverse hardware base, and as a result it tends to move faster to support new types of devices. Solaris on Intel is a joke compared with Linux on Intel in terms of its hardware support.

    Of course, there is nothing magical about a process that allows more evolutionary freedom.. if the hackers working on it don't have the good sense to be effective natural selectors and mutators, then the process won't have a terrific outcome. Linux is thriving because it has so darn many hackers working on it, and because it has so very, very many users using it, and because Linus has a deep and proper understanding of both good taste and evolution.

      Solaris on Intel is a joke compared with Linux on Intel in terms of its hardware support.

      Actually, it can be a bit of a joke on SPARC too - I have a SunBlade 100 sitting next to me that ships with a smartcard reader by default. And a note saying that, er, sorry about that, we don't have any Solaris drivers for the smartcard reader we included yet. As far as I can tell, there still aren't any, and this model's been out for months.

      However, I've just heard from a friend that if we were using OpenBSD 3 on the Sunblade, et voila, working smartcard reader. :)

  • Why do so many people read Linus's comments and try to simplify them further, missing the point completely? Obviously, when Linus says that the kernel evolves through "sheer luck", he is not trying to say that the changes made to Linux were not intentional. Each individual believes his change is good and necessary, and many are... Linus and his support staff, like Alan, are there to audit the changes and ensure that the ones that matter get in. In the end, however, it is not each individual change that moves Linux forward at the pace it does; it's the fact that moving all of the bits around finds "lucky" combinations that create sparks of genius, which in turn create new intentional changes that make some of those earlier changes - the ones that at the time seemed most important - irrelevant in the light of the newest revelation.
  • But that was all I could take. Still, a rather good philosophical discussion on software evolution. Now time to think of other things... hehehe

    -Restil
  • It is a mistake to believe that an individual who is experienced and gifted can't beat a group of similar individuals at the design of a system that may need strong application of Occam's Razor.

    Individuals are great for design -- particularly if they have some other individuals with whom they can communicate well for reality checks during design. Consider Seymour Cray's designs [ucsf.edu] -- not very complex by the standards of today's computer systems, but Cray's ability to pick a team and then listen well, combined with his individualistic design habits, led him to beat IBM's army of well-funded PhDs [geocities.com]. The problem with individuals is that there is a natural limit to the complexity an individual can fit in his head -- below which the internal bandwidths of an individual's mind are enormous enough that engineering tradeoffs can occur at rates vastly exceeding those allowed by the bottlenecks of verbal and/or literate communication.

    Similarly, it is a mistake to believe that once a gifted individual's limits are hit that a group of gifted individuals are going to be able to beat a broad evolutionary process in advancing the design.

    That's why the gifted individual designer's first and foremost design goal should be to maximize the evolutionary flexibility of his design -- so that the advantages of individualist design are maximally leveraged before complexity dictates that distributed evolution dominates further design.

    PS: As for Torvalds' understanding of evolution and breeding -- he underestimates the importance of niches. It is precisely the ability to fill niches that makes an evolutionary system viable. Consider, for instance, sexual reproduction's tendency, upon encountering the periphery of ecological ranges where population is sparse (like, ahem, Finland), to automatically inbreed and therefore express mutations -- most of which fail, of course. The point is that without expressing those mutations, the advantages of new genetic patterns can't translate into population increases at those peripheral ranges. Linux isn't a good example of this, since UNIX was a well-populated "ecological range", so Linus should take care not to generalize his insights derived therefrom too far.

  • Sounds like Linus is in accord with Dick Gabriel's "Worse is Better" essay [dreamsongs.com], or rather the "New Jersey" school of design. The "worse" package gets out quickly to a place where "natural [user/hacker] selection" can work on it.
  • I don't think I agree with the full analogy. Obviously, code is not randomly generated or selected for mutation. We use intelligence to know what's wrong, then we use knowledge to improve. What really evolves in good software is the design, not the code. Sometimes you have to start from scratch again to implement design changes.

    We could probably design a new, better human, but sheer evolution will _NEVER_ result in perfection. Design can perfect many small pieces of code. Combining these smaller pieces, one can achieve near perfection in a lot less time than by sheer luck. Evolution produces local minima, whereas design can find the absolute minimum error and move toward it much quicker. (Think of multilayer perceptron networks.)
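    The local-minima point can be made concrete with a toy experiment of my own (not from the thread): blind mutate-and-select search gets trapped in a shallow basin of a bumpy error function, while a deliberate sweep of the design space finds the true minimum.

```python
# Toy illustration: evolution (random mutation + selection) stalls in a
# local minimum; a designed, systematic search finds the global one.
import random

def error(x):
    # two basins: a shallow local minimum at x=0, a deep one at x=10
    return min(x ** 2 + 5, (x - 10) ** 2)

def evolve(x, steps=200, rng=random.Random(1)):
    for _ in range(steps):
        candidate = x + rng.uniform(-0.5, 0.5)  # small blind mutation
        if error(candidate) < error(x):         # selection keeps improvements
            x = candidate
    return x

stuck = evolve(0.0)  # starts in the shallow basin and never escapes it
designed = min((i / 100 for i in range(-500, 1500)), key=error)  # systematic sweep
```

    Here `error(stuck)` stays at 5 while `error(designed)` reaches 0: every small mutation away from the shallow basin looks worse, so selection alone never crosses over to the deep one.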
  • Powerful words (Score:3, Insightful)

    by ftobin ( 48814 ) on Sunday December 02, 2001 @06:46PM (#2644796) Homepage

    I must say, the following quote from Linus from the article is one that strikes me with fear and awe:

    Try to prove me wrong.

    When someone with the prestige that Linus has says something as powerful as this, I cannot help but feel that this topic is something that he is absolutely passionate about, much in the same way Stallman is passionate about Free Software. Linus doesn't seem like the type of person to use this sort of phrase on a whim; like he says, "I'm deadly serious".

  • The challenge is to come to terms with the fact that the bulk of humanity will never see the deep truth in what he is saying.

    The mythology of design is pervasive but just plain wrong. Design only ever happens in marginal increments. Quotes about standing on the shoulders of giants come to mind.

    A deeper challenge is that most people are incapable of understanding evolution, not because of any lack of inherent intelligence but because they haven't ever gotten out of the comfort zone.

    An interesting but neglected mid-80s paper by Marcia Salner - then at the Saybrook Institute in San Francisco, now at the University of Illinois at Springfield, as I see from a Google search I need to spend some time following up on - pointed out that it helps greatly to have got through some genuine crises: firstly to break our naive and seductive faith in the universality of right and wrong answers, and secondly to force us to look beyond the naive relativism which first replaces the right-wrong dichotomy.

    Evolution, be it biological, social, technical or whatever, is about what works in practice, and even more so about the uses made of its products, because evolution does not happen in a vacuum. (Yes I am using "vacuum" metaphorically. The real vacuum of 3D space is also highly evolved.)

    Now I find myself caught up with the even deeper challenge that if too many people actually believed what Linus is saying that the whole system would collapse. It seems only possible to build viable social institutions on rhetoric that does not stand scrutiny.

  • Left-corner design (Score:5, Interesting)

    by steveha ( 103154 ) on Sunday December 02, 2001 @07:16PM (#2644898) Homepage
    When I was in college, I read the book Software Tools in Pascal by Kernighan and Plauger. The most valuable thing I learned in college was the system of design set forth in that book, which the authors called "left-corner design".

    The idea is simple: when creating a program, start with the most important thing the program needs to do. Once you have that working, add more features. Ideally, as you go, you should be releasing working versions to whoever will be using your program.

    This is so right in so many ways. For one thing, if you run out of time during a project, at least you have something you can release, and it may very well do much of what the users need. (There is a line in the book to the effect of "80% of the problem solved now is better than 100% solved later.") Also, early feedback from the users can show you what's wrong with your design, before you write a whole bunch of code that you would later have had to rip out. (I seem to recall an example in the book where a large system spec turned out to be totally wrong; the users didn't know what they wanted until they had something to play with.)

    I never before noticed that the standard open-source development techniques match up with the left-corner methodology. Open-source projects such as Linux are all about "release early and often".

    When I read Linus's comments, I was nodding my head all over the place. You create some code that solves some problem, possibly not very well. You release it. Feedback and patches start to arrive, and the code grows, possibly in directions you never foresaw. The more popular the code gets, the more robust it gets, as people patch it to work in a wide variety of situations and on a wide variety of hardware. This is why Linux has come so far, so fast.
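    As a minimal sketch of left-corner development (my own toy example, not from the book): each release is shippable, and later releases extend rather than rewrite the working core.

```python
# Toy "left-corner" word-count tool: release 1 does the most important
# thing; release 2 layers on features without breaking release 1's contract.

def wc_v1(text):
    """Release 1: the core need -- count words."""
    return {"words": len(text.split())}

def wc_v2(text):
    """Release 2: same contract, plus lines and chars added from feedback."""
    result = wc_v1(text)  # reuse the working core instead of rewriting it
    result["lines"] = text.count("\n") + 1 if text else 0
    result["chars"] = len(text)
    return result

sample = "release early\nrelease often"
stats_v1 = wc_v1(sample)  # already useful on its own
stats_v2 = wc_v2(sample)  # a strict superset of v1's answer
```

    If the project had run out of time after release 1, users would still have the 80% that mattered.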

    steveha
  • Only thru _TIGHT_ CONTROL and superior foresight will quality software be written. Without a verbose design, a structured development process, a perfect testing procedure and, most importantly, superior direction, software will fail to be of any value to society.

    This universe might be based on pure uncertainty (as shown in quantum mechanics), but we as observers are completely outside of this random system and must structure this randomness into something consistent and predictable thru solid design. If not, progress will be slow and fraught with failures (evolution). Evolution is slow, and seeing as we are of the universe yet outside of it (we can observe), we need not be restricted by its shortcomings.

    Thru perfect top down control we will write perfect software which is second to none!

    On the other hand, maybe function precedes form. Maybe it is better to focus on the task at hand, allowing external events to dictate the direction, than to separate ourselves from the environment we are tending; after all, we create things to be _useful_ and not just to be used :)

    Maybe.. just maybe there is a point where control is harmful and hinders progress, maybe.. just maybe.. progress is unavoidable.
  • and all I can tell is from my G4 Darwin fucking rules!

    Linus sounds tired and irritable. I think that the adoption of his idea and the media's desecration of its intent have gotten to Linus, to the point that he's comparing himself to God, or a godlike figure who has created a thing that evolves according to a natural, hence random, process.

    I hope he gets the rest he needs.

    Meanwhile, go get yourself a Mac; this OS X is what I was thinking about when I installed RH5.1 all those years ago ('96?)
  • That's all design really is. You think it's about the Software Lifecycle, and writing specs until you're blue in the face? You're talking about paradigms of labour and documentation. But make no mistake - the act of design is the act of making decisions. Sometimes hard ones, sometimes easier ones. Some more important than others.

    And I submit to you that design is inherent to evolution.

    Evolution, in my view, is a process comprised of two cycling stages, as others have pointed out. Mutation is a random process, as random events cause (perhaps a number of) individuals in a species to develop a new trait. Selection is a process of deciding which "mutants" are able to reproduce and propagate.

    In biology, is there decision-making in mutation? Depends on what kind of mutation. If a gamma ray snips a DNA molecule, there's no decision made there - it just happens. But decisions can affect mutation. DNA researchers and biologists create mutants in labs every day. And as a society, we've accepted a technologically advanced quality of life that we know affects our environment and in turn affects us. What goes around comes around.

    Decision making takes a more active role in selection and propagation. In anthropology, we measure evolutionary success generally by the number of viable offspring produced by the variation. That means that a successful variation of a species in a world of scarce resources (such as food and useful time) manages and allocates its resources in such a way that it is able to have more children than other variations of the species, and thus have more influence on the future direction of the species. Successful management requires successful decision-making. Just try to manage without making a decision and you'll see. It doesn't matter if radioactive spiders turn whole packs of dogs into super-intelligent beings able to telepathically move fire-hydrants and build solid-gold toilets to drink out of - if those dogs decide to spend their time doing that and never have any puppies, they're an evolutionary dead-end. This is actually an issue that's been discussed in Anthro... people we see as being more successful in our society are having fewer kids than less successful people... anyhow, we see that decision-making (and thus, design) is not mutually exclusive with evolution and in fact plays a large role in it.

    In software, mutation could be described as a change to either the source code of a software "component" or the configuration of a collection of software components. Any such modification is a mutation of the software, whether intentional or not. Most changes in software, for good or ill, are intentional. Some are caused by gamma rays hitting storage devices and flipping bits, but more are made on purpose as acts that serve some purpose (bug-killing, optimization, etc.). So there is a decision there to serve the purpose via change. There's also a decision to either let a modification stand (because it serves the purpose, or because reversing the change is not worth it), or to revert to the pre-modification state. The decision is there even if it's only to ignore the issue. Decision-making and, by extension, design are present in the selection of software changes. You cannot separate design from software evolution, because you cannot separate the evaluation and decision-making process from the software development process. Doing so would amount to putting a million monkeys on a million consoles banging away and hoping Linux 3.0 magically results. Statistically it could happen, but animal control would have a cow.
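    The two-stage loop described above - random mutation proposing variants, a decision (selection) picking which one propagates - can be sketched as a toy program. This is a hypothetical illustration, not anything from the kernel thread; the names (`evolve`, `fitness`) and all parameters are made up for the example:

    ```python
    import random

    def evolve(target, pop_size=20, generations=500, mutation_rate=0.1, seed=42):
        """Toy mutation/selection loop: evolve random strings toward `target`."""
        rng = random.Random(seed)
        alphabet = "abcdefghijklmnopqrstuvwxyz "

        def fitness(s):
            # The "decision" criterion: how many characters match the target.
            return sum(a == b for a, b in zip(s, target))

        # Random starting material -- no design in the initial population.
        pop = ["".join(rng.choice(alphabet) for _ in target)
               for _ in range(pop_size)]
        for _ in range(generations):
            # Selection: keep the single fittest individual (a decision).
            best = max(pop, key=fitness)
            if best == target:
                return best
            # Mutation: random, undirected changes to copies of the survivor.
            pop = ["".join(rng.choice(alphabet) if rng.random() < mutation_rate
                           else c for c in best)
                   for _ in range(pop_size)]
        return max(pop, key=fitness)
    ```

    The mutation step is pure chance, but nothing propagates without the selection step deciding what "better" means - which is exactly where design sneaks in.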

    Linus originally decided to go with Rik's VM code for 2.4, then later switched to Andrea's code. Neither move was decided by a coin toss. Evolution? Yes. Design? Yes. It's both, and why can't it be both?

    I'll finish by quoting from "Modern C++ Design" by Andrei Alexandrescu (page 4):
    For any given architectural problem, there are many competing ways of solving it. However they may scale differently and have distinct sets of advantages and disadvantages...

    Designing software systems is hard because it constantly asks you to choose. And in program design, just as in life, choice is hard.

    Good, seasoned developers know what choices will lead to a good design. For a beginner, each design choice opens a door to the unknown. The experienced designer is like a good chess player: She can see moves ahead. This takes time to learn.

Outside of a dog, a book is man's best friend. Inside of a dog, it is too dark to read.