The Evolution of Linux
Taiko writes: "Kerneltrap.org has posted some of the more interesting messages from a recent kernel mailing list discussion. It started with a post on proper indentation, but turned into something a bit more. There are some posts by Linus and Alan Cox about the nature of design, computer science, Linux development, evolution, and more. Quite interesting and funny."
Co-Evolution of Linux and AI (Score:2, Insightful)
The role of Linux in the history of computer science will turn out to be that Linux kept the Open Source model _open_ on the inevitable pathway to Technological Singularity. [caltech.edu]
Take for example the latest hot Linux gadget, the Sharp SL-5000D Zaurus PDA for Developers [sharpplace.com] which runs both Linux and Java, and is therefore an appealing platform for the further development of Mind.JAVA Artificial Intelligence [angelfire.com] in the Linux environment -- everyman's last great hope of avoiding a catastrophic Microsoft take-over of the 'Net.
The world owes a lot to Linus Torvalds, Richard Stallman, Eric S. Raymond, Tim Berners-Lee and the countless other heroes of the Open Source futurity either posting here on SlashDot or toiling messianically away in obscurity.
Kansas school board rejects Evolution (Score:5, Funny)
This will not do good for the acceptance of Linux in the Bible Belt [chick.com] -- Linux evolved through natural selection [literature.org], while Windows was created by God [microsoft.com].
Re:Kansas school board rejects Evolution (Score:2, Informative)
The big difference between natural selection and evolution is that in natural selection you lose information (genes that aren't needed in that environment become the minority), whereas evolution says that we make information from nothing!
If you look at linux you can see that PEOPLE are making it, and then natural selection is killing off the parts that aren't perfect. If you take out the people who make it, then linux doesn't exist.
A better example would be that you bought a brand new (clean) computer and turned it on and off 1 billion times and expected it to boot into Linux/DOS/Windows because the bits might randomly produce an operating system. When you look at it this way it sounds crazy.
Basically the point I'm trying to make is that every operating system was made by a creator (or creators) and then refined by natural selection. I don't mind when people have opposing views as long as they are informed views.
Evolution
NOTHING -> SOMETHING -> NATURAL SELECTION + A LARGE NUMBER -> US
Linux
LINUS + PREVIOUS CODE -> NATURAL SELECTION + CODERS -> LINUX TODAY
There is a big difference.
Chaos works! (Score:2)
[back on topic] What a wonderful exchange of ideas! Linus has me convinced. OS/free software is about chaos (as in math). It's about an infinite number of monkeys on an infinite number of keyboards. Sure if you've got a small team and a tight time-frame you'll need to have tight control over the project. But linux is great because of the chaotic (as in math) processes which produce it.
Great stuff! (Score:5, Insightful)
I enjoy reading Linus' thoughts so much.
All around him, people try to make him or Linux more than it really is - and invariably, Linus brings it down a notch and puts it in perspective.
It's amazing that this guy gets constantly hero-worshiped, his baby created billions of dollars of wealth (at one point, at least), and yet he just keeps his feet firmly planted.
Compare that to the clowns that get high and mighty because they rUleZ at Quake, or on some IRC channel ... The geek community could learn a LOT from trying to emulate Linus' behaviour.
Re:Great stuff! (Score:2)
The geek community could learn a LOT from trying to emulate Linus' behaviour.
I'm trying, but when I tell people I don't care about Microsoft they call me an elitist. Well, actually I am an elitist, but what's that got to do with it?
Unrelated observation: stupid reporters ought to stop churning the same stereotypical questions and start reading the kernel mailing list if they want insight on Linus. His posts on the list are SO much more meaningful than what you can read in your standard Linus interview in Wired magazine.
Re:Great stuff! (Score:4, Interesting)
Let's go through Linus' claims:
Re:Great stuff! (Score:4, Interesting)
My turn to bite :-)
Linus is saying that Linux is evolving through countless small decisions. There is no One Big Plan. There are just ideas that are thought of. There's lots of code that is written. Some of it gets into the kernel, some doesn't but gives new ideas for better design.
If you'd read through the whole discussion, you'd notice how computer science was compared to alchemy. It's a young science that has years to go before something like genetics can be applied. And we still have years and years of research that has to be done before we understand genetics. There's lots of trial and error being done there too.
Actually Windows has evolved a lot. Just look how much it has changed since Windows 3.1. It sure succeeds because of marketing and aggressive business tactics, but that's only helping. MS wouldn't be able to compete against Linux with only Windows 3.1, no matter how aggressive they'd be. So don't underestimate the effort behind developing Windows.
Actually Windows evolved towards what people wanted in the nineties. But since Windows 98 it has also had a World Domination plan, which is not good for Windows in the long run. But as Microsoft has loads of cash, they can afford trial and error as long as they learn from their mistakes. And that's one thing they are good at.
Re:Great stuff! (Score:2, Insightful)
If Linux wants to be like Windows, that's fine. Windows stands for 'good-enough' and mediocrity.
Re:Great stuff! (Score:2)
I would say that the Unix skeleton that Linus so quickly dismisses counts as a One Big Plan. Look at all the horrible legacy that Linux carries from that: ports under 1024 require root privileges to bind, the entire ACL system, "everything is a file", mount points versus "drives", and a million other things.
If Linus' view of history were actually true then we wouldn't be seeing any of those things. We would instead see something that was a hodgepodge of Windows, Plan 9, Unix, BeOS, and other operating systems' features. That is not what Linux is today. Instead it is something Linus drives forward with his One Big Plan of a Unix-a-like Operating System.
His evolution-centric view also ignores the fact that society at large militates against natural selection. When was the last time you saw someone being selected against because they didn't have 20/20 vision? And the choice of an operating system takes place in exactly that same society.
Re:Great stuff! (Score:2, Insightful)
"Look at all the horrible legacy that Linux..."
Yes there are problems with linux, but they do not constitute a plan; it's just that they are better than anything else submitted. Just for the record, mount points are so much better than drives, IMHO.
"We would instead see something that was hodgepodge..."
Linux is a hodgepodge, just like Unix was. If some radical, and I mean radical, change in computing came along, do you think Linux would not adopt it?
"When was the last time you saw someone being selected against because they didn't have 20/20 vision?"
Pilots for a start; then it depends on how severe the disability. Blind people don't drive. Colour blind people cannot become electricians. Deaf people do not make very good opera singers, etc. etc. etc.
Re:Great stuff! (Score:2)
Compared to most commercial Unixes, and even to most other free Unixes (*BSD), Linux technically lags behind, is less advanced. Linus may claim that Sun is dead (I'm not so sure) but compared to Solaris, Linux has a long way to go.
However, Linux came at the right time, at the right place, and amongst others politics helped it (IBM and others found it good to use it as a weapon in the battle against Microsoft), and it had the luck that its competitors, the better designed NetBSD and FreeBSD, were having licensing problems at a crucial period in time.
Re:Great stuff! (Score:3, Insightful)
Linus claims that Linux has no guided direction and develops purely through evolution and luck. First off, I would view this as highly insulting if I were a major player like IBM, or even someone who has only minimally contributed. He is basically saying that these people are as useful as a random code generator. Even more importantly, his statement is not true.
I don't think he was saying that at all. In IBMs case they have a direction they are pushing and develop code in that area, but at the same time there is a vast array of people taking it in different directions. The net result is unplanned (as opposed to random).
The analogy to selective breeding is wrong. Yes, we can speed up evolution through selective breeding, but we are only changing minor traits. Sure, you can breed a dog with long hair, a short snout, and good temperament, but what if you want to breed a dog with feathers, or a fifth leg? At the very best, that would take an incredible amount of time. The better solution is to research and apply genetics. Let's apply that to the kernel... we can either let the SCSI mid-layer slowly evolve into something useful, or we can sit down and give it a good design phase and have something that works in a much shorter period of time.
Firstly, the lifecycle time of the kernel is down to a few days instead of years - secondly things do evolve - just look at the progress of the VM (either one). First attempt didn't get it quite right, so then there are some patches and things get a bit better, but something else is bust (etc etc). This seems quite close to the breeding approach (but is only one of a number of parallel directions for the kernel).
Windows does not succeed because of evolution and a deep gene pool. Windows succeeds because of 1) marketing, 2) aggressive business tactics against competitors, and 3) it's not so buggy that it's totally unusable.
I don't think Windows succeeded because of evolution either - it was a major mutation which occurred at a certain lucky point in history and wiped out most of its competitors. [Sure marketing and business tactics helped - but the real winner was the GUI interface (against DOS) and the fact that apple didn't open up their hardware whereas IBM did]. Don't we have something like this in the natural world? [My brain is addled with the thought of the mutant in Asimov's Foundation series]. The big question though is - in the long term will it continue to evolve fast enough to keep up when pressed with alternative species (like linux and the speed with which it is evolving!)
Re:Great stuff! (Score:2)
Actually, IBM sued Compaq for reverse engineering the PC BIOS and creating clones. Compaq just happened to win. It's not so much that IBM opened their hardware as that it was opened in spite of them. If Compaq had decided to clone Apple machines instead, the story might have been far different.
Re:Great stuff! (Score:3, Insightful)
The main point you missed, though, is that Linus is right at the macro level -- there is no overall design process for Linux as a system or an overall direction for Linux.
At the kernel subsystem level, there's plenty of design, and plenty of goals, and plenty of localized direction. In the filesystem space, there was a lot of buzz around journalling filesystems. In the MM department, we had something more akin to controlled chaos... :-) And yes, the SCSI layer could use some actual careful design work.
There was no overarching goal "We must optimize for market X" that drove any of this. Sure, some people want to run Linux on huge machines, and so they want journalling. Other people want to shove Linux into wristwatches and PDAs, and so they instead want to focus on memory footprint. And still others care about interrupt latency over throughput. So each little care-about niche has its own little projects that pull Linux in lots of different directions at the macro level. Each individual project is very directed, and some have significant design work. But none of it is directed from On High as part of the Grand Plan for The System.
--Joe
Re:Great stuff! (Score:2)
I think you are missing the point of what he said.
Here is some more from the discussion:
"It's "directed mutation" on a microscopic level, but there is very little macroscopic direction."
and
"I'd much rather have "brownian motion", where a lot of microscopic directed improvements end up pushing the system slowly in a direction that none of the individual developers really had the vision to see on their own."
Certainly the smaller "details" are directed, but I think the point Linus is trying to make is that, from the perspective of where the kernel was at version 1.0 and where it will be at v. 5.0, its macro direction is ludicrous to try to predict/design/direct. So yes, its path is directed, but much more so in the micro sense rather than the macro sense.
bullshit (Score:2)
Total bullshit. Linus said it is somewhat directed evolution, but evolution nonetheless, which is why Linux has turned into (and continues to turn into) something he had never intended.
Your point 2
Totally ridiculous. Linus made a simple loose analogy, and you are taking it WAY too literally.
Your point 3
WTF? This has NOTHING TO DO WITH ANYTHING. The fact that MS succeeds through aggressive marketing and business tactics has nothing to do with anything, AT ALL, period! How is this relevant to anything at all in the discussion?
Re:Great stuff! (Score:2)
That's an interesting statement. It seems to presuppose that the only real kind of "wealth" has a dollar figure and is based on stock prices. The thing about high-tech stock prices is that they are based largely on speculation, especially for emerging industries. High-tech stocks are usually valued at 20 to 50 times their company's annual profits, though, of course, most dot-coms never actually made a profit.
Free and open-source software kind of short-circuits the conventional model of "wealth". For things that aren't given away for free, their real value is measured by the number of people who are willing to pay a certain amount for them, and after the exercise of selling something is carried out, you count up the dollars and see how much "wealth" was involved.
But things that are given away still have value, and there is still a virtual amount of money that all of the Linux users and businesses around the world would be willing to pay for it. There's no good way to count this up, but I have little doubt that it would be in the billions of dollars.
If a tree falls in the forest and nobody hears it, does it make a sound? Of course it does.
Some people might think that Bill Gates is less evil because he has donated billions of dollars to charities. Linus has donated billions of dollars worth of "wealth" to the world also.
But then, there's also more to life than just "wealth". Wealth is only a means to achieving a high standard of living. Has Linux improved your standard of living? Has free software? Is there anyone who has ever used the Internet who hasn't made use of free or open-source software?
Re:Great stuff! (Score:2, Funny)
Too much back patting.. (Score:4, Insightful)
Re:Too much back patting.. (Score:2)
Re:Too much back patting.. (Score:2)
I might not have read it if Linus weren't involved (I might have nevertheless, 'cause Alan Cox is involved).
Re:Too much back patting.. (Score:2)
You are right, it is a bit of hero worship, but it is also true that Linus Torvalds and Alan Cox are among the greatest programmers in the world. Those of us with lesser skill can only gain from their insights. Although I would not be interested in listening to you and your friends talk about the meaning of life, I would very much like to listen to Emerson and Thoreau or Freud and Jung speak on the subject.
Re:Too much back patting.. (Score:2)
Ever heard of a guy named David Cutler?
Oh please, all he proved was that if you strap a big enough jet engine (MONEY) to a brick (WINDOWS), it will fly.
Sometimes evolution is necessary (Score:5, Interesting)
Can anyone really say that computing as a field or science was designed? What we have today is the result of a form of evolution and a result of a market economy. Nobody knew where we were going, we just started going someplace.
The company I work for has spent the past 4 years slowly evolving a fairly complex graphics and haptic (see: Intelligent Scalpels Through Touch Technology [slashdot.org] for more about haptics) API. At the start we had only a vague idea of what it should be like. We knew from our experiences in graphics that it should be scene-graph based -- so we borrowed the VRML design. We knew that we wanted to be able to do a few things with it. This gave us the basic framework to start with, much like Linus had with Linux.
Then we basically evolved the product. Every time we worked on a project that used the API, we learnt more about what it was good at and what it lacked. We modified it, fixed things, extended it with new features. After 4 years we have something far better than we could ever have dreamed of designing.
The most important reason for using this approach was not because we believed in an evolutionary approach to software engineering (I don't think that Linus' advice should be taken too literally). It was because we were dealing with making an API out of cutting-edge research - much of which hadn't been done when we started. We simply couldn't have designed it.
Ecosystem biased against small players? (Score:5, Insightful)
Following that thread, can I now propose Linus' Law:
Any software system with a large enough user base can rely on the accumulated experience of its users to add features, and also pick up ideas from smaller systems now and then (at a very low incremental effort).
Corollary. The onus is on the smaller players to come up with new features to distinguish themselves from the masses -- but ultimately it's no-win for them because their *really useful* ideas will be subsumed into more popular systems anyway ... only a matter of time.
I need sleep and I'm quite possibly not thinking straight, but am I right in thinking this would create enormous pressures for specialized players like Sun and Apple (and Be, as they found out) in the long term?
If that is the case, where does that leave the "small is beautiful" rule? Does it mutate to "small is beautiful, provided you are part of a *big* idea that has incredible amounts of 'traction'"?
Re:Ecosystem biased against small players? (Score:4, Insightful)
I'm not sure the number of users is important so much as the number of developers/contributors. Or if Linus is correct, the number of developers with different agendas.
In fact the whole debate starts to sound a bit like ESR's Cathedral & Bazaar.
Re:Ecosystem biased against small players? (Score:2)
User group A could be CS types who'd see nothing wrong with compiling odd-numbered kernels for breakfast and who drool over things like CML2. User group B could be the Mandrake-using types (or even Mac-using types).
End result: a variety of users leads to a variety of solutions, which ultimately enriches the platform. One downside: there's (sometimes massive) duplication of effort (KDE/Gnome).
Re:Ecosystem biased against small players? (Score:2)
Not in general with proprietary software. A lot of people use Sun hardware for a lot of different things, but only one group of developers (or one 'agenda', assuming those developers are kept on a tight leash) actually gets direct input. So if Sun management decides that massively parallel SMP boxes are where the money is, users who want to cluster a few hundred 2-processor systems together get less attention.
Re:Ecosystem biased against small players? (Score:2)
Any software system with a large enough user base can rely on the accumulated experience of its users to add features, and also pick up ideas from smaller systems now and then (at a very low incremental effort).
Corollary. The onus is on the smaller players to come up with new features to distinguish themselves from the masses -- but ultimately it's no-win for them because their *really useful* ideas will be subsumed into more popular systems anyway ... only a matter of time.
This also reminds me of what Judge Jackson described in the section of his findings of fact [usdoj.gov] against Microsoft: "Barrier to Entry"
Evolution in software is not clearly defined. (Score:2, Insightful)
being able to keep yourself alive.
What does survival mean in software terms? Does it mean that you make the most money (Microsoft?), that you get to have the most users, that you endure in time and get written into textbooks, that you show clear technical superiority?
I think that any of these can be taken as proof of "survival" of a software project, yet the fact that MS-DOS lasted extremely long and became extremely popular cannot possibly mean that it is something we want to copy or admire.
An argument that I would happily accept is that evolution exists in the linux-world as the result of survival of different linux ideas/implementations (e.g. new VM, new low-latency etc) in the linux user subspace. Now, the linux user space is a group of technically aware people (?!) and evolution of different linux variants in that space can be said to be constructive in a technical sense, thus producing real progress.
This process cannot universally guarantee software quality (from a purely technical standpoint).
P.
Re:Evolution in software is not clearly defined. (Score:2)
Humans are like Microsoft! (Score:2, Interesting)
--exactly what it means in biology. Things that survive from a biological point of view are not necessarily good or better. Sometimes they are; sometimes they're not. Humans survived because they were able to overcome certain hardships created by the world. But I don't exactly admire humans; if you read Ishmael by good old Mr. Quinn, he clearly (as do I) dislikes human nature, despite the fact that we can't avoid it. Humans do the exact same thing that Microsoft does: they kill everything around them, and take more than they need.
There was one thing that Bill Gates did not foresee: the advent of a FREE OS... something that he could not counter. The human race (analogous to M$) has killed everything, and eventually there will come a species that cannot be killed off (in my opinion this will be the sentient AI that I, err... I mean people, will create). However, until there comes along something analogous to linux, humans will continue to dominate.
QED
linus approved kernel hacking procedure (Score:5, Funny)
Re:linus approved kernel hacking procedure (Score:2, Funny)
"If it compiles, it's good. If it boots, it's perfect!"
I don't agree completely with Linus (Score:4, Insightful)
Biological selection does nothing except removing the weak ones, it cannot automatically create systems which work well.
In short, I believe the biological selection is just that, selection. The creation of stuff will need some direction.
And I have to nod vigorously to that. Even taking the model of accelerated evolution through human breeding of species: you direct two animals together to breed. You don't just let the Ps, the F1s, the F2s, etc. just all wander around in a pen, have a sniper sitting on a post shooting the ones you don't want, and hoping the rest go at it...
crap!!! (Score:3, Insightful)
you don't understand the concepts of evolution, and neither does mr. van riel.
biological selection (actually, the terminology is "natural selection") does not work by weeding out the weak ones. natural selection favours the multiplication of successful ones (ie 'survival of the fittest').
the argument you (and rik van riel) are using is essentially the same as the one most creationists use: mutation can only break down and not build up.
this is wrong. read some darwin before you comment on this stuff please.
regards,
meneer de koekepeer
The speed of evolution (Score:2, Insightful)
Another important point is that in this evolution - though on some level about "survival of the fittest" - there is a certain level of continuous "trial and error". This is in fact the way most programming - and learning - is done, and this is done through the lifecycle. In real life, DNA can't remember actions carried out by its owners.
Evolution vs. Creation (Score:5, Funny)
Us: "God! At last we have found you! Now tell us, please... WHY ARE WE THE WAY WE ARE? WHY ARE WE HERE?"
God: "I dunno. I created you to eat the lions, and you just kinda got out of hand"
This Happened to Me (Score:2)
I was about to give him a really witty answer but the power blinked, and that was that. Too bad I didn't bother to record the co-ordinates :-(
Don't do it Larry! (Score:2, Funny)
Don't take it too hard, Larry. Stay with us!
Rik's thoughts (Score:2)
Subject: Re: Coding style - a non-issue
[...]
Biological selection does nothing except removing the weak ones, it cannot automatically create systems which work well.
[...]
He's just got a thing on CS (Score:2, Insightful)
The problem with that witty Finn is that he apparently was forced to endure a few real bad CS classes back in Helsinki.
He's wrong, of course. Whatever works in Linux works because at some point somebody did some serious thinking before starting to spew out code. Planning data structures. Maybe even read about how others tackled the problem.
That's called Design. In a few areas Linux seriously lacks design, and it shows.
f.
Re:He's just got a thing on CS (Score:2, Interesting)
I understand Linus to be saying that he didn't foresee things like the iPaq or the OS/390 port when he started his terminal program in '91. That's strategic. That doesn't preclude him from designing an API or a data structure. That's tactical.
Linux does not evolve like species at all (Score:2, Interesting)
Bad comparison with structual engineering (Score:2, Insightful)
"Engineering does not require science. Science helps a lot but people built perfectly good brick walls long before they knew why cement works."
To me, this seems to be a very poor analogy. The fact is that before the widespread use of maths and materials science in structural engineering ('building a good wall'), structural engineering didn't really exist at all - there were just builders and designers. 'Engineering' only really began when the science was added; before, it was an art or trade. As for building a 'perfectly good wall', yes, the walls did indeed usually stand up; but:
a) Not always. Take the case of medieval cathedrals. In order to stop the weight of the roof pushing the walls apart, the walls had flying buttresses built for support; however, in some cases the buttresses actually were so big that they collapsed the wall in the other direction!
b) Not very efficiently. Because the builders were unable to optimise their designs, buildings were often very wasteful of materials.
c) How many medieval skyscrapers were there? You just can't build many of today's huge structures without 'sciencey' engineering.
All in all, I think Alan would have been better advised not to compare it to building a wall; the problem is more that an operating system has such wide scope and enormous complexity (due to different areas of code affecting each other), as well as needing to be flexible enough to change over time, that it isn't feasible to design the whole system as you would a dam or skyscraper.
Chris Cunningham
From the /. perspective.... (Score:2)
-Spackler
I'm not claiming to be deep, I'm claiming to do it for fun. -Linus
Stop, you're both right (Score:2)
O.K. My $0.02 (Score:5, Interesting)
Sounds like a couple of harsh extremes to me.
Of course software is designed. But this does not mean that the design is complete, correct, or optimal. And that's where evolution comes in.
All these people who scoff at formal design do have a point: so many times so called formal designs end up being one way paths to the wrong thing.
The formal design advocates respond by saying, "well, you didn't have a correct design." A fat lot of good that does. I've been part of development teams where there is this mantra of design it, check it, double check it, let's not do anything until the design is complete, because failure is uncorrectable. And you end up progressing e v e r s o s l o w l y. This is design by perfection -- the idea is to be so careful about the design that it can't be flawed.
Of course, this never works. Nobody can make anything non-trivial right the first time around. It requires some kind of step-wise refinement. Now, this does not mean the design should be abandoned, but one should design in anticipation of making mistakes. Then, the design permits the local correction of errors, without them becoming a global fiasco.
Design for flexibility, then: separate APIs from implementations. Version your APIs so when they're lacking you can produce a new back-compatible version. Don't know all the details about every possible kind of device? Gee, throw an open-ended IOCTL into the device control API. Refine IOCTLs for similar devices later, when we figure out what they need besides the basics.
The point is that it is possible to design adaptable and refinable systems in order to accommodate the inevitable "oops" with a fix that is local and not global in nature. Now, you can't be flexible in everything, and sometimes correcting things hurts: witness the Linux VM. It wasn't really planned to abstract its API away to allow for interchangeable plug-ins, was it? And the VM wars were somewhat painful precisely because one had to choose and couldn't punt.
Nevertheless, experienced software designers try to provide an "out" whenever they can, and think that a particular course might require modification in the future.
Re:O.K. My $0.02 (Score:2)
The point of course, is to guide the evolutionary process by letting it flourish where there is greatest uncertainty. People familiar with animal husbandry (insert ob. ignorant wisecrack comparing this to bestiality) [1] know all about this.
[1] For those not familiar with the term, "animal husbandry" refers, in basic terms, to the breeding of animals for specific traits.
Re:O.K. My $0.02 (Score:2)
Now, this isn't perfect: you can't punt everything, and teams that try never finish a design and get to implementation.
Design is a tool to help you minimize error. It isn't perfect but it does have a beneficial effect. So, it stands to reason that application of design processes to try to predict and manage errors will have a beneficial effect, but will not be perfect either.
Population size, not evolution. (Score:2)
[Linus]
> Quite frankly, Sun is doomed. And it has
> nothing to do with their engineering practices
> or their coding style.
It may have everything to do with evolution, but only because the baby is growing up, and the engineers have little or nothing new to toss out of the womb.
Sun made middleware happen; now MS is cloning it and taking their one big chance. Less evolution and more population.
Sun and SGI could have made small, fairly inexpensive game cubes years ago -- cubes that could have doubled as engineering workstations or even clustered. They chose not to, going for the server market. Neither has seriously approached the Asian manufacturing giants. Poor thought processes up front!
IMHO, of all the Unix giants, Irix and SGI had the best chance to make it big -- Irix, for all its flaws, did the best job ever of hiding Unix, and they did it many years ago. Too bad they dropped the ball and failed to make a consumer device with the help of Asia.
MS rises, and continues to rise. It may be evolution in the end, but it's the overwhelming size of their population, not the superiority of product.
Add in the failure of the USA to enforce its laws, add in the poor strategies of the big-iron Unix corps, and there you have it. Little evolution, since there was never a competitive population.
really? (Score:2, Interesting)
If you want to see a system that was more thoroughly _designed_, you
should probably point not to Dennis and Ken, but to systems like L4 and
Plan-9, and people like Jochen Liedtke and Rob Pike.
And notice how they aren't all that popular or well known? "Design" is
like a religion - too much of it makes you inflexible and unpopular.
I hardly think that plan9's unpopularity is down to the fact that it's been well designed!
working in it is a joy. It suffers from a lack of a good web browser (not exactly a small undertaking) and 23 char filenames (wave bye bye to those real soon now [tm])
but I guess not everyone likes design. I'm sure most ppl reading this are in an untidy hell hole of a room. If you've not got some dirty crockery within reaching distance of you then I doff my hat to you.
but good design brings pleasure, and working with plan9 brings more joy than frustration.
linux is winning not because it's a great piece of software but rather because of one of those historical flukes of being in the right place at the right time and capturing people's imagination. Feeding my pc with my first slackware floppy disk set was liberating, and discovering the joy of hitting co-operate rather than default has justly brought its reward.
but hey, come on, keep your mind open. there's always a spare pc lying around, spend an evening with something else for a change.
http://plan9.bell-labs.com/plan9
Philosophy vs. Software Development (Score:2, Insightful)
However, with respect to their opinions on philosophy (and biology), they are, as a previous poster commented, quite undergraduate. Actually I might be inclined to say worse about them, as I am self-educated beyond high school, and I am aware of a much broader world of philosophy (and biology) than they seem to be.
Actually, it reminds me of nothing so much as Alan Cox's posturing on the DMCA, where my opinion was that people who do not understand such issues at all should refrain from making lawyerly or political comments in a broader public forum where they have respect that is not merited for the comments they are making.
While it is nice to see these people expressing interest in broader topics, I feel that they should keep their public discussions to the issues of which they have some understanding, namely software development. All that can come of their ponderings otherwise is to spread their ignorance further than they already have.
Re:Philosophy vs. Software Development (Score:2)
I don't disagree, but I'd be interested to hear where you think they are being short-sighted.
I feel that they should keep their public discussions to the issues of which they have some understanding
Is it really that public? I doubt the linux-kernel ML is that widely read, and it's not like these guys *asked* for their conversation to be posted on slashdot.
I'd say that if there are philosophical and biological issues that are relevant to kernel development then the kernel hackers have a right to do their best to hack through them with their limited knowledge. It's not like a Ph.D. in this subject is going to post in lkml about this stuff.
If I only spoke about what I was an expert on, I'd never speak, and I'd learn much more slowly as a result. Sometimes it is best to hold your tongue. Other times you need to put your half-finished thought out there because the community needs it to be finished and you can't do it yourself. And sometimes even the experts get so comfortable in their field, they don't see a good, new idea until an "ignorant" suggests it as an under-understood idea.
-Erik
Design vs. Evolution (Score:5, Interesting)
I didn't read the whole thread rant with Linus et al. - but from my own experience and observation EVERY successful project mixes both initial design and evolution in design AND implementation. If you fix the design absolutely up front at both the macro level AND of every sub-system in a large project, you will invariably run into huge roadblocks at some point. Something will not work as planned. As I see the Linux Bazaar process, it reaps benefits when this happens - some person or organization stumbles into a roadblock with poor networking code, poor SCSI subsystem behavior at high loads, or an unreliable VM. These emergent behaviors may only affect some small portion of the user base - but the subsystems then enter an evolutionary phase where people varyingly fix what's there or design something new, and some design ends up surviving based on what the most people seem to like and want and in the end, if all else fails, what Linus dictates.
So no, this isn't strict "evolution" after the style of Darwin. If we let purely random decisions drive software and forked every few minutes, the analogy would be pretty complete. It would also take as long to write good software as it does to evolve a well adapted creature. An eternity.
I see where the idea of selective breeding comes in - Linus sees himself and the kernel leading guns as picking and choosing the best patches and suggestions. Up to a point, this means they are exercising design and discretion, but they generally don't "assign" work from their central database of TODO tasks to IBM, Red Hat, and other individuals or organizations participating in kernel development - those organizations and individuals scratch their own itches and their work usually finds its way back into the kernel. Other posters accurately said that a more random evolution could be effected by letting people check in free-for-all into CVS. This is true, but I don't think that would necessarily improve the results and timeline of kernel development.
You have to realize that the comparison here is, as others pointed out, to a monolithic software development process - in the Cathedral, a centralized decision is made - "we are going to make Windows NT better able to support large enterprise database deployments" and a team is assigned to break it down and work through all the implications, then implement. In Linux-land, the interested parties don't call to schmooze with MS biz dev people who pass info down to technical guiding councils, they pony up and write their own patches to the subsystems they see need improvement. If there are enough interested parties, presumably enough patches will get submitted that the best from all get incorporated into the set of relevant subsystems that affect large enterprise database deployment, and we end up with a Linux kernel that supports exactly that. Of course the primary difference is that at the same time, somebody else may have made complementary and/or conflicting changes to make Linux a better desktop OS. Chaos ensues and flames erupt on kernel-dev and wondrously, eventually, something better for everyone results after compromises are made.
Re:Design vs. Evolution (Score:2)
Evolution, Maintenance, and Engineering (Score:2)
In engineering, maintenance is performed for one purpose: to achieve homeostasis. For example, a building is maintained so that it remains standing, etc. With software, maintenance consists of homeostatic things (bug fixes), but also of things to enhance, or change, functionality. You would never add new storeys in the middle of a high-rise building, or modify a jet fighter to carry large amounts of freight. Yet changes like this do occur with software.
And they always will occur. Or at least they should: software that is not receiving change requests is software that is dying. Be glad for those requests. And don't complain when users change their minds, or don't really know what they want. Users are people. In geek-speak, this means that they are not reprogrammable: you must deal with them as they are.
When developers really accept this, they tend to accept that the correct paradigm really is evolution. I dream that more advocates of the engineering approach to software will someday be among those people.
Linus is so very way right (Score:5, Interesting)
I agree with Linus.. projects that I've spent several years on came out at the end with features and design elements I could never have predicted going in. I've spent 6 months doing design work on pen and paper at the start of a project, and during the years of implementation thereafter, far more 'design' was done by reacting to the state of the code in any given moment and the problems it was having both internally and with regard to the userbase. My biggest project has evolved tremendously, even though I was essentially the only coder working on it for most of its existence. I can't imagine, then, how much less 'designed' by any individual the linux kernel must be, with the hundreds or thousands of developers contributing to it.
On the topic of Sun's doom, I understand why he says that. Sun's software is co-evolved with their hardware, but neither changes very quickly. Linux has to cope with a much more wild, much more genetically diverse hardware base, and as a result it tends to move faster to support new types of devices. Solaris on Intel is a joke compared with Linux on Intel in terms of its hardware support.
Of course, there is nothing magical about a process that allows more evolutionary freedom.. if the hackers working on it don't have the good sense to be effective natural selectors and mutators, then the process won't have a terrific outcome. Linux is thriving because it has so darn many hackers working on it, and because it has so very, very many users using it, and because Linus has a deep and proper understanding of both good taste and evolution.
Re:Linus is so very way right (Score:2)
Actually, it can be a bit of a joke on SPARC too - I have a SunBlade 100 sitting next to me that ships with a smartcard reader by default. And a note saying that, er, sorry about that, we don't have any Solaris drivers for the Smartcard reader we included yet. As far as I can tell, there still aren't, and this model's been out for months.
However, I've just heard from a friend that if we were using OpenBSD 3 on the Sunblade, et voila, working smartcard reader.
Why do so many people miss the point? (Score:2, Insightful)
got through half of it. (Score:2)
-Restil
Individuals vs Groups (Score:2)
Individuals are great for design -- particularly if they have some other individuals with whom they can communicate well for reality checks during design. Consider Seymour Cray's designs [ucsf.edu] -- not very complex by the standards of today's computer systems, but Cray's ability to pick a team and then listen well, combined with his individualistic design habits, led him to beat IBM's army of well funded PhDs [geocities.com]. The problem with individuals is that there is a natural limit to the complexity that an individual can fit in his head -- even though the internal bandwidths of an individual's mind are enormous enough that engineering tradeoffs can occur at rates vastly exceeding those allowed by the bottlenecks of verbal and/or literate communication.
Similarly, it is a mistake to believe that once a gifted individual's limits are hit that a group of gifted individuals are going to be able to beat a broad evolutionary process in advancing the design.
That's why the gifted individual designer's first and foremost design goal should be to maximize the evolutionary flexibility of his design -- so that the advantages of individualist design are maximally leveraged before complexity dictates that distributed evolution dominates further design.
PS: As for Torvalds' understanding of evolution and breeding -- he underestimates the importance of niches. It is precisely the ability to fill niches that makes an evolutionary system viable. Consider, for instance, sexual reproduction's tendency to, upon encountering the periphery of ecological ranges where population is sparse (like, ahem, Finland) automatically inbreed and therefore express mutations -- most of which fail, of course. The point is that without expressing those mutations the advantages of new genetic patterns can't translate into population increases at those peripheral ranges. Linux isn't a good example of this, since UNIX was a well-populated "ecological range", so Linus should take care not to generalize too far his insights derived therefrom.
I Think We've Been Here Before (Score:2)
Evolution of Design, not code. (Score:2, Interesting)
We could probably design a new, better human, but sheer evolution will _NEVER_ result in perfection. Design can perfect many small pieces of code. Combining these smaller pieces, one can achieve near perfection in a lot less time than sheer luck. Evolution produces local minima, where design can find the absolute minimum error and move toward it much quicker. (Think of multilayer perceptron networks.)
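The local-minimum point can be shown with a toy sketch (not from the post; the error function, names, and step size here are all invented for illustration): a blind "accept any improving step" search settles into whichever valley it happens to start near, while a designer who can see the whole surface just picks the deepest one.

```python
def error(x):
    # Toy "error surface" with two valleys: a deeper one near x = -1.04
    # and a shallower one near x = 0.96 (the 0.3*x term tilts the bowl).
    return x**4 - 2*x**2 + 0.3*x

def greedy_descent(x, step=0.01, iters=10000):
    """Blind local search: accept any small move that lowers the error,
    and stop when no neighbouring move helps -- i.e. at a local minimum."""
    for _ in range(iters):
        for nxt in (x - step, x + step):
            if error(nxt) < error(x):
                x = nxt
                break
        else:
            return x  # stuck: neither neighbour improves
    return x

stuck = greedy_descent(2.0)    # slides into the shallow right valley
lucky = greedy_descent(-2.0)   # slides into the deeper left valley
```

Both runs follow exactly the same rule; only the starting point differs, which is why `error(lucky) < error(stuck)` after the dust settles.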
Powerful words (Score:3, Insightful)
I must say, the following quote from Linus from the article is one that strikes me with fear and awe:
When someone with the prestige that Linus has says something as powerful as this, I cannot help but feel that this topic is something that he is absolutely passionate about, much in the same way Stallman is passionate about Free Software. Linus doesn't seem like the type of person to use this sort of phrase on a whim; like he says, "I'm deadly serious".
For those who understand how right Linus is (Score:2, Interesting)
The challenge is to come to terms with the fact that the bulk of humanity will never see the deep truth in what he is saying.
The mythology of design is pervasive but just plain wrong. Design only ever happens in marginal increments. Quotes about standing on the shoulders of giants come to mind.
A deeper challenge is that most people are incapable of understanding evolution, not because of any lack of inherent intelligence but because they haven't ever gotten out of the comfort zone.
An interesting but neglected mid-80s paper by Marcia Salner (then at the Saybrook Institute in San Francisco, now at the University of Illinois at Springfield, as a Google search tells me, and one I now need to spend some time following up on) pointed out that it helps greatly to have got through some genuine crises: firstly to break our naive and seductive faith in the universality of right and wrong answers, and secondly to force us to look beyond the naive relativism which first replaces the right-wrong dichotomy.
Evolution, be it biological, social, technical or whatever, is about what works in practice, and even more so about the uses made of its products, because evolution does not happen in a vacuum. (Yes I am using "vacuum" metaphorically. The real vacuum of 3D space is also highly evolved.)
Now I find myself caught up with the even deeper challenge that if too many people actually believed what Linus is saying, the whole system would collapse. It seems only possible to build viable social institutions on rhetoric that does not stand scrutiny.
Left-corner design (Score:5, Interesting)
The idea is simple: when creating a program, start with the most important thing the program needs to do. Once you have that working, add more features. Ideally, as you go, you should be releasing working versions to whoever will be using your program.
This is so right in so many ways. For one thing, if you run out of time during a project, at least you have something you can release, and it may very well do much of what the users need. (There is a line in the book to the effect of "80% of the problem solved now is better than 100% solved later.") Also, early feedback from the users can show you what's wrong with your design, before you write a whole bunch of code that you would later have had to rip out. (I seem to recall an example in the book where a large system spec turned out to be totally wrong; the users didn't know what they wanted until they had something to play with.)
I never before noticed that the standard open-source development techniques match up with the left-corner methodology. Open-source projects such as Linux are all about "release early and often".
When I read Linus's comments, I was nodding my head all over the place. You create some code that solves some problem, possibly not very well. You release it. Feedback and patches start to arrive, and the code grows, possibly in directions you never foresaw. The more popular the code gets, the more robust it gets, as people patch it to work in a wide variety of situations and on a wide variety of hardware. This is why Linux has come so far, so fast.
steveha
Linus is wrong. (Score:2)
This universe might be based on pure uncertainty (as shown in quantum mechanics) but we as observers are completely outside of this random system and must structure this randomness into something consistent and predictable thru solid design. If not, progress will be slow and fraught with failures (evolution). Evolution is slow and seeing as we are of the universe, yet outside of it (we can observe), we need not be restricted by its shortcomings.
Thru perfect top down control we will write perfect software which is second to none!
On the other hand, maybe function precedes form. Maybe it is better to focus on the task at hand, allowing external events to dictate the direction, than to separate ourselves from the environment we are tending; after all, we create things to be _useful_ and not just to be used.
Maybe.. just maybe there is a point where control is harmful and hinders progress, maybe.. just maybe.. progress is unavoidable.
A lot of talk about Darwin (Score:2)
Linus sounds tired and irritable. I think that the adoption of his idea and the media's desecration of its intent have gotten to Linus to the point that he's comparing himself to God, or a Godlike figure who's created a thing that evolves according to a natural, hence random, process.
I hope he gets the rest he needs.
Meanwhile, go get yourself a Mac -- this OS X is what I was thinking about when I installed RH5.1 all those years ago ('96?)
Design is Decision-Making. (Score:2, Insightful)
And I submit to you that design is inherent to evolution.
Evolution, in my view, is a process comprised of two cycling stages, as others have pointed out. Mutation is a random process, as random events cause (perhaps a number of) individuals in a species to develop a new trait. Selection is a process of deciding which "mutants" are able to reproduce and propagate.
In biology, is there decision-making in mutation? Depends on what kind of mutation. If a gamma ray snips a DNA molecule, there's no decision made there - it just happens. But decisions can affect mutation. DNA researchers and biologists create mutants in labs everyday. And as a society, we've accepted a technologically advanced quality of life that we know affects our environment and in turn affects us. What goes around comes around.
Decision making takes a more active role in selection and propagation. In anthropology, we measure evolutionary success generally by the number of viable offspring produced by the variation. That means that a successful variation of a species in a world of scarce resources (such as food and useful time) manages and allocates its resources in such a way that it is able to have more children than other variations of the species and thus have more influence on the future direction of the species. Successful management requires successful decision-making. Just try to manage without making a decision and you'll see. It doesn't matter if radioactive spiders turn whole packs of dogs into super-intelligent beings able to telepathically move fire-hydrants and build solid-gold toilets to drink out of - if those dogs decide to spend their time doing that and never have any puppies, they're an evolutionary dead-end. This is actually an issue that's been discussed in Anthro...people we see as being more successful in our society are having fewer kids than less successful people...anyhow, we see that decision-making (and thus, design) is not mutually exclusive to evolution and in fact plays a large role in it.
In software, mutation could be described as a change to either the source code of a software "component" or the configuration of a collection of software components. Any such modification is a mutation of software, whether intentional or not. Most changes in software, for good or ill, are intentional. Some are caused by gamma rays hitting storage devices and flipping bits, but more are made deliberately as acts that serve some purpose (bug-killing, optimization, etc.) So there is a decision there to serve the purpose via change. There's also a decision to either let a modification stand (because it serves the purpose, or because reversing the change is not worth it), or to revert to the pre-modification state. The decision is there even if it's only to ignore the issue. Decision making and, by extension, design is present in the selection of software changes. You cannot separate design from software evolution, because you cannot separate the evaluation and decision making process from the software development process. Doing such a thing would amount to putting a million monkeys on a million consoles banging away and hoping Linux 3.0 magically results. Statistically it could happen, but animal control would have a cow.
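The mutate-then-select loop described here can be written down as a toy hill-climbing search (a hypothetical illustration; the target string, alphabet, mutation rate, and generation count are all invented): mutation is blind and undirected, and all the "design" lives in the one-line acceptance decision.

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

TARGET = "stable kernel"            # hypothetical goal, invented for the sketch
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def fitness(s):
    """Selection criterion: count of positions already matching the target."""
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.1):
    """Mutation: random, undirected changes -- no knowledge of the goal."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

# One random "organism", then the loop: mutate (random), select (a decision).
start = "".join(random.choice(ALPHABET) for _ in TARGET)
best = start
for generation in range(20000):
    child = mutate(best)
    if fitness(child) >= fitness(best):   # the selection decision
        best = child
    if best == TARGET:
        break
```

Delete the `fitness(child) >= fitness(best)` decision and you have the million monkeys; keep it and the string climbs toward the target, which is the sense in which selection is design.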
Linus originally decided to go with Rik's VM code for 2.4, then later switched to Andrea's code. Neither move was decided by a coin toss. Evolution? Yes. Design? Yes. It's both, and why can't it be both?
I'll finish by quoting from "Modern C++ Design" by Andrei Alexandrescu (page 4):
arg. (Score:2)
rrr.
Re:Heh (Score:4, Interesting)
WTF do (ex-)linux companies have to do with the quote you posted?
I think this quote has a point.
If we go into comparing, let's say, building bridges and os programming, I think we _can_ see the differences in methodologies one needs.
With bridges, we have a well known and accepted theory of their statics, a relatively narrow expectation of what we expect a bridge to do, and we can, by using tolerances of a wide margin, account for the fact that something unexpected happens.
In an OS, there is not really a broadly accepted theory (micro- vs macro-kernel, VM, filesystems, implementation language) - at least judging by how different the realisations we see in practice are.
What do we expect an OS to do, or more precisely, what do we expect an OS to do well?
latency vs throughput, single vs massively multi cpu, graphics in kernel vs graphics in userspace
Seems we have no real consensus here.
Lastly, and this is perhaps the most important factor - we can't make an OS more failsafe (or better performing) by introducing margins anywhere. Due to the binary nature of CS it doesn't make sense to use redundancy for many aspects of an OS.
It either works or fails.
Bridge building... (Score:2, Insightful)
Bridge building has also had tens of thousands of years of trial and error which surely helps. Another thing is competition: There's none in bridge building AFTER you've got the contract. Nobody is going to build another bridge next to it to see if they can make a better one.
Re:Heh (Score:2)
Reviews and audits are also done with bridges, but nobody in their right mind would come to the conclusion that you could abandon margins because of good peer review of the design.
Simply put, in bridge building you have always the *additional* security of safety margins, without waiving the principles of peer review.
OTOH, there you are right: CS has the possibility of easily producing pre-/beta- stuff, something bridge designers can't do to that extent.
But it seems that this isn't enough to make OSs as reliable as bridges.
Re:Heh (Score:2)
That Sun manages to make oodles of cash with high margin offerings and is still losing market share is a sign of its maturation and specialization. If you look at the R&D effort within Sun I would bet you that 90% of it is directed towards enterprise level scalability and not common desktop or workstation workloads. Recent comparisons have suggested that Solaris is severely trailing linux in terms of single/dual processor performance. I can imagine this margin only getting larger in the coming years.
Re:Heh (Score:5, Interesting)
This linux system that depending on which "stable" version you download, locks up under high load, corrupts your filesystem when umounting it, invisibly reverts your filesystem to one that can be hosed from a power fail, or kills off processes at random -- like init -- when it starts running out of memory... And that's just what I recall off the top of my head from the last few months.
I don't think Linux is exactly a pile of shit either, but let's not kid ourselves, it's got the same problems that commercial OS's deal with, and the development model hasn't exactly been a panacea in that respect.
Re:Heh (Score:2)
The proper way to use linux, to deploy it, is to make sure that community support verifies the stability of the kernel. Redhat's kernel is usually very well tested and will not crash under high loads. 2.4.16 will not crash under high loads. If you use an arbitrary kernel release from Linus then you bypass one of the critical features of linux -- Community support.
Re:Heh (Score:2)
So in short, it's my fault for using Linux, because new releases in the stable branch are not tested. Gotcha. I'm really trying to mend and stop being so glib all the time, but sometimes it's really hard.
Your admonishments to whatever moral character I might lack based on my criticisms, whining, whatever you might want to call it, have absolutely no truck when I have to make a recommendation based on requirements and Linux comes up short. Lemme save every respondent the bother: I am an ungrateful, whining jerk who spurns the community that provides this free software. I have problems none of you seem to have, and I cast aspersions wherever I can because of it. I am in short, a big fink meanie. Get used to it, there's thousands more like me, and they make recommendations too, so you might want to give the Wagging Finger Of Scolding +2 a rest and start listening to what we have to say, no matter how much venom we coat it with.
Re:Heh (Score:2)
Stability and robustness is measured in the field, not with a bug count on an arbitrary kernel release.
Redhat's kernel is as arbitrary as Linus'.
You own linux as much as Linus does. That means GPL protections for non copyright holders. Linus only has a very small percentage of the kernel copyrighted to himself. So if you felt like forking the kernel you could and no one would be mad at you, except some slashdot trolls. The kernel has been forked countless times for many many reasons, including forking a development branch (2.5 is a fork off 2.4), forking for real-time, forking for embedded developments, forking for MkLinux (linux sitting on top of a microkernel), etc.
So please stop the charade and doom-and-gloom bashing of the linux kernel. It's just plain nonsensical.
Re:Heh (Score:2)
The organizational "problem" you have taken a glimpse of really only has to do with the travesty of the Van Riel VM. What a mess that was. If linus had chosen Andrea in 2.3.53 (when andrea and rik were competing) then we wouldn't be anywhere close to the mess we're in now. Shit happens, and linux survives. 2.4.16 is top notch. 2.5 is underway with a very impressive to-do list. I'm expecting great things from this kernel in the future, and so far your harping on a developmental accident hasn't changed my mind in the slightest.
Re:Heh (Score:4, Insightful)
Honestly you shouldn't be too worried. The shit _hasn't_ hit the fan, and 2.4.16 is ROCK solid. _Yes_, 2.4 took a long time to stabilize. It's there now, after the Van Riel vm was tossed aside. So let's cut the crap and call a spade a spade: 1) Linus is not a stable release maintainer. If linus puts out a kernel it needs to be tested. Linus does not put out release candidates. Only a fool would use a product that has been released without prior testing. 2) 2.4 took so long to stabilize because of the mistaken belief that a BSD style VM was best for linux. 2.4 doesn't have the infrastructure to handle it (reverse memory mapping, etc). 3) 2.4.16 is a fucking great kernel. Except for a few possible bugs (the source tree is 149MB uncompressed!) I know of no problems whatsoever. 4) 2.5.x is already starting off with a bang. The new block/io layer should kick major ass, along with all the other enhancements planned. 5) Quit your whining. The sky isn't falling, alan cox isn't retiring to the hills to become a hermit, and linus torvalds knows what the fuck he's doing.
Re:Heh (Score:2)
Funny how none of that stuff has happened to me (2.4.13). The umount bug (iput) was just plain dumb - detected immediately, fixed within the day. Now, I think you're being just plain childish about the ext3 fstab issue, this is just a usability issue that is being addressed. All in all, sounds like bleating to me.
If you're worried about stability, use the kernel that comes with your distro. The rest of us would probably prefer to take our chances, just so we can keep flying at the front of the flock.
Re:Heh (Score:2)
and on the other hand, a horse cannot evolve wheels, because the intermediate steps between a legged horse and a wheeled horse would not be able to move. pity because a wheeled horse could be faster...
just look at biological species to see that a process of evolution rarely results in the optimal design, and is unable to take U-turns or back out of dead ends...
Re:Heh (Score:2)
I'm not sure if you're describing the shortcomings of a horse or of evolution. Assuming the latter, I don't think evolution needs to make U-turns because rather than turning it forks. In other words, if species X becomes extinct you can view that as a failure of evolution, but it's not. Because somewhere in the past, species X diverged from species Y.
Re:Heh (Score:2)
Flightless birds don't get their hands back, marine mammals don't get their gills back, we stand up straight, but our spine can't take it, our eyes attempt to focus by bending/stretching the lens rather than moving the lens back and forth, etc etc.
Evolution as a method works because it achieves results without requiring a plan or a design.
However if you do have a specific goal set, such as 'we want an application that solves this problem', then a 'try 1000 different angles and throw 999 away' approach isn't very efficient... A proper design might allow you to throw away the 999 redundant ones before work has even begun...
Re:I'll take that bet (Score:2)
With a giant internal combustion engine and treads (not wheels). You're using far, far more energy to move that tank than to move that horse.
Now, if you could do it on a mountain bike, you might have a point. But I think the real issue is that speed isn't the only optimizing factor. There are a lot of other things involved as well, and energy efficiency is one of the most important.
Re:Heh (Score:2)
Re:science and engineering (Score:5, Insightful)
Science provides a lot of dandy tools. Engineers like tools.
Engineers would be useless without science to provide new raw materials.
Baloney. We (engineers) were building all sorts of impressive stuff long before the invention of science. Check out the Great Pyramid and Yu the Great.
Re:science and engineering (Score:2)
Engineers have to build things to get paid by their customers. If the things work reasonably well and don't fall down, engineers get paid.
Engineers use science when available. If it's not, engineers hack - they base their results on trial and error (BTW, note that most engineering disciplines spent a lot of time analyzing failures).
Sometimes engineers use science that's wrong. Sometimes they get away with it (if large enough safety factors are applied) and sometimes not.
My favorite story comes from the book "Design Paradigms" by Henry Petroski. Galileo's formula for the strength of the cantilever beam was wrong. Yet it was used in the construction of bridges for a few hundred years. Only when engineers tried to reduce costs by shaving the safety factors, and bridges started to fall down, did someone go back and look at the math and discover the mistake.
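For the curious, the standard account of that mistake (my summary, not anything from Petroski's book itself): Galileo assumed the breaking stress sigma was spread uniformly over the rectangular cross-section (breadth b, depth h), hinging about the bottom edge, while linear-elastic bending theory gives a triangular stress distribution about the neutral axis, so Galileo's resisting moment overestimates the strength by a factor of three:

```latex
M_{\text{Galileo}} = \sigma \,(b h)\,\frac{h}{2} = \frac{\sigma b h^{2}}{2},
\qquad
M_{\text{elastic}} = \frac{\sigma I}{h/2} = \frac{\sigma b h^{2}}{6},
\quad I = \frac{b h^{3}}{12}
```

With safety factors of three or more stacked on top, the error stayed invisible; shave the margins and it surfaces, exactly as the story goes.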
Re:Evolved Code? (Score:2)
Re:Some thoughts on evolutionary theory (Score:2)
You're comparing QWERTY to a totally different designed optimal keyboard layout. If you want to replace QWERTY, you have to evolve it slowly towards what users want. Users don't want a totally redesigned layout but they sure like small changes.
Re:Linus is not a creationist! (Score:2, Insightful)
Yes, but he'd be saying "Linux isn't designed, the reason it works is because we have God on our side. We have lots of changes coming from people interested in pulling the project in different directions, including me, but He guides us to accept only the best final outcome. Look at Sun, they're doomed to failure because they try to follow their own narrow path instead of putting faith in the Lord..." etc. etc.
Linux is scaleable... (Score:2)
Just add support for the advanced hardware features of the E15k and you're ready to go. Get the enterprise-level reliability and management features that it needs, and you'll see Solaris floundering on every single bit of Sun's hardware.
"Reasonable"? (Score:2)
False. The evidence is all around you (and in fact you are part of the evidence).
2 there is no reason for a god to exist
False.
3 based on our conception of logic, a god cannot exist
To the contrary, rationality depends upon the existence of God.
4 we know why, when and how the stories of gods were made up and propagated
This, at least, is partly true. It is true, in that we know that evil men in times past refused to worship the true and living God, and instead fabricated false gods for themselves.
5 we know why and how the stories of gods were accepted and used, and for what purposes
You're repeating yourself. ;-)
Far from being "undermined" by belief in God, as the evolutionist fantasizes, rationality is actually dependent upon God. On the other hand, rationality is undermined and utterly demolished by evolution.
Re:"Reasonable"? (Score:2)
Why? Because if the evolutionist is correct, then what you and I say are "thoughts" are really nothing more than highly complex chemical reactions. Yes?
So please tell me how a chemical reaction can make statements of truth or falsehood.
Answer: it can't. Chemical reactions are incapable of that; it's absurd to even suggest the possibility. It's like asking fermentation its opinion about the next presidential election: it makes no sense to even ask the question, let alone hope for an answer.
So why on earth would it be sensible in any way to suggest that the chemical reactions in your brain are in any epistemologically significant way superior to fermentation? It isn't sensible. Impersonal things don't make truth claims. Only persons do that. But at root evolution means that we are just bags of chemical reactions - and so at root we aren't persons. So we can't make truth claims.
So, on your erroneous terms, rationality is destroyed. So you can't even be consistent to your own framework and at the same time tell me that it is "right" or "true" or "correct" - because your framework doesn't allow for such categories. Because there is no "you" available: there's just a bag of chemical reactions.
But, of course, there are persons. And that is why evolution is wrong. It requires that the false be true: it requires that we are just bags of chemicals and not persons.
Re:"Reasonable"? (Score:2)
Aw, heck, this takes me back to the old days on alt.atheism. For nostalgia's sake...
We don't know precisely how thought and intent arise from chemical processes. But we have ample evidence that they do; just tackle it from the opposite direction. When you take away those chemical reactions, what's left?
When you damage the brain, you damage the capacity for thought. Read "The Man Who Mistook His Wife For A Hat", by Oliver Sacks, a neurologist who writes like he swallowed a poet. It's a collection of case histories that illustrates my point quite well. Some strokes remove the ability of a person to, for example, consider the idea of "left". They only eat the food on the right side of their plates. When asked to imagine walking down a familiar street, they only describe the objects on their right. Ask them to imagine turning around and walking the other way, and they forget the buildings they just described and describe the ones they previously omitted.
Severe enough damage to the visual cortex not only renders a person blind, they lose the entire concept of vision. Words like color, light, etc. don't make sense; they've forgotten not only that they could see but that sight itself exists.
Damage to Broca's and Wernicke's areas of the brain results in different types of aphasia [imssf.org]. People with Broca's aphasia have extreme difficulty speaking, but can often understand speech well. People with Wernicke's aphasia can talk, but they don't understand what's said to them and speak in what's called "word salad"; a stream of nonsense. Put two patients with Wernicke's aphasia together and they'll have a complete gibberish conversation, without even apparently realizing that they aren't saying anything.
I could go on and on. The point is, damage the brain and you damage the ability to think, to emote, to be a conscious individual. I don't know of a capacity for thought that can't be destroyed by damaging some area of the brain or another.
To put it bluntly, I don't see what's left for a soul to do. If I have a soul, I can't see why I should care what happens to it after I die; what's left after my brain is destroyed can't be said to be "me" in any reasonable sense.
Now, as I said, we don't know exactly how these chemical processes give rise to consciousness. But even if you didn't know how a car worked, you'd notice that if the engine is removed it won't work anymore.
Because there is no "you" available: there's just a bag of chemical reactions.
Straw man. Obviously there's a "me"; here I am, cogito ergo sum. But I object to your use of the word "just" above.
In one sense, there's no such thing as a "rainbow"; just billions of tiny water droplets of the right size, positioned such that they reflect and refract light in just the right way to separate out the hidden colors in white light. But in another sense, there is such a thing as rainbow; it's just on a different level.
Like a rainbow is something water droplets do, so the mind is what the brain does. The mind is a process. A rainbow is not 'degraded' by having arisen from 'mere' physical processes. Physical processes are ennobled by giving rise to such beauty.
I find people just as valuable as you do (perhaps more; I think we're lucky to be here and that we don't have a cosmic protector so we should be a little more careful and considerate with each other). But I think I've got a clearer conception of what they actually are, and why.
Re:the author of life (Score:2)
Great! Now read it in English again, and this time go for understanding. ;-)
Re:the author of life (Score:2)
If you insist upon being serious, your real issue is willful rebellion against God, as manifested in your denial of the truth that you were created by God. Willful rebellion isn't corrected by understanding something correctly; it's corrected by stopping the rebellion.
Oh yes I did (Score:2)
I misquoted nothing. Did he say it or not? No one is going to pretend -- certainly I am not -- that Gould is not utterly devoted to his evolutionist fideism. So it would be ludicrous to even think that Gould would not attempt to explain away the facts he admits in the quotation. Of course he makes the attempt. Duh.
actually caught - in the fossil record - the critical transitions.
Rubbish. No "transitions" have been caught anywhere. For this to be actually verified, you would have to have a fossil from every generation between parent and "evolved", "transitional" child. You would furthermore have to be able to demonstrate that what you have are actually direct biological descendants, or else Gould's "proof" is nothing but post hoc nonsense.
So what Gould has -- as he actually said -- is inference, and nothing more.