Linus Says No to 'Specs'

auckland map writes to tell us about an interesting debate that is being featured on KernelTrap. Linus Torvalds raised a few eyebrows (and furrowed even more in confusion) by saying "A 'spec' is close to useless. I have _never_ seen a spec that was both big enough to be useful _and_ accurate. And I have seen _lots_ of total crap work that was based on specs. It's _the_ single worst way to write software, because it by definition means that the software was written to match theory, not reality."
  • by jg21 ( 677801 ) * on Monday October 03, 2005 @05:06AM (#13702656)
    There's a very good post later on in the kerneltrap thread:

    Linus is an engineer/tech. He dislikes theory work because it often gives nothing in practice.

    However, specs are not always theory, and they may be useful, as may docs. He may be smart enough (or know the Linux code well enough) not to need any docs or specs, but that's not the case for many other people. Some specs are good, and sometimes necessary.

    He cited the OSI model, fine, but I can assure you I won't board an airplane that was built with Linus's practices... There are specs in some places that are good, and that are read and followed. Even in non-dangerous domains such as Web standards, specs are necessary, and those who don't follow them produce crap software and browsers!

    Moreover, in the Linux development model, which is fuzzy and distributed rather than directed, defining the software up front may be futile. However, in a commercial environment, defining the spec is really writing a contract, which protects both the customer and the vendor. There, the spec defines what the software can and must do, and ensures that it will do it. Linus obviously lacks experience with industrial and safety-critical projects. He may be right for kernel development (though I still have doubts even there), but he's wrong about many other domains.

    IOW, Linus is making a generalization here that is at least as wrong as the examples he cited. As we say: "all generalizations are false."

    If he finds a bad spec, he either throws it away or fixes it. It's the same for technical docs. But he shouldn't claim that all specs are useless and bad. That's wrong.

    • by gowen ( 141411 ) <gwowen@gmail.com> on Monday October 03, 2005 @05:16AM (#13702688) Homepage Journal
      He dislikes theory work because it often gives nothing in practice.
      Indeed. And, in fact, those specs that aren't theoretical he's followed quite closely (albeit without calling them specs). A lot of work has gone into making Linux POSIXly correct, and POSIX is a spec, even if Linus doesn't consider it one. Similarly for VESA framebuffers, or tty specs.

      Basically, a spec says "X should work like this."
      You follow a spec whenever it's clear that X should work like that (for whatever reasons, be they performance, clarity or interoperability). If it's not clear why the spec mandates that, it's a bad spec, but the existence of many many bad specs doesn't invalidate the concepts of specs.

      By all means pick and choose the specs you decide to follow, but don't make silly generalisations.
      • by ajs ( 35943 ) <[ajs] [at] [ajs.com]> on Monday October 03, 2005 @06:30AM (#13702911) Homepage Journal
        POSIX, VESA and even ttys are all examples of specifications that sought to unify existing practice. The practice came first, then the theory. If you want an example that goes the other way, you would have to look to something like IP, which was created as a specification along with the first implementations.

        Linus is correct, though. Specs are rarely useful breasts up-front. Standardization of existing practice is often useful, but that's another beast.
        • by gowen ( 141411 ) <gwowen@gmail.com> on Monday October 03, 2005 @06:46AM (#13702959) Homepage Journal
          That's a reasonable point. In fact, the problem most people have noticed with Linus's statements is that he fails to draw that distinction. He would have had a stronger case if he'd said that specs imposed from the top down ("this is how we should do this thing") tend to be bad, as opposed to specs which say, "people have had success doing it like this -- let's all do it that way."

          No-one, as far as I know, is denying that many many many bad specs exist, and that they should all be cheerfully ignored.
          • by duffbeer703 ( 177751 ) on Monday October 03, 2005 @09:26AM (#13703872)
            What Linus is saying is that specs are dumb, and the right way to do things is to let him make or delegate all decisions, and retain the right to arbitrarily change those decisions later.

            A spec implies a commitment, and Linus is used to everything being in the air when it comes to Linux. A spec would also make it easier to fork Linux, and thereby make Linus less important.

            Linus is no idiot... this is clearly a political stance.
        • Specs are rarely useful breasts up-front.

          A perfect example of why specs are useful -- without specifications, the Intelligent Designer's developers would end up developing too many models of women with non-standard breast placement. This would then require way too many bra models which would result in way too much confusion about how to remove them quickly -- it could have lead to the end of the human race before it really got a good start.

      • by ivan256 ( 17499 ) *
        POSIX, like most specs, is designed to allow software to be both closed-source, and compatible. When your code is available, the sole traditional reason for specs is gone.

        Specs that are written by a group of competing organizations are worse than obsolete. They're disastrous. A lengthy revision process slowly, but completely, removes all but the most necessary of the specifics from the document in order to match as closely as possible the existing technology from all parties involved. This leads to standards
        • by klaun ( 236494 )

          POSIX, like most specs, is designed to allow software to be both closed-source, and compatible. When your code is available, the sole traditional reason for specs is gone.

          I'm under the impression that this type of argument is usually referred to as a "straw man." You redefine the issue ("there is only one reason for having specs and it is this") to something that you can refute, refute it, and then declare victory.

          But saying open code is the solution for specs doesn't seem quite complete. So are we

        • by GlassHeart ( 579618 ) on Monday October 03, 2005 @02:41PM (#13706834) Journal
          POSIX, like most specs, is designed to allow software to be both closed-source, and compatible. When your code is available, the sole traditional reason for specs is gone.

          You are wrong. You cannot reverse-engineer a spec from an implementation, because you cannot separate the implementation choices from spec requirements. For example, if I show you an implementation of an algorithm, and you realize that it can run within 300 ms, do you know if there's an actual need for it to finish within 300 ms? Or, how would you know if an alternative implementation (that uses less RAM, for instance) that needs 500 ms to run is actually acceptable?

          Similarly, there are almost always things in a spec that a particular implementation does not need (but another implementation would). To piece back together the actual spec would require reading multiple implementations and figuring out the union of all they implement. Worse, the implementations may actually contradict each other. The C language leaves certain constructs formally undefined (for example, "i = i++;" is undefined behavior), so how would you figure out what that's supposed to do by reading compiler source code?

          Thirdly, while source code may be readable by people who want to provide an implementation, it is not readable by people who merely want to use it. When was the last time you learned to program in a language by reading the source code of its compiler? What if the compiler was self-hosting (that is, written in the language it compiles)?

          Code examples are always better than specs because they are unambiguous.

          So the complete source code to Mozilla is a better source of information on HTML than the W3C HTML specification?

      • by Lodragandraoidh ( 639696 ) on Monday October 03, 2005 @09:22AM (#13703841) Journal
        I think there is some confusion between the ideas of 'software specifications' and 'standard interface specifications'. One may reference the other, but they are not the same.

        Software specifications define functional, design and sometimes implementation details, and in my experience never survive implementation, simply because the developer can never anticipate every obstacle that reality throws up. Thus the software specification is quickly changed, after the fact, to reflect what was implemented. It lags behind reality and serves only to document what *was* done, rather than driving the development.

        Standard interface specifications (APIs, and other standards such as POSIX) serve as agreements between many different players about how a particular interface should work to provide interoperability between all users of the standard. If you want to foster interoperability then you follow the standards that are widely accepted. Without standard compliance Linux would not be where it is today.

        I read the article and it occurs to me that the confusion between them was at the heart of this very issue. That confusion is sadly furthered by the use of the term 'specification' for both things, which are clearly different. Implementation of emergent technology is different from established protocol agreements. Perhaps we should couch our language to make those differences clear when we speak about them.
      • by bahamat ( 187909 )
        There's a lot of work gone in to making Linux POSIXly correct


        But Linus also has a history of only following the spec so far as it suits him. I believe the word he used for POSIX's threading model was "stupid".
    • by Rakshasa Taisab ( 244699 ) on Monday October 03, 2005 @05:41AM (#13702772) Homepage
      The summary is twisting the discussion into something very flamebait worthy, and it gives a false impression of what Linus really said.

      From my impression of the discussion it isn't specs he is against, but rather following those specs without being flexible enough to take reality into account. Would you really want to fly the plane built to spec, if some of those specs turned out to not accurately reflect reality? In that case you'd change the specs, but that's not always possible.

      In this specific case, the problem was someone using the abstraction layering described in the spec, while, according to the others, the kernel would be better off using another design. If the software behaves equivalently, it should not matter how it is designed internally.
      • by Mr. Underbridge ( 666784 ) on Monday October 03, 2005 @08:16AM (#13703380)
        From my impression of the discussion it isn't specs he is against, but rather following those specs without being flexible enough to take reality into account.

        From what I read, it seemed like people were trying to get him to soften his stance on that, and he seemed pretty adamant that he hates specs in any form or fashion.

        Of course, it's easy to do that when you're Linus Torvalds, and whatever you say/do is the de facto standard without the need to write a spec. He's basically a walking spec. However, I'd invite him to consider what would happen if all the peons adopted his theory. Nothing would interoperate with anything else.

        The only thing I can think of is that he defines a spec as something that is inherently written once, before implementation begins, and is strictly adhered to no matter what. However, I don't think any sane person would agree with that definition, I can't imagine that's what the other people in the thread meant by the word "spec," and I can't believe he'd imagine anyone else defending such a process in the first place. So I do believe that Linus is being a bully again.

      • by hey! ( 33014 ) on Monday October 03, 2005 @08:47AM (#13703618) Homepage Journal
        "Rules are there to make you think before you break them."

        Simple expertise is knowing the right thing to do. When you go beyond that to knowing when it's the right time to do the wrong thing, then you have mastery. So, when somebody with years of mastery of a craft says "the rules are crap," it has a different truth level than when somebody who's merely competent says it. The difference is that in one case the right thing to do is backed by unspoken, unarticulated working knowledge, and in the other by mere bravado. I can do basic carpentry, but the difference between me and a master cabinetmaker when building a bookcase is that I have to keep referring to my plans, whereas the cabinetmaker, while he may have plans, operates largely unencumbered by them, moving quickly and confidently because he's internalized long sequences of operations, until he can see the whole construction process in his mind's eye. When I don't worry much about my plans, I end up with a dado on the wrong side of a plank.

        Being "against specifications" is of course stupid. But Linus is in a unique position to be a bit cavalier, isn't he? Specifications do two things. First, they tell you what needs to get done. Second, they communicate this between parties: the specifier and the implementor, the customer and the contractor, the builder of tab A and the constructor of slot B. But Linux is, if I understand this, a pretty conservative implementation of an existing model, where innovation, where it occurs, is confined to fairly contained and focused areas. And as far as Linus and the Linux kernel are concerned, l'état, c'est moi. He may well have managed all these years keeping what needs to be done in his head, and the result could still have more coherence than the product of a well coordinated committee.

        The other thing to keep in mind is that you can't trust that everything anybody says is categorically so, even when that person is perfectly honest and sincere. The simple reason for this is that most truths have an element of fuzziness in them. In limited circumstances it is sometimes necessary to hold a belief that is, in general, more false than true, but in those circumstances more true than false. Wisdom is knowing when and how much to doubt what you believe, or believe what you doubt.
    • by achurch ( 201270 ) on Monday October 03, 2005 @06:01AM (#13702825) Homepage

      I can assure you I won't go in an airplane if it was done with Linus' practices...

      Don't blame the builder if he doesn't have the tools. The real problem is that the field of software development is just too young for the sort of meaningful and useful standards and practices needed for solid engineering to have been developed. Right now, we have:

      • A design process thought out by people who have spent way too long thinking about design theory (in theory, theory and practice are the same--in practice, they're not)
      • A bunch of programming languages, most of which were created by looking at earlier ones and saying "hey, I can tweak this and call it a new language!"
      • A testing methodology that says "throw everything in the book at it, and hope your book isn't missing any pages"

      That's not exactly a recipe for success. Granted, it is possible to make things work right for a single project if you throw enough effort into it (e.g. the Space Shuttle software), but the vast majority of developers don't have that kind of effort available to throw around, and we're still a good number of years (decades?) away from figuring out what everything has in common and how to make it work cleanly. We're making progress, certainly--concepts like encapsulation spring to mind--but there's still a long way to go before you can talk about "software engineering" the same way you do, say, "civil engineering".

      • by Troed ( 102527 ) on Monday October 03, 2005 @06:41AM (#13702942) Homepage Journal
        I'm a Mechanical Engineer and a Software Engineer by education. I'm a software developer/architect by profession. I've stopped believing in the "engineering" part of software development altogether.

        It's a nice thought, but it's not possible to look at software development as a physical manufacturing process. We're much closer to art.

        (In addition, I work exclusively in the embedded/telecom part of software development - areas much closer to engineering than normal personal computing development. If these are my experiences, most development must be very far from engineering. Maybe we should stop looking at NASA development as some sort of best practice [Gilb, I'm looking at you!] for all types of software development.)
        • it's not possible to look at software development as a physical manufacturing process. We're much closer to art.
          I like to tell my younger colleagues that "We aren't engineers, and we aren't artists. We are craftsmen and -women. We make useful devices through the skills we have. We make pots and pans."
    • by dramenbejs ( 817956 ) on Monday October 03, 2005 @06:08AM (#13702850)
      You are saying it yourself: However, in a commercial environment, defining the spec is really writing a contract, which protects both the customer and the vendor. The spec is not useful for coding, but for contracting. You come, inevitably, to one of three possibilities:
      1. Your project is trivial and a spec may be achievable.
      2. Your project is not trivial and the spec is copied from some existing product.
      3. You have written the spec after making the program.
      There is no way around this! Or can you predict ALL of reality? I doubt it...
      • by HuguesT ( 84078 )
        Code *as* specs

        In a few projects I've worked for, we delivered code *as* specs. I.e. an implementation of some library together with documentation.

        The client took the working code and the documentation, and then re-coded the entire library to their standards from scratch, using the implementation as a benchmark. At least that's what they said they'd done... I believe they took the code lock, stock and barrel and wrote a tiny interface layer on top, but I don't want to know that.
      • by Marillion ( 33728 ) <(ericbardes) (at) (gmail.com)> on Monday October 03, 2005 @08:01AM (#13703316)
        What I think he is really trying to fight is what the Agile movement calls Big Design Up Front [c2.com]. The case against BDUF is that customers will never foresee how a software application will transform their business, and will invariably change their minds. Dwight D. Eisenhower summed it up: "In preparing for battle I have always found that plans are useless, but planning is indispensable."
    • by Morgaine ( 4316 ) on Monday October 03, 2005 @06:18AM (#13702878)
      "software was written to match theory, not reality"

      That was a very blinkered and unfortunate statement by Linus. While he portrays himself as a "practical engineer," the truth is that he is not flying the flag of professional engineering, but supporting some kind of ill-conceived ideal of ad hoc amateurism.

      The world of computing is in crisis. After 40 years of 'pro' development, computing is still a human-driven craft instead of the extremely precise arm of engineering that it could so easily have become through its well-defined subject matter.

      While Linus has contributed immensely to the world by delivering a wonderful engineering tool as well as a great end-user product, he is also extending the software crisis through unfortunate remarks like that one. The "reality" which he so seems to praise is THE PROBLEM in software engineering, and not something to be endorsed or supported.

      If the world continues along Linus's desired path of "reality" over theory, the current mess will know no end, and the metaphorical bridges of computer science will still be falling down in the year 3000.

      Mankind's future in computing must build on immoveable foundations of theory and logic if it is to progress into a realm where machines of IQs in the millions work at our behest. Advocating some sort of ad hoc "practical" computing barbarism is very short-sighted, dangerous, and regressive.
      • Mankind's future in computing must build on immoveable foundations of theory and logic

        Bertrand Russell tried to put mathematics on an immovable foundation of theory and logic; sadly, it turned out to be impossible. Some people still don't realize this.
        • by Morgaine ( 4316 ) on Monday October 03, 2005 @07:53AM (#13703274)
          Bertrand Russell tried to put mathematics on an immovable foundation of theory and logic; sadly, it turned out to be impossible.

          I don't think that you got very far with Bertrand Russell.

          The higher-order issues he identified caused just a temporary hiccup in the development of logic. While undoubtedly fundamental, that problem paraphrases best as "There are hidden depths to this", rather than as "All is lost".

          Gödel applies, as always. You don't apply a theory outside its domain of discourse - not if you know what you're doing, anyway.

          Russell showed that the domain of logic gets tangled if you use it to think about itself. Well (with hindsight) that is no surprise at all. The expert logician recognizes the necessary boundary, and virtualizes the outer domain before it can be handled by the inner domain logic.

          Russell is doing just fine, thank you. Almost the entirety of mankind's technological world is founded on the logic which you describe as "impossible".
          • I don't think that you got very far with Bertrand Russell.
            Why can't anyone here disagree with anyone else without insulting them? I think you too casually discount the magnitude of Russell's dilemma vis-à-vis set theory and the theory of types. He wanted the impossible, as we now know in a formal sense thanks to Gödel, namely a complete and consistent logical system, and his failure to resolve his paradox was devastating to him (and more so, I think, to Whitehead, but I may be wrong about that). How far
      • by indifferent children ( 842621 ) on Monday October 03, 2005 @07:07AM (#13703053)
        computing is still a human-driven craft instead of the extremely precise arm of engineering that it could so easily have become

        I agree with Linus. And the problem that I see is people like you who insist that the above statement is true, all evidence to the contrary. No company has ever demonstrated a methodology that is guaranteed (or even very likely) to deliver high quality, maintainable software in a predictable amount of time. Software development is still an art, and may always be one.

        • Software development is still an art, and may always be one.

          The problem I see is that "people like you" ensure that software development will remain an art. I agree with the parent that software engineering should be built on more repeatable, solid foundations, as most other engineering disciplines are. That's not to say there isn't room for artistry in it, however. Take building design, for example. Most structural engineering is based on tried and true formulas and known design patterns that work. Ho

      • by sdokane ( 587156 )
        (1) Copying the engineering profession does not mean that suddenly all the problems associated with software will disappear. Engineering projects are frequently late and over budget. I can name a few high-profile examples: the Millennium Dome (UK), the Jubilee Underground Line extension (London), the "wobbly" bridge (London), the Sydney Opera House. The first three were "vanilla" projects - there was no real excuse for failure.

        (2) Specs don't correspond with reality because they frequently use hand-waving to achie
    • by indifferent children ( 842621 ) on Monday October 03, 2005 @07:19AM (#13703123)
      in a commercial environment, defining the spec is really writing a contract, which protects both the customer and the vendor

      But since the spec is almost certainly incomplete, and likely wrong in several spots, what you have is a contract that means a judge is likely to rule in your favor, but a customer who hates your guts and a lot of industry buzz about how you are being sued by one of your customers. Get in bed with your customer and discover which of their needs you can meet on a continuing basis. That's a better recipe for success than being 'technically correct'.

    • He's not complaining about specs as a way of getting things to behave the same. He's talking about using hardware specs to base your software on. If you try to talk to your PCI hardware by following the PCI spec, chances are that it won't work, because a huge portion of PCI hardware doesn't quite work right. Someone had been saying that, in order to write a good SCSI driver, you should follow the guidelines in the SCSI spec, and Linus was saying that, if you did that, your results wouldn't work with a lot o
  • Detailed specs... (Score:3, Interesting)

    by jemnery ( 562697 ) on Monday October 03, 2005 @05:06AM (#13702657)
    Detailed specs are useless. A broad spec that defines the general features, who the damn users are and what they need to achieve is far from useless. Let the intelligent software developers figure out the details, but don't let them lose sight of the general direction they should be taking.
  • by Anonymous Coward on Monday October 03, 2005 @05:09AM (#13702664)
    I heard he had good vision. --(o)~(o)--
  • you mean... (Score:2, Informative)

    by Rui Lopes ( 599077 )
    something like this [w3.org], this [w3.org], this [w3.org], this [rfc-editor.org]... (should i go on?)
  • Theory (Score:5, Interesting)

    by StonePiano ( 871363 ) on Monday October 03, 2005 @05:11AM (#13702670) Homepage
    Who was it that said:

    In theory, practice and theory are the same. In practice, they are not.
  • In case you were wondering whether CS is indeed a "real" science or not.
  • Amen (Score:5, Funny)

    by Anonymous Coward on Monday October 03, 2005 @05:13AM (#13702674)
    Linus has spoken.

  • by Anonymous Coward on Monday October 03, 2005 @05:13AM (#13702677)
    ..big _YES_ to underscores.
  • by Sanity ( 1431 ) on Monday October 03, 2005 @05:14AM (#13702680) Homepage Journal
    How are you supposed to write software which interoperates with other people's software without relying on a specification to define the interface? I have read some of the thread and I really can't understand where Linus is coming from here.
    • You read the code (Score:3, Interesting)

      by Rogerborg ( 306625 )
      That's pretty much what it always comes back to with Linus.
      • Re:You read the code (Score:3, Informative)

        by Kjella ( 173770 )
        1) You often don't have the code (is this the "everything should be OSS" argument?)

        2) If you had the source code to IE5, would you consider it a good spec?

        Kjella
    • hardware specs (Score:4, Interesting)

      by LatePaul ( 799448 ) on Monday October 03, 2005 @07:21AM (#13703130)
      That's because Linus is talking about hardware specs. His view is that it's better to trust reality (how the device actually behaves) than a spec (how it's documented to behave). And he's right about that.

      Outside the world of OS kernels there are many software projects where the 'reality' is much more changeable, much less solid. Reality in the case of a Purchasing application or your KDE/Gnome desktop applet or the latest FPS game is likely to be a case of 'what the customer/user wants'. That's something you really need to pin down. Doesn't mean it can't change but it needs to be clearly defined.
  • Specs are not best for software whose features are to grow over time, and where nobody knows what people will want to add next. Specs are best when you have a fixed set of requirements, which you have to meet in order to complete your work.

    Still, specs may be useful, for example, to describe certain aspects of a Linux sub-system. But it may not be desirable to have a full spec defining all the goals of Linux, because these goals are a rapidly moving target and therefore steadily changing. Of course, there are some fe
    • Specs are not best for software whose features are to grow with time

      Why not? Specs can change and grow with time as well.
    • Mmmm, but specs can change. Obviously you don't want to change them daily (what's the point???), but they still have a degree of flexibility. Also, if you create specs for all the subsystems, then combine them all together, what does that give you? A spec for almost the entire system (sans the way it's all linked together and interoperates, and it really does make a good deal of sense to create a spec for THAT, too).

      If there comes a point where the particular subsystem spec or overarching "wrapper" spe
  • Missing the point. (Score:5, Insightful)

    by Anonymous Coward on Monday October 03, 2005 @05:15AM (#13702687)
    The whole discussion was centered around implementing specs. And the point made by Linus was that one should not implement specs literally, nor structure the software the way the specs are structured. He did not say the software should not adhere to the interface given by the specs. So the software should work as specified; one should just write it in a form which makes sense for the larger scope of the software, not one limited to the scope of the specs.
  • by $RANDOMLUSER ( 804576 ) on Monday October 03, 2005 @05:18AM (#13702693)
    Bill Gates says "Beta testing is for sissies".
  • Spec Change! (Score:3, Insightful)

    by linebackn ( 131821 ) on Monday October 03, 2005 @05:22AM (#13702705)
    And after three long hard years of implementing a huge amount of code per the specifications, finally wrapping things up and looking to move on to bigger and better things... they go and change the specs! Arrraggg!!
  • I think there are good specs and bad specs. Good specs are the ones that are drawn up to standardize and harmonize things that are already out there. Bad specs are the ones that are written before any implementation exists. In the former case, specs are designed with hindsight, and with the knowledge of what features are desired, and how they are commonly implemented. In the latter case, specs can and will be designed without any knowledge or consideration for practical issues. Linus seems to be ranting aga
  • by kevin lyda ( 4803 ) * on Monday October 03, 2005 @05:26AM (#13702721) Homepage
    Who is this Linus guy anyway? I bet he's never managed a software project of any complexity.

    Personally I've found specs to be incredibly useful. I'm currently developing a middleware project that takes a complex search pattern and applies it to streams of delimited character objects, and while our team of 40 software engineers has yet to actually start developing, we've produced a fantastic spec that will greatly simplify coding it.

    I suspect we'll have this general regular expression parser up and running in less than 80 man-years of effort thanks to our full and detailed specs.
  • specs and designs (Score:5, Interesting)

    by idlake ( 850372 ) on Monday October 03, 2005 @05:27AM (#13702724)
    Linus does code to specs: the kernel is intended to comply with all sorts of formal and informal specs, and its developers pay attention.

    What is missing is people writing and committing to specs for some important kernel internal interfaces and functionality. This attitude goes hand in hand, of course, with the lack of stable internal interfaces within the Linux kernel and is one of the major reasons why the kernel source has bloated to such a humungous size and why every kernel release needs to include all the accumulated garbage of the past decade. If internal kernel interfaces were specified and committed to for each major version, then driver and module development could be separated from the rest of the kernel.

    Of course, Linus is right in one area: most specs are useless. There are two primary reasons for that. Either, the spec is poorly written; there are lots of those. Or, the spec describes a bad design; there are many more of those. Many of the original UNIX design documents were exemplary specs: they told you concisely what you could and could not rely on. On the other hand, many recent standards (like HTML or SOAP) are examples of well-written specs that are bad specs because the underlying designs suck. But the fact that many specs are bad doesn't mean that it is inevitable that the Linux specs would be bad; that only depends on Linus.
  • by Rogerborg ( 306625 ) on Monday October 03, 2005 @05:27AM (#13702725) Homepage

    At a conservative estimate, I've pissed away half of my lifetime development effort dealing with instances where the documentation of an OS, APIs or SDKs doesn't match the actual behaviour. Every time I get sandbagged with that, I wish I could just read the damn source and see what's really going on.

    Linus is quite right that a spec can be useful as a descriptive abstraction, but not as an absolute or prescriptive definition. When you're sitting there at the keyboard and hit a point where the behaviour differs from the spec, it doesn't matter why the spec is wrong, just that it is. Red-pen it and move on.

    • by Grab ( 126025 ) on Monday October 03, 2005 @07:04AM (#13703041) Homepage
      Alternatively, maybe you could wish that the fuckwit coders who gave the thing to you had read the documentation and done a little testing to make sure that it works according to the docs, instead of changing things arbitrarily without telling anyone.

      All that "my code is my design" bollocks is just that - bollocks. I can spend a week reading code to find how stuff actually works, or I can spend a few hours reading the spec that says how it *should* work. My job is *not* debugging someone else's shit code, my job is writing something that uses or interfaces to their code. If they've not done their job properly, why should I be expected to be the one to find the holes, just because they can't be arsed testing it properly?

      Grab.
      • by roystgnr ( 4015 ) <roy AT stogners DOT org> on Monday October 03, 2005 @10:24AM (#13704348) Homepage
        Why should I be expected to be the one to find the holes, just because they can't be arsed testing it properly?

        Usually you shouldn't; however, you will. It doesn't matter if you're creating a web browser that can't display "broken-but-renders-on-IE" webpages, an IDE driver that may corrupt data on "UDMA-compatible-but-not-compliant" hard drives, or a server process that crashes on corrupt or malicious out-of-spec data: as long as your code is what's interfacing more directly with the user, your code will probably be blamed for the problems. In particular, if one of your competitors has found and worked around the holes, your code will definitely be blamed for the problems.

        It's not fair, but it's life: from a user's point of view it's easier to get new software than to communicate with a different set of people or buy new hardware.
  • A spec, whether specific or general, large or small, is in some way rooted in theory... Duh, I know. But it's not as obvious as that, or this thread wouldn't exist. It seems to me that real hackers, the ones that I have come to respect over the years, simply sat down and built the tools that they needed, at the moment, based on their practical, real-world, in-the-moment wants and needs. AKA "they scratched their personal itch." They simply had no time for theory.

    Theory, doesn't exist to men like Linus. Or men lik

    • The root word of "spec" is "speculation".

      No. Spec is short for specification, and has nothing to do with speculation, other than that one aims to identify something distinctly, one type of seeing, and the other entails a flight of fancy, another type of seeing.

      Please don't present your own mistaken speculations as facts.
  • Ammo for the enemy (Score:3, Insightful)

    by squoozer ( 730327 ) on Monday October 03, 2005 @05:31AM (#13702736)

    If nothing else, comments like this are ammunition for the people who dislike / want to crush Linux (and OSS in general). While I know from experience that the kernel is a quality piece of software and highly reliable, if I were new to Linux and considering moving my company over to it, comments like this would scare me. It's not that a spec necessarily improves the quality of the software; it just improves confidence that the people writing it have a clue about what they hope to achieve.

    I, too, didn't believe in writing specs when I was in college. Most of the projects I worked on were either loner affairs or the group was very small, so communication was good. When I got into the commercial world, though, it was a very different ball game. After working on a couple of projects that failed horribly because half the team was confused about what it was supposed to be doing, I realized that a spec is a very useful tool.

    In my experience the better developers didn't need the spec as much as the poorer developers. The good developers almost understood without words what the other good developers would do in a given situation. The problem was no one could predict what the poorer developers would do in a given situation. This led to large chunks of the system not working / integrating properly (I freely admit there were other serious management problems on these projects as well) and needing huge amounts of resources to bring them back on track.

    Later projects where there was a spec (even quite an informal spec) produced a better system in less time with fewer resources. I know this sounds like the same old pap that is dished up to every CS student but it really does work on non-trivial projects.

    I certainly believe that the spec can be taken too far, though. I have seen some projects never even get off the ground for want of a quick hacked-together bit of proof-of-concept code. The secret is in hitting that fine line between anarchy and unanarchy (there is no good single-word antonym for anarchy, so I propose unanarchy).

    Perhaps the kernel only has uber leet hackers working on it. Somehow I doubt that though.

  • Feature creep (Score:3, Insightful)

    by Durzel ( 137902 ) on Monday October 03, 2005 @05:36AM (#13702759) Homepage
    Specification documents are the only thing in the company I work for that stops customers asking for functionality in the 11th hour of development, claiming that "they always meant that it would have that" or "I thought I mentioned that at the first meeting".

    As companies go the one I work for is pretty lax with documentation, but they are very careful that all customer requirements are listed iteratively, and - more importantly - signed off on.

    I have been in situations at work where, for whatever reason, a specification hasn't been drawn up for a customer; it's either been left to informal emails or, in the worst cases, word-of-mouth/notes written in an initial meeting. In my experience these often end up running on past their deadline as the customer requests more and more esoteric functionality, or design and presentation tweaks that covertly require additional functionality, etc.

    As a rule of thumb as a die-hard programmer I hate documentation, particularly detailed technical specifications which constrict my creativity. That said, where it is necessary I absolutely see the need for it - how else can you constrain the customer to what they originally asked for?
  • by satch89450 ( 186046 ) on Monday October 03, 2005 @05:40AM (#13702768) Homepage

    I once worked on a Standards-writing subcommittee, and ended up being the editor of a proposed standard. I was new to the process at the time. I took the work that was done and completely re-wrote it, from the ground up, according to the published guidelines of the Telecommunications Industry Association (TIA). I then presented my work to the subcommittee.

    "It's too clear. People might actually understand it." I argued that because it was a specification for testing, it should be clear. Yes, I won the argument, but at what cost?

    Over the next few years I watched as more standards were created, edited, published, submitted to the ITU, and eventually turned into Recommendations. When I asked, "what does this section REALLY try to say?" I was told that in order to understand that section I needed to know another piece of the puzzle that wasn't spelled out but was "understood" by "practitioners of the art." In other words, the specification was incomplete...but not according to the rules. I asked why. The answer I got boiled down to one thing: you can't implement the specification without the "stories around the campfire" behind them.

    Put starkly, you can't play unless you join the club.

    Now, in reality, people have taken these less-than-complete specifications and actually made products with them, products that successfully interoperated with those implemented by members of the club. The development time, however, was extended by the need to discover the missing pieces on one's own, or to buy the missing pieces.

    Then there was the story of what eventually became V.80, which I discussed in a Slashdot interview. That particular standard proposal was so bad that I had to vote "no". Again, I ended up rewriting the entire thing so it made sense, and in addition covered not only the corner cases but also future extensions and vendor extensions. It took DAYS to prove that the two versions said technically the same thing (within limits). You could code to mine; the other was almost impossible and "open to interpretation."

    Most specifications (or Standards) are written by partisan participants. It's in their best interests to write these things so that outsiders can't understand them -- be it for commercial gain or personal ego. Good spec writing is HARD, and not for beginners. It takes work. It rarely pays anything to write a good specification, especially if the writer views it as a pro-forma task -- just as programmers from several decades ago viewed flow-charting as a useless task.

    Just as people are starting to view Open Source not as a way to lose money but as a way to gain money, perhaps the partisans will see that writing clear, understandable, WORKABLE specifications is in their better interest....or not.

    Given the current state of the art, though, I would tend to agree completely with Linus that specifications, and Standards, that don't provably track with reality deserve not "no", but "HELL NO!"

    • But for FOSS projects, we're theoretically dealing with something in which anyone can participate. "Many eyeballs" and all that. This should mean not only that anyone *can* participate, but that anyone should be *able* to. Now if you're going to put some arbitrary restrictions on it - "we won't tell you what you need to know until you've hung out on IRC long enough to join the club" - then you're majorly restricting your eyeball count. :-/

      This isn't a commercial benefit, and it isn't a benefit to the pro
  • by The Mutant ( 167716 ) on Monday October 03, 2005 @05:40AM (#13702769) Homepage
    But for any type of commercial undertaking, specs are an essential part of the development process.

    Without a spec you won't know what you're being asked to build, or will find it difficult to get customer agreement that what you're delivering meets their need.

    Without a spec you can't estimate, and without accurate estimates you can't ensure that you're properly getting paid.
  • by deeny ( 10239 ) on Monday October 03, 2005 @05:42AM (#13702774) Homepage
    There's a wonderful bit about specs from Jason Fried here [37signals.com]. Pretty similar in viewpoint, I think.
  • by RAMMS+EIN ( 578166 ) on Monday October 03, 2005 @05:42AM (#13702775) Homepage Journal
    I'd say that there are good specs as well. A few examples:

    POSIX standardizes programming interfaces for operating systems. It allows easy portability of applications between operating systems, and it's very successful at that. Although non-compliant operating systems (such as Windows) and non-compliant applications (many applications written for the GNU system, and any application that uses functionality not standardized by POSIX) cloud the picture somewhat, POSIX has worked wonders for application portability.

    Common Lisp is a standardization of features found in Lisp systems. It mixed and matched parts from various more or less specialized Lisp systems, and built a generic programming language that is still widely regarded as the best programming language by those who know it. The fact that the language is standardized to a great extent allows code to be easily ported from one implementation to another.

    R5RS is the current state of the standardization effort for the programming language Scheme. Contrary to POSIX and Common Lisp, it aims to standardize not as much as possible, but rather a small common core which can then be extended. This makes Scheme a very useful object for programming language research.

    Comparing these specifications to the alternative of having no specification, I'd have to say they provide definite advantages. Without POSIX, we'd still be stuck with operating systems having APIs different enough that it might be easier to rewrite applications for each OS, rather than maintain a single codebase with a few platform-specifics here and there. Language specifications like Common Lisp and R5RS have enabled a whole slew of implementations, each filling a specific niche, where other languages are often stuck with one implementation, and perhaps a few not-quite-compatible alternatives; and if you want a different niche, you'll have to use a different language.
  • Joel Spolsky (Score:3, Informative)

    by diepan ( 258887 ) on Monday October 03, 2005 @05:45AM (#13702783) Homepage Journal
    Joel Spolsky certainly disagrees [joelonsoftware.com]. And not just in theory [joelonsoftware.com].
  • Reading this article brought to mind another one I saw mentioned on Slashdot a while back, about the team that writes the code for the space shuttle's computers. They write what's considered to be the finest code in the world, which is essential for running a rocket ship weighing several million pounds and moving at several thousand miles per hour. How do they do it? Specs, lots of specs. According to the article [fastcompany.com]...

    At the on-board shuttle group, about one-third of the process of writing software happens before anyone writes a line of code. NASA and the Lockheed Martin group agree in the most minute detail about everything the new code is supposed to do -- and they commit that understanding to paper, with the kind of specificity and precision usually found in blueprints. Nothing in the specs is changed without agreement and understanding from both sides. And no coder changes a single line of code without specs carefully outlining the change. Take the upgrade of the software to permit the shuttle to navigate with Global Positioning Satellites, a change that involves just 1.5% of the program, or 6,366 lines of code. The specs for that one change run 2,500 pages, a volume thicker than a phone book. The specs for the current program fill 30 volumes and run 40,000 pages.


    Predictable code is good code. You want your code to do x when y happens, and everyone who relies on your code should know what to expect from your code under every circumstance. Kernels are supposed to be boring.

    Specs may suck in some cases; if they do, they're badly written. It's an indictment of the person who wrote that spec, not the concept of specs in general. When I call a function, I expect it to do exactly what its documentation says, and it should comply with the documentation exactly.

    I shouldn't have to read the code just to use it. That defeats the entire purpose of segmenting things out into separate pieces. You might as well be using gotos to write your spaghetti code.
  • by nagora ( 177841 ) on Monday October 03, 2005 @05:53AM (#13702805)
    Given how many commercial, fully spec'ed projects fail or are not even delivered, or are delivered late and over budget, is there really that much evidence that anyone knows how to write complex software at all?

    Maybe "what works" is the best approach, especially for an open-ended project like Linux.

    TWW

  • by MosesJones ( 55544 ) on Monday October 03, 2005 @06:06AM (#13702843) Homepage

    I'd hoped that Linus was referring to "SPECmarks" et al. as a bad basis for writing a kernel, but calling specifications a bad idea for any area of software, let alone a kernel, is way off base.

    Sure BAD specifications are a bad idea, but so is bad code.

    It's also not true to say that a specification can't be detailed and accurate and then implemented directly; IPv4 is a pretty clear specification, and I'd be worried if the kernel writers had ignored it when they wrote their IP stack.

    I too have seen dreadful code written directly from specifications, normally because there was no design, but I've seen much worse code written from the basis of "I think this therefore it must be right".

    I'm normally a big fan of Linus, but given that many of the major areas of Linux and Open Source software are written against specifications (X, Samba, IPv4, 802.11x, BIOS, etc.), it's hard to see where Linus is coming from. If two organisations or technologies want to communicate, they need an agreed standard and specification for the inter-operation. Any other approach is just lobbing packets and hoping for the best.

  • by mjh ( 57755 ) <mark AT hornclan DOT com> on Monday October 03, 2005 @06:12AM (#13702860) Homepage Journal
    To a certain extent, specs define the parameters around communication. That's why interoperability works: conform with the spec and you can communicate... except when you can't, because the spec was built but never implemented, and the spec designers introduced bugs that they would have caught if they'd tried to implement it. Or maybe the spec designers had a specific idea of how things were going to be and didn't anticipate a particular application and its needs.

    If you accept the premise that specs enable communication, I would ask you this: where is the spec for English? And wherein does it describe things like "fo-shizzle" and "off the hizzle"? Or any other dynamic aspect of language that shows up everywhere in the world. Many people communicate without having a written spec for how that communication is going to work. Moreover, "fo-shizzle" (et al.) is well understood by many people outside of the culture that created it, despite its never having been incorporated into an English language spec. Defining a spec for English is really pointless. The language changes and develops for the specific needs of the people using it.

    It strikes me that the same is true with technology specs. If you don't define them, people will make incremental improvements to stuff without having to worry about hearing, "Hey! That's out of spec!" Put another way, specs are the antithesis of incremental change by individuals who need those changes. Specs are cathedral, not bazaar.

    Are there spec success stories? Sure: IP. On the other hand, how long have we been trying to get greater acceptance of IPv6? And why do we want IPv6 at all? Because no one is willing to incrementally change IPv4.

    Put another way: specs are a form of governance. They're the central group saying "thou shalt". That's fine if (and only if) that central group knows every possible implication of every decision that they make. Linus seems to be trying to avoid that. He seems to have faith that a diverse group of people will do a better job of figuring out what works best over time than he alone, or this particular set of kernel devs, could devise today.

    Of course, I could be wrong.
  • by beforewisdom ( 729725 ) on Monday October 03, 2005 @06:20AM (#13702884)
    The open source BSD distributions have a reputation for being better on the back end than GNU/Linux (I honestly do not know if that is true or not).

    What do they do? Do they design what they are going to do and use specs before they do it?

    Do they have a single person who calls the shots on how their kernel is done?
    • Put simply, they implement less. Fewer features, mostly - usually to the point of anemia. So while the BSDs might be more likely to work, the GNU platforms are more likely to actually do what you want. That's probably got a lot to do with why the GNU platforms have been a qualified success, while the BSDs have mostly been irrelevant in the market. After all, it's easier to fix a bug than it is to implement new features.
  • by Anonymous Coward on Monday October 03, 2005 @06:38AM (#13702932)
    From one of my past lives, when I designed computer systems in automobiles, before my current stint as a grad student, I can describe how we used specs. Some of our business folks who had some basic training as engineers would get together with our customers' folks who had basic engineering training, and hammer out a 100+ page spec document covering everything from basic operating precepts to acceptable failure modes. They would use this document as a means of discussing what they thought they wanted to buy, and what we thought we could supply for them, and then determine a cost from this proposed spec.

    After enough words had been passed back and forth, both sides would agree on a version of the spec, money would be passed around, and hardware design engineers and software engineers from all over the world would get to work. At this point, the spec would be skimmed, people would get a rough idea of what everyone wanted, and a couple hundred of the first prototype version would be cobbled together.

    Testers and verification people on both sides of the fence would look at this thing, first against the spec to make sure it included everything that was talked about, and then in the system to make sure that it would work the way they needed it to. This is the closest that the design ever got to the spec. After this point, everyone would start noticing places where the spec was either too rigid to be followed cost-effectively, or just plain wrong for our customer. Since rewriting a spec is a ton of work, it never got done, and in the end it was just a basis for verification folks to look at the design and complain that it didn't work the way that they thought it should. I guess someone should have included them in the "cool people's idea-passing club", but, neah.
  • by Phaid ( 938 ) on Monday October 03, 2005 @06:44AM (#13702954) Homepage
    This article isn't about a general comment as to how software should be written; it's about implementing protocols. And protocols have to be defined by specifications -- you can't simply have some vague spec that says "it should do approximately this"; you have to precisely define the language of the protocol so that everyone knows what they are implementing.

    That being said, Ted Ts'o makes the most important comment in the discussion: this is the reason for the IETF maxim --- be conservative in what you send, liberal in what you will accept. In other words, when your code talks over the protocol, be minimalist and adhere strictly to the spec; but if you can implement your code in such a way that it is tolerant of small variances (e.g. combinations of commands which make sense but are not allowed by a strict interpretation of the protocol spec) and can do it safely, then you should. This is the approach I've always taken, and having done things like implement a major printer vendor's IPP server, I have found this approach makes interoperability and compatibility a lot easier and less painful to achieve.
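    The maxim can be sketched in a few lines. The header handling below is a hypothetical illustration (the field names and format are invented, not taken from any real protocol stack): parsing tolerates sloppy input, while output only ever takes one canonical form.

```python
def parse_header(line):
    """Liberal in what we accept: tolerate extra whitespace,
    mixed case, and odd spacing around the colon."""
    name, _, value = line.partition(":")
    return name.strip().lower(), value.strip()

def emit_header(name, value):
    """Conservative in what we send: exactly one canonical form."""
    return f"{name.strip().title()}: {value.strip()}\r\n"

# Sloppy input still parses...
assert parse_header("  CONTENT-length :42 ") == ("content-length", "42")
# ...but what we emit is always canonical.
assert emit_header("content-length", "42") == "Content-Length: 42\r\n"
```

    The same pattern scales up: normalize and validate on the way in, and keep the output path strict so that peers never have to guess.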

    So at least from that standpoint, I think I can see what Linus is trying to say; at any rate I agree that strictly adhering to a spec simply for reasons of mathematical correctness is not always the most productive route.
  • Types of specs... (Score:3, Interesting)

    by PhotoGuy ( 189467 ) on Monday October 03, 2005 @06:46AM (#13702960) Homepage
    There are different types of specs. Big, formal ones, done in a vacuum, and less rigid ones. I'm a big believer in properly defining "interfaces" between different chunks of code or modules, and effectively, that's a spec. Again, I'm sure Linus uses defined interfaces all the time.

    Linus codes to specs every day; the Unix/Posix API was a useful one, certainly; he didn't just go inventing his own system calls, he used standards.

    I can understand why he wouldn't want to be arbitrarily constrained in kernel development by being restricted by a spec. But perhaps if he applied a bit more of this, I wouldn't get an unresolved __foo_bar_blat symbol when compiling and loading a module with a new kernel. (And after digging, finding out that the kernel system call is no longer present. Arrrrgh!!!) This, IMHO, is one of Linux's biggest weaknesses.
  • by master_p ( 608214 ) on Monday October 03, 2005 @06:53AM (#13702988)
    I have been working with specs-driven projects for the most part of my professional career, and I can tell you that Linus has a point.

    If specs were 100% accurate, then there would not be a need to write the code, because the specs could be automatically translated to code (we are talking about 100% accuracy here, not 99.999999%). But specs can realistically never be 100% accurate...it is the missing part from the specs that causes headaches.

    For example, I have worked in a project that required conversions between coordinate systems: UTM to geodetic, geodetic to local cartesian, local cartesian to target, etc. The user expected to edit UTM coordinates in the GUI, but the specs for UTM coordinates were never mentioned in the Software Requirements Specification. So we searched the internet, found out what 'UTM' is, and coded the relevant functions.

    You know what? The specs talked about UTM coordinates, but they actually meant MGRS! UTM means 'universal transverse mercator' whereas MGRS means 'military grid reference system'. Although the concept behind the two systems is the same (the Earth's surface is divided into rectangular blocks), they have different calculations.

    When we released the application to our customer, they freaked out seeing UTM coordinates, and of course they refused to pay. Then we pointed out that the specs talked about UTM coordinates, and they (thank God) admitted their mistake, paid us, and gave us time to change the application from UTM to MGRS.

    But the application was never 100% correct in handling MGRS coordinates (from that point on) until recently, because it was very difficult to successfully change something so fundamental and yet so missing from the specs (we are talking about 160,000 lines of C++ code).

    Do you want another example? In the same application, the program should display a DTED (digital terrain elevation data) file, i.e. a map created out of a file of elevation data. The specs did not say anything about the map display respecting the curvature of the Earth. So we went out and implemented a flat model, where each pixel on the screen was converted linearly to an X, Y coordinate on the map.

    Guess what? They meant the map display to be 'curved', i.e. to respect the Earth's curvature. The specs did not say anything, until the application was connected to another application that produced the video image of the battlefield using OpenGL (and of course, since it was to be in 3D, the presented map was 'curved').

    The result is that the application still has some issues regarding coordinate conversions.

    After all these (and many more...), I am not surprised at all that a certain space agency's probe failed to reach Mars because one team used the metric system and the other team used the imperial system. Even for NASA, which writes tons of specs and double- and triple-checks them using internal and external peer review, the specs were useless in the end.

    So Linus has a point...
    • Not 100% true. (Score:5, Insightful)

      by Paul Crowley ( 837 ) on Monday October 03, 2005 @08:22AM (#13703410) Homepage Journal
      If specs were 100% accurate, then there would not be a need to write the code, because the specs could be automatically translated to code (we are talking about 100% accuracy here, not 99.999999%).

      This is not true in general. It's quite straightforward to spec out a program that solves the Halting Problem, for example, but rather harder to code one. And there are issues to do with optimization and so forth that would not appear in a specification.

      Nonetheless, there's a great deal of truth in what you say - for most real-world programs, a 100% complete formal specification of what they had to do would not be much shorter than the program itself. This is why agile development methodologies make sense.
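      The Halting Problem point can be made concrete with the classic diagonal argument. This is only a sketch: the `halts` oracle below is hypothetical, and the point is precisely that no real implementation of it can exist.

```python
def halts(program, arg):
    """Hypothetical oracle: return True iff program(arg) terminates.

    Trivial to specify -- impossible to implement.
    """
    raise NotImplementedError("no such implementation can exist")

def paradox(p):
    # If halts() existed, paradox(paradox) would loop forever exactly
    # when halts() claims it halts -- a contradiction either way.
    if halts(p, p):
        while True:
            pass
    return "halted"
```

      So a one-line spec ("return whether the program halts") can be perfectly precise and still describe something no code can do.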
  • Mixed feelings (Score:3, Insightful)

    by Kaldaien ( 676190 ) on Monday October 03, 2005 @07:09AM (#13703061)
    I can see where Linus is coming from; on the other hand, I can see where the statement is fundamentally flawed. The best example I can give is a 3D engine. I have worked on an OpenGL engine for almost 4 years now, and certain aspects of the engine development (namely shader architecture) were purposely left without a formal 'spec'. 3D hardware changes more rapidly than one can build a 3D engine from scratch. If the entire engine had followed a 'spec' from day one, it would have been obsolete by the time it was finished. If you had asked someone four years ago what NVIDIA and ATI would be working on right now, they could never have guessed that vertex and pixel shaders were beginning to merge, both in hardware (shared pipelines) and in functionality (pixel shaders have front/back face information and vertex shaders can perform texture lookups). They could have made assumptions that caused them to code themselves into a wall, so to speak, preventing them from ever utilizing the features of Shader Model 3.0.

    On the other hand, there are other teams who work on gameplay and network development. For the most part the network developers can develop a 'spec' and reasonably follow it. The gameplay mechanics follow a constantly revised 'spec', and probably the only one the consumers who play the game are ever familiar with. In this aspect of development, a 'spec' is _required_ to complete the project in a reasonable timeframe.
  • by YoungHack ( 36385 ) on Monday October 03, 2005 @07:32AM (#13703170)

    My experience getting into the kernel was usually motivated by trying to write user space programs. I finally learned what people meant when they say Linux isn't well documented.

    I couldn't care less whether there's a spec, but if I'm going to use an API, I have to know what it does. ALSA (the advanced Linux sound architecture) is absolutely the worst. The documentation is full of entries that have the form:

    SetFooOn()

    This function sets foo on.

    Now the thing I don't understand is this. If you went out to write a really big and important piece of software, wouldn't you want people to use it? What's the point of an undocumented API? It makes no sense whatsoever. Either you want someone to use it, and you tell them how, or you make the methods private (i.e. not part of the API).
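    The gap between restating a name and documenting a contract is easy to illustrate. The functions and behavior below are invented for the example (they are not real ALSA calls):

```python
def SetFooOn():
    """Sets foo on."""  # restates the name; tells the caller nothing

def set_foo(handle, enabled):
    """Enable (True) or disable (False) foo processing on an open handle.

    Takes effect on the next capture period; buffers already queued
    are unaffected. Raises ValueError if the handle has been closed.
    """
```

    The second docstring answers the questions a caller actually has: when the change takes effect, what it does not touch, and how it fails.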

    Rant over.

  • by Theovon ( 109752 ) on Monday October 03, 2005 @07:43AM (#13703217)
    We wrote the Open Graphics Project spec not based on purely abstract theory but based on the experiences and needs of the community. Purely for the sake of survival, I made it clear that there should be nothing in the design which could not be justified by common needs. Based on that, we developed a SPEC.

    Maybe Linus is having a language-barrier problem, but a spec is just a description of something, albeit somewhat formalized. That means you could write a spec retroactively. We could write a spec for the Linux kernel as it is right now. If we were to do that, would Linus abandon Linux? It wouldn't be THAT hard to make it accurate.

    Frankly, I can't write anything without SOME sort of spec. Often, those specs are contained completely within my brain, but I nevertheless must develop a coherent concept of what it is I'm going to build and what its pieces are. When I write a document, I often start out with some sort of outline. And when I write code, I have to decompose it into functions.

    If a spec is any coherent description of something you make, then Linus uses specs all the time, and he's just blowing smoke out his ass.

    He's complaining about specs because they're usually done badly. JUST ABOUT EVERYTHING IS USUALLY DONE BADLY. Should we say that all operating systems are bad just because Windows sucks? Should we say all cars are bad just because the Ford Taurus is designed to last only 5 years? Should we do away with TV just because of shows like "Two Guys, a Girl and a Pizza Place" or "Survivor"?

    Linus is forgetting that Linux is based on specs, Honda makes reliable cars, and Stargate SG-1 is on SOME channel just about all day.
  • by kronocide ( 209440 ) on Monday October 03, 2005 @08:08AM (#13703341) Homepage Journal
    This is obviously the rant of a person who has never programmed for an actual client (a human one). 99% of the time a spec is the understanding between the user and the provider, whoever they are. So yes, Torvalds is right that specs are mainly for talking about software, but unless you are writing your own operating system in your free time, you have to be able to talk about it, or you will implement something other than what the client thought they paid you for, and then they get sour. Specifications are about understanding and communication, for when the whole universe isn't inside one person's head.

    Moreover, Torvalds doesn't really seem to know what science is. There just is no criterion that a scientific theory has "no holes." It doesn't work that way.
  • Freenet (Score:3, Interesting)

    by N3wsByt3 ( 758224 ) on Monday October 03, 2005 @09:20AM (#13703829) Journal
    Specs are crap?

    I guess that's why the Freenet project never had good specs! ;-)

    And it sure seems a worthwhile pursuit, judging by the current almost-specless development of the 0.7 version.

    OK, a start [freenethelp.org] was made, but it doesn't seem to be getting any further, even after several months.
  • by Junta ( 36770 ) on Monday October 03, 2005 @10:03AM (#13704154)
    Most people seem to be assuming he is dismissing software design specs, which is not what his discussion is about.

    His purported viewpoint should be taken in the context of a hardware vendor's spec for how to interface with its hardware. There is a significant amount of truth in this. Hardware vendors will live by specs when they need to: they adhere to PCI specs, they adhere to drive-controller specs. They are careful in areas where they communicate over a bus or channel with arbitrary other vendors' hardware.

    However, the interface between the OS and the piece of equipment almost always has poor specs. Such specs are usually written at the start of the design process, and as work proceeds, reality drifts away from the spec. Unless the vendor is pursuing some sort of standardization (e.g., through the IETF), the spec is rarely updated even in the face of significant change. At that point the reference implementation is the only thing anyone is maintaining, and the only thing that particularly matters.

    On the other hand, at least you know where they are coming from. Ultimately, in an ideal world a vendor releases both the specs and its reference implementation.
