Linux Software

Linus Says No to 'Specs'

auckland map writes to tell us about an interesting debate that is being featured on KernelTrap. Linus Torvalds raised a few eyebrows (and furrowed even more in confusion) by saying "A 'spec' is close to useless. I have _never_ seen a spec that was both big enough to be useful _and_ accurate. And I have seen _lots_ of total crap work that was based on specs. It's _the_ single worst way to write software, because it by definition means that the software was written to match theory, not reality."
  • Detailed specs... (Score:3, Interesting)

    by jemnery ( 562697 ) on Monday October 03, 2005 @06:06AM (#13702657)
    Detailed specs are useless. A broad spec that defines the general features, who the damn users are and what they need to achieve is far from useless. Let the intelligent software developers figure out the details, but don't let them lose sight of the general direction they should be taking.
  • Theory (Score:5, Interesting)

    by StonePiano ( 871363 ) on Monday October 03, 2005 @06:11AM (#13702670) Homepage
    Who was it that said:

    In theory, practice and theory are the same. In practice, they are not.
  • by dorkygeek ( 898295 ) on Monday October 03, 2005 @06:14AM (#13702681) Journal
    Specs are not best for software whose features are to grow with time, and where nobody knows what people will want to add next. Specs are best when you have a fixed set of requirements, which you have to meet in order to complete your work.

    Still, specs may be useful, for example, to pin down certain aspects of a Linux sub-system. But it may not be desirable to have a full spec defining all the goals of Linux, because these goals are a rapidly moving target and therefore steadily changing. Of course, there are some features which are built to stay, but specifying specific features in detail while other objectives are changing or even unknown is hard and may not give the desired results.
  • You read the code (Score:3, Interesting)

    by Rogerborg ( 306625 ) on Monday October 03, 2005 @06:21AM (#13702701) Homepage
    That's pretty much what it always comes back to with Linus.
  • by Anonymous Coward on Monday October 03, 2005 @06:25AM (#13702717)
    "...such as Web standards, specs are necessary, and those who don't follow these specs make crap softwares/browsers!"

    I don't know a single browser that fully supports most of the web's official specs. This is the point where most people would say, "except, maybe, lynx." But even there, it doesn't render to spec because of its limitations. Yeah, the spec is a guideline when implementing these features, but it is rarely "written to spec."

        Man, I use FireWire and USB drives at work all the time. You'd think if they have the logo on the side, they would properly support the spec. They don't. One drive may work fine in Windows, but barf on MacOS--or Linux. It is all about how the manufacturer and the OS implemented the spec.
  • specs and designs (Score:5, Interesting)

    by idlake ( 850372 ) on Monday October 03, 2005 @06:27AM (#13702724)
    Linus does code to specs: the kernel is intended to comply with all sorts of formal and informal specs, and its developers pay attention.

    What is missing is people writing and committing to specs for some important kernel internal interfaces and functionality. This attitude goes hand in hand, of course, with the lack of stable internal interfaces within the Linux kernel and is one of the major reasons why the kernel source has bloated to such a humungous size and why every kernel release needs to include all the accumulated garbage of the past decade. If internal kernel interfaces were specified and committed to for each major version, then driver and module development could be separated from the rest of the kernel.

    Of course, Linus is right in one area: most specs are useless. There are two primary reasons for that: either the spec is poorly written (there are lots of those), or the spec describes a bad design (there are many more of those). Many of the original UNIX design documents were exemplary specs: they told you concisely what you could and could not rely on. On the other hand, many recent standards (like HTML or SOAP) are examples of well-written specs that are bad specs because the underlying designs suck. But the fact that many specs are bad doesn't mean that it is inevitable that the Linux specs would be bad; that only depends on Linus.
  • by Rogerborg ( 306625 ) on Monday October 03, 2005 @06:27AM (#13702725) Homepage

    At a conservative estimate, I've pissed away half of my lifetime development effort dealing with instances where the documentation of an OS, APIs or SDKs doesn't match the actual behaviour. Every time I get sandbagged with that, I wish I could just read the damn source and see what's really going on.

    Linus is quite right that a spec can be useful as a descriptive abstraction, but not as an absolute or prescriptive definition. When you're sitting there at the keyboard and hit a point where the behaviour differs from the spec, it doesn't matter why the spec is wrong, just that it is. Red pen it and move on.

  • Reading this article brought to mind another one I saw mentioned on Slashdot a while back, about the team that writes the code for the space shuttle's computers. They write what's considered to be the finest code in the world, which is essential for running a rocket ship weighing several million pounds and moving at several thousand miles per hour. How do they do it? Specs, lots of specs. According to the article [fastcompany.com]...

    At the on-board shuttle group, about one-third of the process of writing software happens before anyone writes a line of code. NASA and the Lockheed Martin group agree in the most minute detail about everything the new code is supposed to do -- and they commit that understanding to paper, with the kind of specificity and precision usually found in blueprints. Nothing in the specs is changed without agreement and understanding from both sides. And no coder changes a single line of code without specs carefully outlining the change. Take the upgrade of the software to permit the shuttle to navigate with Global Positioning Satellites, a change that involves just 1.5% of the program, or 6,366 lines of code. The specs for that one change run 2,500 pages, a volume thicker than a phone book. The specs for the current program fill 30 volumes and run 40,000 pages.


    Predictable code is good code. You want your code to do x when y happens, and everyone who relies on your code should know what to expect from your code under every circumstance. Kernels are supposed to be boring.

    Specs may suck in some cases; if they do, they're badly written. It's an indictment of the person who wrote that spec, not the concept of specs in general. When I call a function, I expect it to do exactly what its documentation says, and it should comply with the documentation exactly.

    I shouldn't have to read the code just to use it. That defeats the entire purpose of segmenting things out into separate pieces. You might as well be using gotos to write your spaghetti code.
  • by MosesJones ( 55544 ) on Monday October 03, 2005 @07:06AM (#13702843) Homepage

    I'd hoped that Linus was referring to "SPECmarks" et al. as a bad basis for writing a kernel, but calling specifications a bad idea for any area of software, let alone a kernel, is way off base.

    Sure, BAD specifications are a bad idea, but so is bad code.

    It's also not true to say that a specification can't be detailed and accurate and then implemented directly. IPv4 is a pretty clear specification, and I'd be worried if the kernel writers had ignored it when they wrote their IP stack.

    I too have seen dreadful code written directly from specifications, normally because there was no design, but I've seen much worse code written from the basis of "I think this, therefore it must be right".

    I'm normally a big fan of Linus, but given that many of the major areas of Linux and Open Source software are written against specifications (X, Samba, IPv4, 802.11x, BIOS, etc.) it's hard to see where Linus is coming from. If two organisations or technologies want to communicate, they need an agreed standard and specification for the inter-operation. Any other approach is just lobbing packets and hoping for the best.

  • by mjh ( 57755 ) <mark@ho[ ]lan.com ['rnc' in gap]> on Monday October 03, 2005 @07:12AM (#13702860) Homepage Journal
    To a certain extent, specs define the parameters around communication. That's why interoperability works: conform with the spec and you can communicate... except when you can't, because the spec was built but never implemented, and the spec designers introduced bugs that they would have caught if they'd tried to implement it. Or maybe the spec designers had a specific idea of how things were going to be and didn't anticipate a particular application and its needs.

    If you accept the premise that specs enable communication, I would ask you this: where is the spec for English? And wherein does it describe things like "fo-shizzle" and "off the hizzle"? Or any other dynamic aspect of language that shows up everywhere in the world. Many people communicate without having a written spec for how that communication is going to work. Moreover, "fo-shizzle" (et al.) is well understood by many people outside of the culture that created it, despite its never having been incorporated into an English language spec. Defining a spec for English is really pointless. The language changes and develops for the specific needs of the people using it.

    It strikes me that the same is true with technology specs. If you don't define them, people will make incremental improvements to stuff without having to worry about hearing, "Hey! That's out of spec!" Put another way, specs are the antithesis of incremental change by individuals who need those changes. Specs are cathedral, not bazaar.

    Are there spec success stories? Sure: IP. On the other hand how long have we been trying to get greater acceptance of IP6? And why do we want IP6 at all? Because no one is willing to incrementally change IP4.

    Put another way: specs are a form of governance. They're the central group saying "thou shalt". That's fine if (and only if) that central group knows every possible implication of every decision that they make. Linus seems to be trying to avoid that. He seems to have faith that a diverse group of people will do a better job of figuring out what works best over time than he alone, or this particular set of kernel devs, could devise today.

    Of course, I could be wrong.
  • by beforewisdom ( 729725 ) on Monday October 03, 2005 @07:20AM (#13702884)
    The open source BSD distributions have a reputation for being better on the back end than GNU/Linux (I honestly do not know if that is true or not).

    What do they do? Do they design what they are going to do and use specs before they do it?

    Do they have a single person who calls the shots on how their kernel is done?
  • by ajs ( 35943 ) <{ajs} {at} {ajs.com}> on Monday October 03, 2005 @07:30AM (#13702911) Homepage Journal
    POSIX, VESA and even ttys are all examples of specifications that sought to unify existing practice. The practice came first, then the theory. If you want an example that goes the other way, you would have to look to something like IP, which was created as a specification along with the first implementations.

    Linus is correct, though. Specs are rarely useful up-front. Standardization of existing practice is often useful, but that's another beast.
  • by Anonymous Coward on Monday October 03, 2005 @07:38AM (#13702932)
    From one of my past lives, when I designed computer systems in automobiles, before my current stint as a grad student, I can describe how we used specs. Some of our business folks who had some basic training as engineers would get together with our customer's folks who had basic engineering training, and hammer out a 100+ page spec document covering everything from basic operating precepts to acceptable failure modes. They would use this document as a means of discussing what they thought they wanted to buy and what we thought we could supply for them, and then determine a cost from this proposed spec.

    After enough words had been passed back and forth, both sides would agree on a version of the spec, money would be passed around, and hardware design engineers and software engineers from all over the world would get to work. At this point, the spec would be skimmed, people would get a rough idea of what everyone wanted, and a couple hundred of the first prototype version would be cobbled together.

    Testers and verification people on both sides of the fence would look at this thing, first against the spec to make sure it included everything that was talked about, and then in the system to make sure that it would work the way they needed it to. This is the closest that the design ever got to the spec. After this point, everyone would start noticing places where the spec was either too rigid to be followed cost-effectively, or just plain wrong for our customer. Since rewriting a spec is a ton of work, it never got done, and in the end it was just a basis for verification folks to look at the design and complain that it didn't work the way that they thought it should. I guess someone should have included them in the "cool people's idea-passing club", but, nah.
  • by squoozer ( 730327 ) on Monday October 03, 2005 @07:43AM (#13702950)

    I'm not sure all of CS is akin to maths; some is closer to engineering, IMHO. Having said that, I don't think CS is a science. Science, to my mind, discovers things about the universe around us. This description perfectly fits physics, chemistry and biology, as well as their offshoots such as astronomy. CS, on the other hand, hasn't told us one single truth about the universe. Hence it is not science. It has, however, applied the truths we have discovered about the universe, which oddly enough is the definition of engineering. CS just jumped on the science bandwagon because it got more funding / sounded better.

  • by Phaid ( 938 ) on Monday October 03, 2005 @07:44AM (#13702954) Homepage
    This article isn't about a general comment as to how software should be written; it's about implementing protocols. And protocols have to be defined by specifications -- you can't simply have some vague spec that says "it should do approximately this"; you have to precisely define the language of the protocol so that everyone knows what they are implementing.

    That being said, Ted Ts'o makes the most important comment in the discussion: This is the reason for the IETF maxim --- be conservative in what you send, liberal in what you will accept. In other words, when your code talks over the protocol, be minimalist and adhere strictly to the spec; but if you can implement your code in such a way that it is tolerant of small variances (e.g. combinations of commands which make sense but are not allowed by a strict interpretation of the protocol spec) and can do it safely, then you should (see the small sketch after this comment). This is the approach I've always taken, and having done things like implement a major printer vendor's IPP server, I have found this approach makes interoperability and compatibility a lot easier and less painful to achieve.

    So at least from that standpoint, I think I can see what Linus is trying to say; at any rate I agree that strictly adhering to a spec simply for reasons of mathematical correctness is not always the most productive route.
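
    A minimal C sketch of that maxim, assuming a hypothetical line-based text protocol and invented function names (not any real stack): the sender always emits the canonical CRLF terminator, while the reader also tolerates a bare LF and stray trailing whitespace.

        /* Sketch of "be conservative in what you send, liberal in what you
         * accept" for a hypothetical line-based text protocol. */
        #include <stdio.h>
        #include <string.h>

        /* Conservative sender: always emit the canonical CRLF line ending. */
        static void send_line(FILE *out, const char *cmd)
        {
            fprintf(out, "%s\r\n", cmd);
        }

        /* Liberal receiver: strip CRLF, bare LF, or trailing whitespace. */
        static void parse_line(char *line)
        {
            size_t len = strlen(line);
            while (len > 0 && (line[len - 1] == '\n' || line[len - 1] == '\r' ||
                               line[len - 1] == ' '  || line[len - 1] == '\t'))
                line[--len] = '\0';
            printf("command: \"%s\"\n", line);
        }

        int main(void)
        {
            char strict[] = "NOOP\r\n";   /* well-behaved peer */
            char sloppy[] = "NOOP\n";     /* sloppy peer       */
            parse_line(strict);
            parse_line(sloppy);
            send_line(stdout, "QUIT");
            return 0;
        }

    The point is only the asymmetry: strict output, forgiving input, with the forgiveness limited to variations that are safe to accept.
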
  • Types of specs... (Score:3, Interesting)

    by PhotoGuy ( 189467 ) on Monday October 03, 2005 @07:46AM (#13702960) Homepage
    There are different types of specs. Big, formal ones, done in a vacuum, and less rigid ones. I'm a big believer in properly defining "interfaces" between different chunks of code or modules, and effectively, that's a spec. Again, I'm sure Linus uses defined interfaces all the time.

    Linus codes to specs every day; the Unix/Posix API was a useful one, certainly; he didn't just go inventing his own system calls, he used standards.

    I can understand why he wouldn't want to be arbitrarily constrained in kernel development by being restricted by a spec. But perhaps if he applied a bit more of this, I wouldn't get an unresolved __foo_bar_blat symbol when compiling and loading a module with a new kernel. (And after digging, finding out that the kernel system call is no longer present. Arrrrgh!!!) This, IMHO, is one of Linux's biggest weaknesses.
  • by master_p ( 608214 ) on Monday October 03, 2005 @07:53AM (#13702988)
    I have been working with specs-driven projects for the most part of my professional career, and I can tell you that Linus has a point.

    If specs were 100% accurate, then there would not be a need to write the code, because the specs could be automatically translated to code (we are talking about 100% accuracy here, not 99.999999%). But specs can realistically never be 100% accurate...it is the missing part from the specs that causes headaches.

    For example, I worked on a project that required conversions between coordinate systems: UTM to geodetic, geodetic to local cartesian, local cartesian to target, etc. The user expected to edit UTM coordinates in the GUI, but the details of those UTM coordinates were never spelled out in the Software Requirements Specification. So we searched the internet, found out what 'UTM' is, and coded the relevant functions.

    You know what? The specs talked about UTM coordinates, but they actually meant MGRS! UTM means 'universal transverse mercator', whereas MGRS means 'military grid reference system'. Although the concept behind the two systems is the same (the Earth's surface is divided into rectangular blocks), the two systems have different calculations.

    When we released the application to our customer, they freaked out seeing UTM coordinates, and of course they refused to pay. Then we pointed out that the specs talked about UTM coordinates, and they (thank God) admitted their mistake, paid us, and gave us time to change the application from UTM to MGRS.

    But from that point on the application never handled MGRS coordinates 100% correctly until recently, because it was very difficult to successfully change something so fundamental and yet so absent from the specs (we are talking about 160,000 lines of C++ code).

    Do you want another example? In the same application, the program should display a DTED (digital terrain elevation data) file, i.e. a map created out of a file of elevation data. The specs did not say anything about the map display respecting the curvature of the Earth. So we went out and implemented a flat model, where each pixel on the screen was converted linearly to an X, Y coordinate on the map.

    Guess what? They meant the map display to be 'curved', i.e. respect the Earth's curvature. The specs did not say anything until the application was connected to another application that produced the video image of the battlefield using OpenGL (and of course, since it was in 3D, the presented map was 'curved').

    The result is that the application still has some issues regarding coordinate conversions (a small numeric sketch of the flat-versus-curved gap follows this comment).

    After all these (and many more...), I am not surprised at all that a certain space agency's probe failed to reach Mars because one team used the metric system and the other team used the imperial system. Even for NASA, which writes tons of specs and double- and triple-checks them using internal and external peer review, the specs were useless in the end.

    So Linus has a point...
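
    A rough, hypothetical illustration of the flat-versus-curved gap in that DTED example (not the poster's code, and the constants are approximate): on a curved Earth, a degree of longitude spans fewer metres the further you move from the equator, roughly by a factor of cos(latitude), which a flat linear pixel mapping ignores.

        /* Hypothetical illustration of the flat-vs-curved mapping gap;
         * approximate constants, not the 160,000-line application above. */
        #include <math.h>
        #include <stdio.h>

        #define METERS_PER_DEG_LAT 111320.0   /* rough metres per degree of latitude */
        #define DEG_TO_RAD (3.14159265358979323846 / 180.0)

        /* Flat model: treat a degree of longitude like a degree of latitude. */
        static double east_metres_flat(double dlon_deg)
        {
            return dlon_deg * METERS_PER_DEG_LAT;
        }

        /* Curved model: east-west extent shrinks with cos(latitude). */
        static double east_metres_curved(double dlon_deg, double lat_deg)
        {
            return dlon_deg * METERS_PER_DEG_LAT * cos(lat_deg * DEG_TO_RAD);
        }

        int main(void)
        {
            double dlon = 1.0, lat = 60.0;
            printf("flat:   %.0f m\n", east_metres_flat(dlon));        /* ~111320 */
            printf("curved: %.0f m\n", east_metres_curved(dlon, lat)); /* ~55660  */
            return 0;
        }

    At 60 degrees of latitude the flat mapping overstates east-west distances by a factor of two, which is roughly the kind of discrepancy that only shows up once two applications are connected together.
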
  • by HuguesT ( 84078 ) on Monday October 03, 2005 @07:54AM (#13702995)
    Code *as* specs

    In a few projects I've worked on, we delivered code *as* specs, i.e. an implementation of some library together with documentation.

    The client took the working code and the documentation, and then re-coded the entire library to their standards from scratch, using the implementation as benchmark. At least that's what they'd said they'd done... I believe they took the code lock, stock and barrel and wrote a tiny interface layer on top, but I don't want to know this.
  • by indifferent children ( 842621 ) on Monday October 03, 2005 @08:19AM (#13703123)
    in a commercial environment, defining the spec is really writing a contract, which protects both the customer and the editor

    But since the spec is almost certainly incomplete, and likely wrong in several spots, what you have is a contract that means a judge is likely to rule in your favor, but also a customer who hates your guts and a lot of industry buzz about how you are being sued by one of your customers. Get in bed with your customer and discover which of their needs you can meet on a continuing basis. That's a better recipe for success than being 'technically correct'.

  • hardware specs (Score:4, Interesting)

    by LatePaul ( 799448 ) on Monday October 03, 2005 @08:21AM (#13703130)
    That's because Linus is talking about hardware specs. His view is that it's better to trust reality (how the device actually behaves) than a spec (how it's documented to behave). And he's right about that.

    Outside the world of OS kernels there are many software projects where the 'reality' is much more changeable, much less solid. Reality in the case of a Purchasing application or your KDE/Gnome desktop applet or the latest FPS game is likely to be a case of 'what the customer/user wants'. That's something you really need to pin down. Doesn't mean it can't change but it needs to be clearly defined.
  • by YoungHack ( 36385 ) on Monday October 03, 2005 @08:32AM (#13703170)

    My experience getting into the kernel was usually motivated by trying to write user space programs. I finally learned what people meant when they said Linux isn't well documented.

    I couldn't care less whether there's a spec, but if I'm going to use an API, I have to know what it does. ALSA (the Advanced Linux Sound Architecture) is absolutely the worst. The documentation is full of entries that have the form:

    SetFooOn()

    This function sets foo on.

    Now the thing I don't understand is this. If you went out to write a really big and important piece of software, wouldn't you want people to use it? What's the point of an undocumented API? It makes no sense whatsoever. Either you want someone to use it, and you tell them how, or you make the methods private (i.e. not part of the API). A sketch of what a more useful entry might look like follows this comment.

    Rant over.
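
    For contrast, a sketch of the kind of entry that would make such a call usable without reading the library source. The function, its parameters, and its behaviour are invented for illustration; this is not the real ALSA API.

        /* Invented example; not the real ALSA API. */
        #include <stdio.h>
        #include <string.h>

        /**
         * mixer_set_switch - enable or disable a named playback switch
         * @name:   control name, e.g. "Master Playback Switch"
         * @enable: nonzero to unmute, zero to mute
         *
         * Returns 0 on success, -1 if no control with that name exists.
         * May be called before or after playback has started.
         */
        static int mixer_set_switch(const char *name, int enable)
        {
            /* Stub body so the example compiles; a real mixer would do I/O here. */
            if (strcmp(name, "Master Playback Switch") != 0)
                return -1;
            printf("%s -> %s\n", name, enable ? "on" : "off");
            return 0;
        }

        int main(void)
        {
            return mixer_set_switch("Master Playback Switch", 1) == 0 ? 0 : 1;
        }

    The point is not the comment formatting; it is that the entry states inputs, outputs, error behaviour, and when it is safe to call, which is exactly what "this function sets foo on" leaves out.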

  • by Compholio ( 770966 ) on Monday October 03, 2005 @08:37AM (#13703195)
    Maybe Linus and his chief lieutenants can't write a useful spec, but that doesn't mean nobody can.

    Linus and friends write documentation, not specs. Specs tell you how a program "should" work and tend to be full of BS unless they're at a very high level; documentation tells you how a program "does" work. I imagine that Linus and friends are more interested in telling you how it does work than how it should work, and I tend to agree with that perspective.
  • by cybersekkin ( 536109 ) on Monday October 03, 2005 @08:53AM (#13703272)
    To the earlier poster who compared writing code to airplane design: basically, remember that airplane dynamics, stresses and structure are dictated by a series of "specs" that were worked out by trial and error. When they found something they did not understand, you typically see the wonderful "constant", which goes like: first you multiply this term by the factor of the weight and air pressure, and then forget all that and multiply by this, and magically you are now in Oz. (Honestly, when I took calc II years ago my professor did that; I said "huh" and got back a "trust me" response. Real good for a scientific understanding -- no explanation, just trust me.)

          In truth, building software to specs is almost worse than useless. Design to interfaces (YES), but when making interoperating software you cannot and should not rely on how another piece does its calculations, just that it does its part and spits out a response. If it returns a structure, you should use the proper accessors to read the structure's data, since structures change from iteration to iteration (see the small sketch after this comment). Linus is correct in that specs are wrong; in fact, after a few more iterative releases they usually are out of sync and cause more problems than they could ever have fixed. If programming were like an engineering project where we had a solid base to build on (dynamics of metals and properties of power systems), then great, specs would be good, but in truth, once you get past the base libraries we have very little of a solid, unchanging base to build a spec on. Hence why computer programming is less science and more "art of".
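
    A small C sketch of that "design to interfaces, read through accessors" point, with invented names: client code only ever calls the accessor functions, so fields can be added or reordered between iterations without breaking callers.

        /* Invented example of coding to an interface rather than a struct
         * layout: callers go through accessors, never the fields directly. */
        #include <stdio.h>
        #include <stdlib.h>

        /* In a real library this definition would live in a private header,
         * so client code could not poke at the fields even if it wanted to. */
        struct result {
            int    status;
            double value;   /* fields may be added or reordered later */
        };

        static struct result *result_new(int status, double value)
        {
            struct result *r = malloc(sizeof(*r));
            if (r) {
                r->status = status;
                r->value  = value;
            }
            return r;
        }

        static int    result_status(const struct result *r) { return r->status; }
        static double result_value(const struct result *r)  { return r->value; }
        static void   result_free(struct result *r)         { free(r); }

        int main(void)
        {
            struct result *r = result_new(0, 3.14);
            if (!r)
                return 1;
            /* Client side: accessors only, no r->value here. */
            printf("status=%d value=%g\n", result_status(r), result_value(r));
            result_free(r);
            return 0;
        }

    The design choice is plain information hiding: the struct layout stays private to the implementation, so it can change "from iteration to iteration" without rebuilding every caller.
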
  • by Morgaine ( 4316 ) on Monday October 03, 2005 @08:53AM (#13703274)
    Bertrand Russell tried to put mathematics on an immovable foundation of theory and logic; sadly, it turned out to be impossible.

    I don't think that you got very far with Bertrand Russell.

    The higher-order issues he identified caused just a temporary hiccup in the development of logic. While undoubtedly fundamental, that problem paraphrases best as "There are hidden depths to this", rather than as "All is lost".

    Gödel applies, as always. You don't apply a theory outside of its domain of discourse, not if you know what you're doing anyway.

    Russell showed that the domain of logic gets tangled if you use it to think about itself. Well (with hindsight) that is no surprise at all. The expert logician recognizes the necessary boundary, and virtualizes the outer domain before it can be handled by the inner domain logic.

    Russell is doing just fine, thank you. Almost the entirety of mankind's technological world is founded on the logic which you describe as "impossible".
  • Re:Detailed specs... (Score:4, Interesting)

    by arkanes ( 521690 ) <<arkanes> <at> <gmail.com>> on Monday October 03, 2005 @09:13AM (#13703364) Homepage
    POSIX is in exactly the same boat. I agree totally with Linus here - there has never been a spec that wasn't flawed, incomplete, or otherwise broken.

    In the business world, it generally goes something like this: Customer provides a spec (normally a word document written up by the one person who actually knows how they do business there, and then distorted beyond recognition by 2 months of committee meetings). You read the spec, mutter to yourself about what the hell they're thinking, and begin to write code to match the spec. When you deliver the first iteration, your customer suddenly identifies a need which they didn't spec and which totally breaks your design.

    This is so common that an entire design methodology was built around it happening, and around lowering the amount of up-front information presented, because it's always wrong and incomplete.

  • by YU Nicks NE Way ( 129084 ) on Monday October 03, 2005 @09:38AM (#13703541)
    it's not possible to look at software development as a physical manufacturing process. We're much closer to art.
    I like to tell my younger colleagues that "We aren't engineers, and we aren't artists. We are craftsmen and -women. We make useful devices through the skills we have. We make pots and pans."
  • by sdokane ( 587156 ) on Monday October 03, 2005 @09:53AM (#13703662)
    (1) Copying the engineering profession does not mean that suddenly all the problems associated with software will disappear. Engineering projects are frequently late and over-budget. I can name a few high-profile examples: the Millennium Dome (UK), the Jubilee Underground Line extension (London), the "wobbly" bridge (London), the Sydney Opera House. The first three were "vanilla" projects - there was no real excuse for failure.

    (2) Specs don't correspond with reality because they frequently use hand-waving to achieve functionality. That's what high-level design does. If it filled in all the detail, it would not be high-level design. High-level design nearly always misses detail that emerges in implementation, because the only way to discover that detail is (1) to do the implementation, or (2) to have a (non-human) ability to see the consequences of every design decision.

    (3) If a design methodology is complete enough so that it does not use hand-waving, then the level of design has the same level of complexity as the implementation. Having used UML for many years, I have seen it grow so that now UML editors have so many icons, shapes, dialogs, that quite frankly I'd rather go back to code. The spec (and the language used for the spec), if it has the same level of detail, only adds complexity by hiding it and organising it in an artificial manner.

    The future of software, I believe, lies in good libraries. They encompass the experience of programmers in particular domains. I use the example of ASP.NET. Applications written in ASP.NET are more robust and faster to produce than those written in ASP, and ASP applications are more stable and faster to develop than early applications written in CGI (generally). The same holds true for applications developed using different generations of Java, etc.

  • by klaun ( 236494 ) on Monday October 03, 2005 @10:13AM (#13703790)
    POSIX, like most specs, is designed to allow software to be both closed-source, and compatible. When your code is available, the sole traditional reason for specs is gone.
    I'm under the impression that this type of argument is usually referred to as a "straw man." You redefine the issue ("there is only one reason for having specs and it is this") to something that you can refute, refute it, and then declare victory.

    But saying open code is the solution for specs doesn't seem quite complete. So are we saying whoever gets there first wins? I.e. if you are the first to define a protocol in code, everyone else must follow what you've said? Which is okay as long as you are Open Source, but abject evil if you are closed source. (see Microsoft discussions, et al.)

    Or perhaps we might call upon the invisible hand of the market to decide: let everyone code their solution to the problem... and whatever is used becomes the de facto standard. But doesn't this imply that a lot of people will spend a lot of time creating code that will never be used? Is this really the most efficient process?

    And given the reality of the world where open source and closed source software must interact, what then?

    I would posit another reason for a spec that you implicitly dismiss. That the goal of development can be agreed upon before any development starts so that more than one solution can be available at the same time. In fact for closed-source code I think this is as strong a reason as the one you posit. Companies don't all want to come up with their own idea of LDAPv4, code it, launch it, find they did not "win" in the market, and then redevelop to the version of LDAPv4 that did win.

  • Freenet (Score:3, Interesting)

    by N3wsByt3 ( 758224 ) on Monday October 03, 2005 @10:20AM (#13703829) Journal
    Specs are crap?

    I guess that's why the Freenet project never had good specs! ;-)

    And it pretty much seems a worthwhile pursuit, given the current almost-specless development of the 0.7 version.

    Ok, the start [freenethelp.org] was there, but it doesn't seem to get any further, even after several months.
  • by duffbeer703 ( 177751 ) on Monday October 03, 2005 @10:26AM (#13703872)
    What Linus is saying is that specs are dumb, and the right way to do things is to let him make or delegate all decisions, and retain the right to arbitrarily change those decisions later.

    A spec implies a commitment, and Linus is used to everything being up in the air when it comes to Linux. A spec would also make it easier to fork Linux, and thereby make Linus less important.

    Linus is no idiot... this is clearly a political stance.
  • by bahamat ( 187909 ) on Monday October 03, 2005 @01:42PM (#13705800) Homepage
    There's a lot of work gone into making Linux POSIXly correct


    But Linus also has a history of only following the spec so far as it suits him. I believe the word he used for POSIX's threading model was "stupid".
  • by oliverthered ( 187439 ) <oliverthered@hotmail. c o m> on Monday October 03, 2005 @01:44PM (#13705822) Journal
    And then after 20 years of development they realize that it is just engineering with a little artistic design.

    Why do I say that? Well, for one, art is something original and creative, and that covers about 10% of the design of software; from then on it should be engineering.

    I've seen code from coders who don't have the correct engineering skills (that's 80%+ of developers); it's often poorly written, poorly documented, has no tests in place, and runs as slow as a dog. This is highly typical of a software development department with no due process in place, and where management see the software development process as voodoo.

    To be a bit more on topic, most of the companies I've worked at that looked at Linux dropped it because the development process is voodoo: it has no usability because the specs are always changing. Maybe if Linus had allowed a more 'stable' development cycle, there would be much more commercial take-up of Linux, which can only be a good thing. Some people with a hardline Linux/GPL pseudo-commie stance think all software should be free and there will be no compromises. I think that, with the current user base, deeper penetration of Linux and GPL software can only lead to a greater proportion of free software in the world, and in that case specs are good.

    Now, all I have to do is find out what in the kernel has trashed my HDD twice in as many months and I'll be a happy man.
  • by rben ( 542324 ) on Monday October 03, 2005 @02:13PM (#13706081) Homepage
    But one of the most important things to do when creating a spec is to make sure that each and every element of the specification is testable. That prevents you from wandering off into wild-eyed ivory-tower land. I know, because I've been very successful at writing specification documents. On the most successful one, we had 4 bugs at the end of a three-month development effort. We were on time and under budget. On the previous iteration of the system, it took three years, was over six months late, and had over 300 bugs that took three months to stomp. Granted, a good specification was not the only factor. I was working with an excellent development team and a manager who enforced good practices, such as complete code reviews for every piece of code before it was checked in.

    BTW, It's not unusual for Linus to say some pretty bizarre things. Often they are taken out of context, but sometimes, like any other human, he's flat out wrong. [ waiting for the god of programming to strike me down... ]

    Anyone who blindly accepts statements made by someone else should become a religious disciple, not a programmer. If you want to be a good coder, investigate and make up your own mind, don't let people tell you what to believe.
  • by rswail ( 410017 ) on Monday October 03, 2005 @03:24PM (#13706704)
    The world of computing is in crisis. After 40 years of 'pro' development, computing is still a human-driven craft instead of the extremely precise arm of engineering that it could so easily have become through its well-defined subject matter.

    It is precisely these sorts of statements that show how far away we are from software "engineering". Software has exactly NOT got a "well-defined subject matter". The subject matter of software is completely variable and ill-defined. The elucidation of requirements is an almost impossible task.


    The "reality" which he so seems to praise is THE PROBLEM in software engineering, and not something to be endorsed or supported.

    But the reality that Linus is stating is just that, reality. The nascent field of software engineering has not yet been able to clearly enunciate what "THE PROBLEM" is, let alone come up with something approaching a methodology and practice to solve it.

    Until the field can clearly state what the area it tackles includes and excludes, software development will remain closer to craft/architecture than engineering in its methods.

  • Re:I agree (Score:3, Interesting)

    by GlassHeart ( 579618 ) on Monday October 03, 2005 @04:11PM (#13707069) Journal
    HTML text formatting tags don't add very much to code comments, but there are many times I have wished there was a good way to include even a simple drawing. A function may implement a state machine, and the easiest way to describe a state machine is frequently a state transition diagram (a small sketch of the ASCII workaround follows this comment). In some other cases, a table that I don't have to format by hand would also have been wonderful.
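
    A hypothetical example of the workaround that comment describes: a tiny word-counting state machine whose transition diagram has to be drawn by hand in ASCII inside the comment, since a plain-text comment cannot hold a real drawing.

        /* Invented example: a tiny word-counting state machine, with the
         * transition diagram approximated by hand in ASCII.
         *
         *            word char                 space
         *     IDLE ------------> IN_WORD --------------> IDLE
         *       |                   |
         *       | '\0'              | '\0'
         *       v                   v
         *     DONE                DONE
         */
        #include <stdio.h>

        enum state { IDLE, IN_WORD, DONE };

        static int count_words(const char *s)
        {
            enum state st = IDLE;
            int words = 0;

            while (st != DONE) {
                switch (st) {
                case IDLE:
                    if (*s == '\0')      st = DONE;
                    else if (*s != ' ')  { st = IN_WORD; words++; }
                    break;
                case IN_WORD:
                    if (*s == '\0')      st = DONE;
                    else if (*s == ' ')  st = IDLE;
                    break;
                case DONE:
                    break;
                }
                s++;
            }
            return words;
        }

        int main(void)
        {
            printf("%d\n", count_words("kernels are supposed to be boring")); /* 6 */
            return 0;
        }

    The switch statement is the transition table; the ASCII diagram above it is the closest a plain comment can get to the picture the poster is asking for.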
