Linux Software

Linus And Alan Settle On A New VM System 167

stylewagon writes: "ZDNet are reporting that Linus Torvalds and Alan Cox have finally agreed on which Virtual Memory manager to include in future kernel releases. Both have agreed to use the newer VM, written by Andrea Arcangeli, from kernel version 2.4.10 onwards. Read more in the article."

Comments Filter:
  • good news (Score:2, Insightful)

    by Psychopax ( 525557 ) on Wednesday November 07, 2001 @09:54AM (#2532304)
    Though I do not think Linux would have been doomed just because there were two different versions of the kernel out there, it probably would have made things more difficult for Linux.
    So that's good news. J.
  • by mark_lybarger ( 199098 ) on Wednesday November 07, 2001 @09:59AM (#2532327)

    from the article- The accord also ends speculation that a fragmented Linux community would be doomed in the face of Windows.


    Where does this ludicrous speculation come from? This sort of reporting of unsubstantiated claims is quite funny on the surface, but the more general audience reading this article will think MUCH less of the stability of the Linux kernel after reading crap like this. Sure, there are/were two different VM systems, which has caused lots of posting here on /. and probably much discussion on the kernel mailing list. How in the hell does that indicate that the Linux community will be doomed in the face of Windows? ARRRGGGGG!

  • by sphealey ( 2855 ) on Wednesday November 07, 2001 @10:01AM (#2532333)
    Was there really any dispute between ac and Linus, or was it just a technical competition to see which system could be pushed the farthest?

    I thought the eWeek article took an unnecessarily confrontational tone.

    sPh
  • Re:NUMA?! (Score:4, Insightful)

    by Prop ( 4645 ) on Wednesday November 07, 2001 @10:04AM (#2532348) Homepage
    I have to ask ... shouldn't a NUMA-efficient VM be left as a patch, or through a kernel fork? I mean, how many people have access to NUMA machines, let alone own one?

    The VM in the mainstream kernel should be optimized for what Linux runs on 99% of the time: single CPU, with a "standard" memory bus.

    With that being said, I couldn't believe that Linus made such a major change in a stable kernel. I'm glad it works, and that Alan Cox has agreed to go with it, but it wasn't an example of software engineering at its finest...
  • by Duncan Cragg ( 209425 ) on Wednesday November 07, 2001 @10:13AM (#2532378)
    1. Everyone knew the Rik VM was poor
    2. Linus was stressed about it and took a brave decision to go with Andrea's VM
    3. It was VERY late to be doing this, but necessary.
    4. Linus' decision was correct as it turns out.
    5. Alan's decision was also correct in that you shouldn't be doing this kind of dramatic about-face in a 'stable' kernel.
    6. Alan's going with Andrea was also correct.
    7. I've been waiting, along with many others, for this whole mess to be sorted out before 2.5 starts and before I upgrade kernels.
    8. Passing 2.4 over to Alan means we can upgrade with confidence. This should be the test of stability for 2.6: upgrade when Linus passes it on to Alan.
  • by gaj ( 1933 ) on Wednesday November 07, 2001 @10:13AM (#2532381) Homepage Journal
    This just further illustrates the strength of an open development process. There was a problem, and that fact was discussed openly and pointedly. That scares many people. I don't get why. It's code, not a person. It doesn't look like Rik is taking it very hard, at least as far as his posting on lkml shows.

    I like to think of Linux development as sort of a modified IETF style: rough consensus and running code, with a sprinkle of holy penguin pee when Linus thinks it's ready to ship. Linus saw a problem, had a solution presented to him, and just went for it. Alan thought it was a bit insane to switch horses in midstream. I would normally agree with Alan; better to try to get the horse you're on to do the job than try to jump to another one. Worry about getting a newer, better horse once you're safely on the other bank.

    Given the time frame for 2.5/2.6, though, and given the seriousness of the VM issues, I can see why Linus decided to take the risk. Apparently so does Alan. I'm kind of anal about release numbers, so I'd probably have started a 2.5 branch to test the new VM in, and refused any other changes, then released 2.6 with the new VM. That fundamental a change should probably get a point increase in version number.

    Regardless, the short version is that this is much ado about nothing. The rest of the industry just isn't used to seeing this sort of thing happen in plain view. It normally happens behind the scenes, with a carefully scripted spin put on it by marketing. Maybe if they see the process work enough times people will become comfortable with it. I doubt it.

  • by Kefaa ( 76147 ) on Wednesday November 07, 2001 @10:19AM (#2532399)
    We seem to take things too personally here. Alan and Linus had a disagreement about when and why, much like people I work with on a daily basis have differences of opinion on approach. In the end we do not start working for other companies; we reach an agreement.

    ZDNet is not the ACM; they are trying to sell magazines (or at least sponsors). A little conflict spices up the story. Should they put a more reasonable context around these things? Sure. However, if they did, "Linus and Alan agree on future" is hardly newsworthy.

    The more people hear about Linux the better. (positive spin coming...)

    In this context people can believe we know how to operate as open source and an effective business model. The need to evaluate, compare, and when necessary compromise can be accomplished in this model for the benefit of everyone. People who appreciate that are the people we want to be making business decisions for Linux.
  • by darkov ( 261309 ) on Wednesday November 07, 2001 @10:19AM (#2532401)
    I think it's unfair to characterise robust debate as fracturing or a lack of unity in the Linux community. Isn't it normal for people to disagree on things? It may look like disunity to your average joe, but the fact is that corporations very carefully control what the media knows and what discussions go on behind closed doors. I'm sure everyday people in companies all over the world not only argue till they are blue in the face, but also undermine each other's authority, turn coworkers against their opponents, and pull other nasty political bullshit.

    Open software has an open process. That is a strength. Suggesting that disagreement in the Linux community means it is less co-operative or cohesive than Microsoft or anyone else is utter crap. Open debate and having your own opinions are healthy signs, much better than some coerced worker toeing the company line, independent of what is technically best.
  • Settlement? (Score:2, Insightful)

    by utdpenguin ( 413984 ) <[moc.kcirdnek] [ta] [nhoj]> on Wednesday November 07, 2001 @10:29AM (#2532442) Homepage
    Maybe I am completely erroneous here, but I don't think so. :)

    But is this really something that can be defined as "settling"? As I understand it (correct me if I'm wrong), Linus put the new VM into his kernel. It's been there ever since. And it's not going away. Rather, Cox is giving in to Linus, as he should, since Linus is in charge. This isn't settlement, it's the natural course of development. A change is proposed, Linus OKs it and implements it. Everyone else follows suit sooner or later.

    I understand the potential horror of a kernel split, but does anyone really believe that was going to happen? I'm betting Cox would rather use a far inferior VM than allow a total split, simply because of the magnitude of such an action.

  • by MeerCat ( 5914 ) on Wednesday November 07, 2001 @10:42AM (#2532497) Homepage
    ... honestly not meant to be a troll, but does anyone else find it strange that Slashdot is reporting a ZDNet story about news regarding Linux kernel development??

    Have I missed something here? I used to work in fraud investigation, and there we had a dual scale for trusting information:

    - how trustworthy is this source?
    - how trustworthy is this source with regards to this type of information?

    (e.g. The Queen as a news source is considered trustworthy, but if The Queen told me the local 7-11 was going to be robbed at 11:30 tonight then I'd doubt the information).

    Maybe that Jesse bloke really does know what he's talking about...

    T
  • Re:NUMA?! (Score:2, Insightful)

    by Ed Avis ( 5917 ) <ed@membled.com> on Wednesday November 07, 2001 @10:55AM (#2532544) Homepage
    I'm currently wondering about NUMA - or something close to it. I'm running Linux on a couple of machines where the memory is of differing speeds: a fast eight megabytes and then the rest of the RAM is a lot slower. Can existing Linux kernels handle that sensibly?
  • Re:NUMA?! (Score:2, Insightful)

    by Anonymous Coward on Wednesday November 07, 2001 @11:11AM (#2532602)

    but it wasn't an example of software engineering at its finest...

    In the strictest sense... you are correct. However, engineering of any sort is a real world activity, not some dry academic subject (Wirth is full of shit on many topics, for example). Knowing when it's time to give up on a bad job, chuck it out and give something else a chance is a valuable thing (but not something to do lightly, or often).

  • Holy Penguin Pee (Score:2, Insightful)

    by Bonker ( 243350 ) on Wednesday November 07, 2001 @12:26PM (#2532958)
    I like to think of Linux development as sort of a modified IETF style: rough consensus and running code, with a sprinkle of holy penguin pee when Linus thinks it's ready to ship.


    This is perhaps the most beautiful description of the process I have ever heard.

    I agree with you. People are used to dealing with companies like MS, Apple, and Oracle, who are built from the ground up to never admit deficiency or the need for change, even though that is a crucial aspect of any kind of upgrade cycle.

    When a group of firebrands comes around that can freely admit deficiency, it does cause some waves.
  • by TheLostOne ( 445114 ) on Wednesday November 07, 2001 @04:49PM (#2534322)
    Really.. why is everybody getting their panties all in a bunch here? Two kernels? My big toe! Making Linux look fragmented?! WHAT?!

    No I'm not a moron.. heh.. well maybe.. but I have two points..

    1- There are not '2 kernels'.. this is crap.. there is ONE Linux kernel (currently at 2.4.14.. which is development anyway.. but that's for another post ;) and a couple of different development kernels that most 'normal users' (ie not kernel hackers) will never touch. I think we can agree that those adventurous users who delve into pre and ac patches at least know enough to know they are called pre/ac for a reason. And yes.. I know there were 2 kinds of kernels.. at a sort of esoteric level, but when we say 'the Linux Kernel' in the context of a normal running system doesn't that rather imply the stable branch of it? How many distros ship with ac? In the default... not in 'super expert mode'? Too many distros to say none.. but I've never seen one.. (a rough sketch of how to tell the branches apart follows this comment)

    2- How is this ever possibly going to give the impression that Linux is too fragmented compared to anything?! I figure (and I know this is a grievous oversimplification) that there are basically two kinds of users.. those who know and those who don't. Those who know (IBM, Compaq.. those companies we want so bad) will know enough about the story to realize this is a disagreement in timing if anything and no big deal... Those who don't haven't heard about this anyway.

    So other than the kernel developers needing to run both.. what's the problem?
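    As a rough illustration of the stable-versus-development distinction described in point 1 above, here is a minimal Python sketch (assuming a Linux box with /proc mounted; the even/odd and "-ac"/"-pre" heuristics are only an approximation of the 2.4-era naming conventions):

        # Read the running kernel release from /proc/version and guess its branch.
        with open("/proc/version") as f:
            banner = f.read()              # e.g. "Linux version 2.4.14 (gcc ...) ..."

        release = banner.split()[2]        # "2.4.14", "2.4.13-ac7", "2.4.15-pre1", ...
        minor = int(release.split(".")[1])

        if "-ac" in release:
            branch = "Alan Cox (-ac) patch series"
        elif "-pre" in release:
            branch = "pre-release of the next stable kernel"
        elif minor % 2 == 0:
            branch = "stable (Linus) branch"
        else:
            branch = "development branch"

        print(release, "->", branch)

    On a stock distribution kernel of the time this would typically report the stable branch, which is the point the comment above is making.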
  • by 4of12 ( 97621 ) on Wednesday November 07, 2001 @07:41PM (#2535252) Homepage Journal

    but it may have simply been a bridge too far

    I think so. I always got the impression that Rik is an extremely intelligent programmer with not enough time on his hands to do the enormous job of VM writer for all of Linux.

    Which was important, because it seemed, too, like he was one of a handful of people in the world who really understood how his VM system worked and, more importantly, was the ONLY one in the world who understood what needed to be done to get it to work right.
