It's Not Time for OSS Release Cycle Synchronization 110

Bakkies Botha writes "Ars Technica weighs in with some detailed analysis on the controversial issue of open source release cycle synchronization. Ars explains how time-based release cycles work and takes a close look at how the release management strategy suggested by Ubuntu founder Mark Shuttleworth would impact open source software projects. Ars concludes that Shuttleworth's proposal isn't currently viable and argues that the BDFL is overstating the potential to simplify development with better version control tools. Ars also examines a counter-proposal offered by KDE developer Aaron Seigo and explains how it enables users to get the same benefits of synchronization without disrupting upstream development."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Ars also examines a counter-proposal offered by KDE.

    Do you expect us to read the article? Or do you provide a summary of the proposal?
    • Re:Counter-proposal? (Score:5, Informative)

      by RiotingPacifist ( 1228016 ) on Wednesday May 21, 2008 @09:57AM (#23492686)
      Basically it's a very long post, but the gist of what I read was: why don't you just build it yourself instead of asking us to drop nicely packaged tars on your doorstep.
      • Re: (Score:3, Insightful)

        by xenocide2 ( 231786 )
        That cuts both ways. If you change the package, you're not reflecting some crazy will of the developer or making them look bad etc. If you don't upgrade it often enough, the developer (and some users) get angry that you're still shipping old code of theirs. If you just ship upstream releases 0day, shit breaks. I just witnessed a package in universe complain that Hardy didn't ship their latest version, even though it was basically a surprise release well after FeatureFreeze and a week or so before the FinalFreeze. Nevermind that their first cut at the release totally broke the program and they had to release a version bump a day later. Many upstreams are terrible at release management, and distributions are valuable because they do that work.
        • just witnessed a package in universe complain that Hardy didn't ship their latest version, even though it was basically a surprise release well after FeatureFreeze and a week or so before the FinalFreeze. Nevermind that their first cut at the release totally broke the program and they had to release a version bump a day later. Many upstreams are terrible at release management, and distributions are valuable because they do that work

          But his suggestion gets rid of that altogether: once every release cycle of the distro, they take whatever code they want from the upstreams and stick it in the distro. I suppose this will annoy some power-hungry upstreams, but for a lot of other projects it saves them the hassle of releasing so they can get on with coding.

          • A lot of upstreams don't want to support anything but the latest release, and apparently get annoyed when people file bugs at sourceforge for stable releases. Normally they should just ignore outdated releases but more seem to be getting angry that Ubuntu is so slow to pick up changes. Fundamentally, I think many advanced users and developers hate stable release cycles and would rather see a Debian unstable process. I'm not sure how well this works for library transitions. Probably pretty bad.
  • by FudRucker ( 866063 ) on Wednesday May 21, 2008 @09:23AM (#23492248)
    I would release when it was ready, not when some stupid release cycle rolled around. What everyone does not need is some schedule to pressure developers to release before a product is ready...
    • by maxume ( 22995 ) on Wednesday May 21, 2008 @09:29AM (#23492328)
      The idea of the schedule is not to encourage a premature release, but to encourage a sufficiently attainable definition of "ready" such that a release eventually happens.
      • Re: (Score:2, Insightful)

        The idea of the schedule is not to encourage a premature release, but to encourage a sufficiently attainable definition of "ready" such that a release eventually happens.

        Best definition: "It'll be ready when it's ready."

        This is the same truth, whether you're talking about open or closed-source, free/libre or proprietary software.

        Trying to alter this basic truth results in death marches, bad, bug-ridden software, disaffected developers, dissatisfied users, and "we'll fix that in the next release" bullsh*t.

        • Re: (Score:3, Insightful)

          by maxume ( 22995 )
          Sure, but you can at least prioritize certain features, which is then essentially a schedule, and you might as well release features once they are ready (because, as you say, they are ready).
        • by RiotingPacifist ( 1228016 ) on Wednesday May 21, 2008 @09:47AM (#23492550)
          OFC not specifying a schedule leads to e17, hurd, etc
        • Given the number of "is Linux ready for the desktop?" discussion threads, I'm pretty glad they don't follow your advice on releases.

          Depending on who you ask, a project as complicated and large as a Linux distribution release might never be ready. Hence periodic release dates, which seem to be working just fine for Ubuntu.
          • Yes, planning for releases at certain periodic intervals is a good idea. However, those dates should not be set in stone. If the planned release date is December 25 and something comes up that forces the release to be held back a couple of weeks, then it should be held back. Obviously you should have a release date, with a corresponding featureset in mind, but that doesn't mean the date has to be immovable.
          • Given the number of "is Linux ready for the desktop?" discussion threads, I'm pretty glad they don't follow your advice on releases.

            Depending on who you ask, a project as complicated and large as a Linux distribution release might never be ready. Hence periodic release dates, which seem to be working just fine for Ubuntu.

            Last I looked Ubuntu is based on Debian, so your example is actually a counter-example of what you want to prove. Debian [linuxplanet.com] has a philosophy of "release it when it's ready" that hasn't,

        • Re: (Score:3, Interesting)

          by mysticgoat ( 582871 )

          Trying to alter this basic truth results in death marches, bad, bug-ridden software, disaffected developers, dissatisfied users, and "we'll fix that in the next release" bullsh*t.

          If there were Godwin Awards, parent post would be a contender...

          When there is a set release date, responsible developers will keep it in mind and change plans as the freeze approaches: things that are unlikely to be finished are put off to the next release; efforts are concentrated on bulletproofing what can be done. Developers that can't or won't take on this kind of responsible change of focus are going to produce crappy software no matter what (irresponsible behavior is a quality of the developer that

          • Trying to alter this basic truth results in death marches, bad, bug-ridden software, disaffected developers, dissatisfied users, and "we'll fix that in the next release" bullsh*t.

            If there were Godwin Awards, parent post would be a contender...

            When there is a set release date, responsible developers will keep it in mind and change plans as the freeze approaches: things that are unlikely to be finished are put off to the next release; efforts are concentrated on bulletproofing what can be done. Developers

            • The bane of software design since the days of Fortran is that significant projects involve a hell of a lot of blue-sky design. Basically, unless you are simply re-implementing something that has already been done, you can have no idea what parts are going to flow easily, and what parts are going to be total bitches. When you are doing something new and different, you cannot possibly know what is reasonable.

              Twenty years ago when I was closer to the development end of the industry, the common sense was that c

            • You are seriously confusing the oppressive nature of deadlines with the more laid back nature of open source periodic releases.

              If you fail to meet a deadline you get fired, your product gets trashed, your company gets sued etcetera.

              In the case of a distribution like Ubuntu, the only thing that happens if you miss the freeze date is that your application ships with the same features as the last Ubuntu version, hardly a punishment at all.

              The question here is what do you want them to do,
              • If you fail to meet a deadline you get fired, your product gets trashed, your company gets sued etcetera.

                ... but in real life, if you fail to meet a deadline, you get to say "I told you so!", everyone else gets desperate, finally decides to allocate some of the resources you've been asking for, drops the stupid features that should never have been included in the first place, and you get back to the code face with a more doable project. After all, if they need it, they need it. They'll wait.

                Why do you th

          • When there is a set release date, responsible developers will keep it in mind and change plans as the freeze approaches: things that are unlikely to be finished are put off to the next release; efforts are concentrated on bulletproofing what can be done.

            That's all very well and good as a phase that every project should go through, when it's ready and not before. I understand and appreciate Ubuntu's six-month schedule, with countdowns and everything, but it does hurt certain things -- it puts off major features for six months when they might be finished in a few days, for example.

            Take Wine. Who wants to bet that there will be a 1.0 release before Ubuntu 8.10 comes out?

            Or, put another way, if I'm running a tiny solo project, I might have a stable release every few

            • I hope you mean the 2.6 kernel changes from 2.4. 2.5 was the unstable branch.
              • I mean 2.5, precisely because it was the unstable branch. The very notion of a long-term unstable branch spanning years would seem to go against the idea that we could pick any kind of synchronized release schedule that would hold for every project.
        • Re: (Score:2, Insightful)

          by firewrought ( 36952 )

          It'll be ready when it's ready.... Trying to alter this basic truth results in death marches, bad, bug-ridden software, disaffected developers, dissatisfied users, and "we'll fix that in the next release" bullsh*t.

          If you're landscaping your yard, you can take as long as you want and spend as much money as you want. If someone else is landscaping your yard, you become much more interested in how long it will take and how much it will cost: giving them your credit card and letting them keep a running tab i

          • The professional programmer enforces up-front and on-going trade-offs between features, quality, and schedule risk.

            This is one of the advantages, I think, of software-as-a-service, in any form. For a web app, we can actually add new features when they're done -- it's not as though we're going to lose money by putting them off.

            I suspect that, even for volunteer programming, schedules can be useful. Deadlines force you to think about where you want to be by the time the deadline rolls around. They give you a goal.

            And different deadlines make sense for different projects. And lockstep deadlines are harmful.

            Likewise, the lack of a schedule--especially the "It'll be ready when it's ready." attitude--can be severely damaging. If you haven't hammered out a detailed plan of what you want to accomplish and you've left it all open-ended

            Those are orthogonal.

            If I was running a large project, I'd probably do this: Feature freeze on a fixed date. Release when all outstanding bugs against the "frozen" version are resolved. If you've got a

        • See KDE4.

          KDE 4.0 is missing roughly 80-90% of the features in KDE3. I'm told they still exist in config files somewhere... undocumented. May as well be in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying "Beware of the Leopard".

          I'm told KDE 4.1 will have all the features KDE3 did, all wrapped up in nice GUI checkboxes so us mortals can use them. Which would make it up to the standards of a dot-0 release.
        • by grumbel ( 592662 )

          Best definition: "It'll be ready when it's ready."

          Which in the case of a distribution would be never. With thousands of software projects packaged into a distribution, you never reach a point where every single one of them is ready, and if you don't have at least a little bit of synchronization you end up with a distribution that is bleeding edge in some areas and outdated in others. With six-month release cycles this is much less of a problem than with Debian's endless spans of time between new releases, but it can still be really annoying if you

          • When something is "ready" is determined by the distro. They include whatever packages they think are "ready enough." For different distros, there are different values of "ready enough."

            Some want "bleeding edge", some want "stable as a rock", some want "stable w. maintenance fixes only", some want "main trunk." The idea of forcing any sort of synchronization, when no 2 distros have the exact same definition of "ready enough" is just plain stupid. I'm glad to see that others are finally giving this idea th

        • by richlv ( 778496 )
          actually, there's quite a neat approach: set some release date in the future and just declare "what's ready at that point gets released; other features can be developed after that point in time".
          and i believe that's quite a nice approach, especially for stable branches.
      • The idea isn't to encourage a premature release, but that is often the effect, especially in Ubuntu. The number of times Ubuntu has randomly decided to freeze on a bad version of a program without properly thinking about it...
    • by InlawBiker ( 1124825 ) on Wednesday May 21, 2008 @09:32AM (#23492388)
      There really isn't a perfect way to release Linux distributions. With timed releases components are prioritized quickly, but some stuff gets left out. With feature-based releases you have to wait until some number of components are ready so the release date is a mystery.

      I think it's great the way it is: each distro has their own method, you can pick the one that's right for you. It's the ultimate in technical Darwinism.
      • Re: (Score:2, Interesting)

        by TimSSG ( 1068536 )
        Shuttleworth's proposal makes perfect sense for the Linux distros that just repackage the upstream work. But it does not make sense for the distros that do a lot of original work. Tim S
    • What everyone does not need is some schedule to pressure developers to release before a product is ready...

      True, not to mention that most of the developers are doing it in their own spare time, the large exceptions being the Linux kernel and a handful of developers in organizations such as Mozilla, IBM, RedHat, and Novell. As a user who shies away from the mainstream - I use E17, Pidgin, and Audacious, just to name a few apps - and uses programs that are only supported by volunteers and those who work in
    • [if I was in charge of a FOSS project] I would release when it was ready, not when some stupid release cycle rolled around. What everyone does not need is some schedule to pressure developers to release before a product is ready...

      When what is ready?

      Most OSS projects have a whole lot of developers working on multiple features independently of each other. There is always at least one outstanding feature someone is working on. Even when a project gets labeled 1.0 people don't usually stop working and implementing new things

      Release management is about two things:
      - When to release
      - What to release

      Basically, you can work out around what date you want to release, figure out which big features will be ready by then, and merge those into a ma
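The date-driven cycle the parent describes can be sketched with git. This is illustrative only: the repository, branch, and tag names below are invented, and the commits are empty placeholders. The idea is simply that whatever feature branches are ready at the freeze get merged into the release branch, and everything else waits for the next cycle.

```shell
# Sketch of a date-driven release cycle (all names invented).
repo=$(mktemp -d)
cd "$repo"
git init -q
git checkout -q -b main                    # pick an explicit branch name
git config user.email dev@example.com
git config user.name dev
git commit -q --allow-empty -m "base"
git checkout -q -b feature/ready           # this feature lands before the freeze
git commit -q --allow-empty -m "feature work"
git checkout -q main
git merge -q --no-ff -m "merge feature/ready before freeze" feature/ready
# a hypothetical feature/still-baking simply isn't merged this cycle
git tag v1.0-freeze                        # freeze date reached: tag what's in
git tag --list                             # shows v1.0-freeze
```

The point of the sketch is that the freeze date, not feature completeness, decides what ships: a branch that misses the merge window is not lost, it just targets the next tag.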

  • by InlawBiker ( 1124825 ) on Wednesday May 21, 2008 @09:27AM (#23492308)
    "Why don't you quit whining and help us develop and release the software you're re-packaging and trying to make money from?"

    This was a good article. The Internet was actually useful today.
  • Imho (Score:3, Insightful)

    by Joseph1337 ( 1146047 ) on Wednesday May 21, 2008 @09:28AM (#23492318)
    The benefits aren't worth it. Look at Vista and KDE4: they were released too soon, and look what happened - you got half of the promised features and half of the stability
  • A lot of buzz (Score:5, Interesting)

    by bsDaemon ( 87307 ) on Wednesday May 21, 2008 @09:32AM (#23492384)
    When I first read of Shuttleworth's proposal, I figured that it might be easier to start the sync project among the community distributions which feed into the commercial ones. However, thinking further, there seem to be more issues involved, and I'm starting to think that it might not be that great an idea, or terribly important.

    The Linux-based wing of the f/oss community in particular is reaching a point where it finally has a large swath of people who are merely "end users," and whose biggest gripes aren't about some flaw in some obscure patch to imlib (for example), but are "i can't play dvds out of the box, so linux is t3h gay."

    For whatever reason, people have decided that a holy quest to "destroy Microsoft" and encourage widespread adoption of gnu/linux-based operating systems would be totally awesome. Ubuntu is geared at those "new recruits," with large amounts of hand-holding and media support. Mint is even better with its media support, but completely lacks dev tools if you install from the live image -- seriously, what sort of *nix system thinks you don't need a C compiler by default and makes you go looking for it in the repositories?

    Trying to sync up Red Hat or SuSE who have more or less gotten out of the consumer market and are targeting professional users - developers, engineers, etc - in the workplace environment with some candy-for-kids distro is frankly a little weird.

    The goal seems to be to increase homogeneity across distributions -- however, homogeneity between Ubuntu and RHEL? Quite frankly, why?

    The systems are targeted at different sets of people with different requirements and philosophies. Holding off on releasing Red Hat until Ubuntu is ready, which requires KDE and GNOME to sync up (more or less) sounds a little ridiculous and over-the-top.

    If FreeBSD were to wait until something they were trying to adopt from OpenBSD were ready, certain individuals with well known personality flaws very well might pull some sort of stunt just to make the others look bad. Given how high emotions seem to run between KDE and GNOME people, I wouldn't be surprised if one did something to spite the other, which then filtered down to Ubuntu and RH getting the shaft and looking dumb.

    The "community" is a whole lot bigger than it was 10-15 years ago, a bit colder and less friendly to boot. I have serious doubts that in the current climate this could be pulled off, even if something were to be gained by all parties -- which again, I don't think is the case anymore.

    Just my $0.02; your exchange rate may vary.
    • by jedidiah ( 1196 )
      I would like to add a "me too".

      Linux serves a number of distinct interests. Different components and distributions target different users. It's GOOD that there is a great degree of diversity. A little chaos is inevitable here. Mark's focus here should not be on some grand design. He should be focusing on Ubuntu. He also shouldn't be expecting anyone else to adjust to his whims.

      Perhaps he needs to be more flexible and adapt more to the upstream software maintainers (or help more).

      He certainly shouldn't be gettin
      • He needs to avoid the sorts of black eyes that come from not properly testing massive fundamental changes to how the desktop works.

        Which is what happens sometimes when you're feeding off the development branch of any project.

        He needs to focus on not pissing off current users and scaring away the new ones.

        And a six-month release cycle does nothing to meet either of those goals, when your repository depends so much on what's going on upstream. But OpenBSD releases every six months, too! Yes, and they contro
    • Re:A lot of buzz (Score:5, Insightful)

      by garett_spencley ( 193892 ) on Wednesday May 21, 2008 @09:48AM (#23492570) Journal
      "seriously, what sort of *nix system thinks you don't need a C compiler by default and makes you go looking for it in the repositories?"

      One that targets non-developer desktop users? Or even servers?

      As a sysadmin one of the many tasks I do to vanilla installs is to either uninstall the dev tools or restrict them to a particular group. Many exploits automatically download source for their rootkits or trojans etc. and compile it on the machine. Not having dev tools available to the user that the web server is running under, for example, makes these types of attacks more difficult and helps limit what an attacker can do if he does gain access (imagine a scenario where the attacker has no shell but can tell the web server to execute commands ... a simple 'wget' and 'make' later and he has himself a back door that gives him shell access as the web server user).

      In other words, if you have no pressing need for dev tools then it's wiser not to have them installed. If you're a developer then you can easily add them via the repositories.
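The restriction the parent describes can be sketched in a few lines of shell. This is illustrative only: a scratch file stands in for the real compiler, and the "devel" group named in the comment is a made-up example; on a real system you would chgrp the actual toolchain binaries and drop world permissions.

```shell
# Illustrative sketch: a scratch file stands in for the real compiler.
# On a real system this would be something like:
#   groupadd devel && chgrp devel /usr/bin/gcc && chmod 750 /usr/bin/gcc
tool=$(mktemp)            # pretend this is /usr/bin/gcc
chmod 750 "$tool"         # owner rwx, group r-x, world: no access
stat -c '%a' "$tool"      # prints 750 (GNU coreutils stat)
```

With mode 750 and group ownership by a dev-only group, a compromised service account (e.g. the web server user) can neither read nor execute the compiler, which is exactly what frustrates the wget-and-make attack described above.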
      • Re: (Score:2, Offtopic)

        by bsDaemon ( 87307 )

        "seriously, what sort of *nix system thinks you don't need a C compiler by default and makes you go looking for it in the repositories?"

        One that targets non-developer desktop users? Or even servers?

        Ok for servers -- but as for non-developers, it just sort of goes back to another point: what good is *BSD, GNU/*, etc., really and practically, to those who are not interested in doing UNIX-y things? I could type documents, browse the web, and hang out on AIM just fine in Windows.

        Maybe it's just because of how I got into this all. I switched to FreeBSD in the 8th grade because I wanted to do C programming. I used X so I could run several term windows and a web browser. I didn't do it because I "hated Mic

        • Re: (Score:3, Interesting)

          by MooseMuffin ( 799896 )

          I could type documents, browse the web, and hang out on AIM just fine in Windows.
          You've got that backwards. If those are the things someone uses their computer for, why would they pay for Windows?
          • You just verified his quote... "BSD is for people who love UNIX; Linux is for people who hate Windows."
            • Why does asking why someone should have to pay for an OS just because they want to do simple tasks mean they hate Windows? If anything, I find it completely asinine to claim that someone should have to pay 200+ dollars for an OS when all they want to do is browse the web and do email. This type of snobbery, where you have to become a Linux kernel hacker or you need to GTFO, needs to fucking die already.
        • Ok for servers -- but as to non-developers, it just sort of goes back to another point - what good is *BSD, GNU/*, etc, really and practically, to those who are not interested in doing UNIX-y things. I could type documents, browse the web, and hang out on AIM just fine in Windows.

          Because Linux/BSDs/Unix aren't OSes for elitists only? Why should someone be excluded because they aren't a code hacker? And people wonder why the "year of the Linux desktop" never comes when such attitudes are the ones prevailing.

          • by bsDaemon ( 87307 )
            The underlying philosophies that lead to the design and implementation of UNIX and similar systems are quite different from those that lead to the development of Windows, the original Mac, and systems like that.

            If all I wanted to do was basic, every day tasks, Win2k or XP would be more than sufficient. I wouldn't need anything else. Application availability would not be an issue.

            People complain that Photoshop and such aren't available for Linux, BSD, Solaris, whatever. But we have plenty of computer alge
            • Re: (Score:3, Insightful)

              If all I wanted to do was basic, every day tasks, Win2k or XP would be more than sufficient. I wouldn't need anything else. Application availability would not be an issue.

              But what if one wants a free OS to do all those things? Why should they have to have Windows? Why do you get so bothered over someone using your 1337 OS to do only simple tasks?

              It wasn't supposed to be "for grandma." Stallman and the FSF, with their evangelistic, holy-war approach to software may have confused the issue. "free software for everyone! information wants to be free!"

              I don't really care who you've deemed it "for". My grandma uses Ubuntu just fine to do what she needs and saved herself a few hundred dollars over having to buy Windows.

              If the reason you want grandma to run unix is because you're sick of having to clean spyware off of her system, frankly it very well may be overkill. It's like using an elephant gun to hunt a squirrel.

              No, I had it installed on the Dell machine she bought because it saved her money and it can do everything she needs.

              However, it seems to me that if people want to come to a *nix system, they should take the time to learn how and why things are the way they are. I can see no benefit from trying to make the system more like windows, because it will just cause confusion and frustration.

              Why should they have to? I've never unders

              • by bsDaemon ( 87307 )
                It's not about being "banned from using" -- it's about the right tool for the job. Windows isn't /terrible/, except maybe from an engineering standpoint. Vista may be terrible, but that's not the point. Win2k Pro and XP Pro are pretty unobtrusive.

                Why is it that "the right tool for the job" only seems to apply between Linux distros around here? If I were saying "use a Mac," then I might get modded up for it, too, though.

                Look at it this way -- if I need to apply baseboard molding to the wall in my house, I /
                • Re: (Score:3, Insightful)

                  Its not about being "banned from using" -- its about the right tool for the job.

                  And a free OS that does everything she needs is the right tool for the job.

                  Windows isn't /terrible/, except maybe from an engineering standpoint.

                  I never made any such pronouncements on the quality of Windows.

                  Vista may be terrible, but that's not the point. Win2k pro and XP pro are pretty unobtrusive.

                  And they also cost more money for some people than it's worth when a free OS can do everything they need.

                  Why is it that "the right tool for the job" only seems to apply between Linux distros around here? If I were saying "use a Mac," then I might get modded up for it, too, though.

                  Why is it that you care what some random person uses to do what they want?

                  Look at it this way -- if I need to apply baseboard molding to the wall in my house, I /could/ use a nail gun, but a hammer would do just fine.

                  Yeah, and if all a person wants to do is browse the web and read email, why should they spend a few hundred dollars for an OS that they don't need?

                  I would whole-heartedly endorse an operating system designed from scratch to serve the needs of plain ol' users. However, trying to take a model of operation and then bend it and break it into something it wasn't meant to be under the guise of "but it /can/ be all things to all people" seems a tad misguided to me, perhaps even lazy.

                  What is being bent and broken in Linux to

                  • by bsDaemon ( 87307 )
                    I don't want to be cool or l33t. I'm not being a snob. But the crux of your argument seems to hinge on not wanting to pay for anything.

                    Most people don't build a computer, they buy it. It comes with Windows on it, pre-configured to run with the hardware, and they don't mess with it any further than that.

                    Take a look, for instance, at this computer [compusa.com] at CompUSA. Even assuming that I don't mail in the rebate, I'm not likely to come out of the store with those same parts for any substantial savings. Windows is
        • by LWATCDR ( 28044 )
          Yes, there are Linux users out there. My wife uses Linux to do digital scrap booking. There are a large number of women using GIMP to do digital scrap booking. On her scrap booking forums you will see them talking about how everybody should use Firefox as well... Yes, women with kids who have never written a single line of code use FOSS. Some of them are now moving to Linux. The reason is they don't want Vista, they are tired of the problems they have with Windows, and Macs are too expensive. Even then som
        • so, if I don't get it, then I guess I just don't get it, but I guess it is how they say, "BSD is for people who love UNIX; Linux is for people who hate Windows."

          I agree with your sentiments, but it's worth pointing out that the small chimp that wandered into the room way back when has since grown and become an 800 lb. gorilla with a proclivity for throwing chairs.

          There are social, economic, and political concerns today that didn't exist back then. Those issues need to be addressed, especially given the fact that
      • "a simple 'wget' and 'make' later and he has himself a back door"

        And a simple 'wget' of a binary already compiled, and he has his back door. Isn't that how they do it on Windows?
        • It would have to be compiled specifically for that machine, or network tools can be limited to a specific group as well.

          When I harden filesystem permissions I lock down /sbin and /usr/sbin to remove the read and executable bits from the directory, and also to remove the read and exec bits from all binaries that aren't required by normal users (it's really lame how many user tools end up in sbin); then I limit dev tools and network tools to specific groups. So ping, wget, ftp, ssh, telnet, etc. are all restricted as w
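The permission-stripping step above can be sketched like this. Again illustrative only: a scratch directory stands in for /usr/sbin, and the file name is just a placeholder for a real system binary.

```shell
# Illustrative sketch: a scratch directory stands in for /usr/sbin.
sbin=$(mktemp -d)
touch "$sbin/ifconfig"           # placeholder for a real sbin binary
chmod 755 "$sbin/ifconfig"       # a typical distro default mode
chmod o-rx "$sbin" "$sbin"/*     # strip world read+exec from dir and contents
stat -c '%a' "$sbin/ifconfig"    # prints 750
```

After `o-rx`, ordinary (non-root, non-group) users can neither list the directory nor execute what's in it, which is the effect the comment is describing.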
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Trying to sync up Red Hat or SuSE who have more or less gotten out of the consumer market and are targeting professional users - developers, engineers, etc - in the workplace environment with some candy-for-kids distro is frankly a little weird.

      If some distributions use a common set of libraries and applications then why shouldn't they be interested in better synchronization with upstream development? It doesn't matter who the distribution is targeted at.

      The goal seems to be to increase homogeny across distributions - however, homogeny between ubuntu and rhel? quite frankly, why?

      Take a C compiler as an example. Are you suggesting that Ubuntu and Red Hat would want different versions of the C compiler on the basis of their users? I don't see the advantage of heterogeneity here.

      The systems are targeted at different sets of people with different requirements and philosophies. Holding off on releasing Red Hat until Ubuntu is ready, which requires KDE and GNOME to sync up (more or less) sounds a little ridiculous and over-the-top.

      Good thing no one suggested that then. If Ubuntu misses the release date, why would Red Hat wa

      • "If some distributions use a common set of libraries and applications then why shouldn't they be interested in better synchronization with upstream development? It doesn't matter who the distribution is targeted at."

        This would be perfect. If the Ubuntu and Red Hat CEOs meet and decide they'll cooperate to, say, offer the same version of glibc or GNOME, so be it; that would be a matter between Ubuntu and Red Hat. But this is not, and can't be, a "Linux issue" as Shuttleworth tries to make it appear.

        On the other h
    • I think that syncing release cycles make sense for RHEL and other distributions that don't release often.

      With Ubuntu, who cares? They release twice a year and have backports for things they miss.

      Also, with the Ubuntu LTS releases patches are made to deal with security issues. You don't necessarily want an application that was just released a month earlier.
      • The problem Shuttleworth is addressing is upstream packages that are out of date. They have random or at least highly unpredictable release schedules that don't seem to work well for anyone. By suggesting a specific date for releases, Shuttleworth is attempting to shift a small burden to upstream to relieve a much larger strain in Ubuntu. Once the Freezes are past, it takes significant work to backport a bugfix patch to the feature frozen package. I think Shuttleworth's case would be greatly helped if he co
      • Also, with the Ubuntu LTS releases patches are made to deal with security issues. You don't necessarily want an application that was just released a month earlier.

        Or beta software... yet Firefox 3 beta 5, both a beta and less than a month old at the time of the LTS release, is in the LTS release. Whoops!
    • Re: (Score:3, Insightful)

      by Kjella ( 173770 )

      seriously, what sort of *nix system thinks you don't need a C compiler by default and makes you go looking for it in the repositories?

      Uhh... a system that you USE? My car came with an instruction manual, not a mechanic's manual and tools. Compiling code others have compiled before is a total waste of time (sorry gentoo), I haven't got any clue why my home server, my htpc, my desktop or my laptop should possibly need a compiler unless I happen to be a developer. Particularly not the boxes I set up for my parents which they're happily using but would have as much use for a compiler as an ERP system.

      The goal seems to be to increase homogeny across distributions - however, homogeny between ubuntu and rhel? quite frankly, why?

      - Stronger competition on distro feature

    • by pjt33 ( 739471 )

      seriously, what sort of *nix system thinks you don't need a C compiler by default and makes you go looking for it in the repositories?
      I don't need a C compiler as long as the repositories are sufficiently up-to-date that the drivers work for my hardware. What sort of distro insists that you install software that you don't want or need?
      • What sort of distro insists that you install software that you don't want or need?
        Answer: any prepackaged binary distro.


        AFAIK, the only types that give you, the end user, the ability to decide which optional dependencies will be installed are source-code-based ones such as Gentoo or LFS. Otherwise you are stuck with someone else's decisions in regard to optional dependencies.
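On Gentoo, for instance, that per-user choice of optional dependencies is exposed through USE flags set in /etc/portage/make.conf (the particular flags below are just an illustration, not a recommended configuration):

```shell
# /etc/portage/make.conf -- illustrative fragment
USE="alsa X -kde -gnome"   # enable ALSA and X support, skip optional KDE/GNOME deps
```

Packages are then compiled with only the dependencies those flags pull in, which is exactly the decision a binary distro makes on your behalf.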

    • seriously, what sort of *nix system thinks you don't need a C compiler by default and makes you go looking for it in the repositories?
      I'm a python developer you insensitive clod!
    • by Knuckles ( 8964 )

      what sort of *nix system thinks you don't need a C compiler by default and makes you go looking for it in the repositories?
      Every security manual recommends leaving compilers off the system unless absolutely needed. And you make it sound as if "looking in the repositories" was some kind of horrendous chore.
      • by bsDaemon ( 87307 )
        When I assume something is there, because it always has been, and then it's not, it pisses me off. Was adding it hard? No. Should I have had to do it? No.

        Oh well. Lesson learned and applied.
        • by Knuckles ( 8964 )
          Why are you (who can change it easily) more important to cater for than other users who are better off without a compiler?
    • For whatever reason, people have decided that a holy quest to "destroy Microsoft" and encourage wide-spread adoption of gnu/linux-based operating systems would be totally awesome.

      Mostly for the reason that if this did happen, we wouldn't have to deal with Microsoft.

      Simple example: I have to know how to setup and use Samba, so I can share files with Windows. If Linux was the majority, it would be Microsoft who would have to implement a decent NFS (or ssh) client instead.

      Mint is even better with its media support, but completely lacks dev tools if you install from the live image -- seriously, what sort of *nix system thinks you don't need a C compiler by default and makes you go looking for it in the repositories?

      A smart one?

      I remember loving Gentoo because of this kind of thing -- nothing preinstalled except the bare minimum to boot, get a commandline, and run the package manager. After that, it's up to you to grab the rest

    • I think this whole silly idea is the result of there still being confusion about the difference between a "distro" and an operating system. Sure... you can get technical all you want, but by the modern/common definition of an OS today, Red Hat and Debian are two completely different operating systems that just happen to be built off a lot of the same components and have some similarities. Now, something like Ubuntu could be called a distro of Debian, but never a "Linux distro" because there is no such operati
  • Either the greater collaboration would find bugs like the Debian ssh fiasco quicker,
    or every Linux distribution would be affected by the same bug.

    • That's irrelevant; the Debian fiasco was a developer deliberately changing some code to integrate it into Debian better. Distro synchronisation would happen above that level: only fixes to actual upstream code would be synchronised, not integration tweaks.
  • Comment removed based on user account deletion
  • ...then why fix it? No seriously, is there something really wrong with the way distros are released today? Or is this just for Ubuntu to add another check to the "we invented that here" list. Plus there are the excellent points made in the above post "A lot of buzz" by bsDaemon.
  • Oh wow! (Score:2, Interesting)

    by Anonymous Coward

    A counter-proposal by Aaron Seigo? The guy behind Plasma, and the man most directly responsible for the total fucking up of the KDE 4 release cycle? KDE 4 is in shambles because of his ideas about release management. He was still adding basic features to Plasma after KDE 4.0 was already tagged a release candidate. The guy is a loon when it comes to release management. I'd rather have Ballmer dictate open-source release management than Seigo.

  • is that it will mean 3 ( or more? ) distros being released at the same time, thinning out the potential test-bed. People will either have to start looking at only one distro or provide late reviews of the new distros.

    What do they do if one isn't ready? Delay the others? If not, surely it would go 'out of sync'..?

    As someone else said. Distros will ( and should in my opinion ) do whatever they want to do, pertaining to their configuration, release cycle, whatever..
  • 2008 (Score:5, Funny)

    by hansamurai ( 907719 ) <hansamurai@gmail.com> on Wednesday May 21, 2008 @09:50AM (#23492616) Homepage Journal
    And here I thought that 2008 was the year of Linux release cycle synchronization on the desktop.
  • by HighOrbit ( 631451 ) on Wednesday May 21, 2008 @09:58AM (#23492698)
    I think syncing the major distros would be a boon to Linux overall. It would make support easier for third-party vendors and ISVs, which might induce them to release more major Linux applications. For instance, Oracle or Adobe would know that a particular version of their product would only have to support a certain kernel (although each distro has patches) and a certain version of GNOME and/or KDE, as opposed to ten different point releases of kernel, KDE, and GNOME. They would know which versions of the GNU utilities they can expect to support.

    Anything that makes it easier for major software vendors to release and support software makes Linux stronger.
    • by dbcad7 ( 771464 )
      Point releases are no big show stopper... Firefox and OpenOffice seem to deal with it just fine. Now it's true that they distribute software through the distro repositories... but there is no reason for major vendors not to work with the major distros and distribute through repositories as well. (And yes, I am including commercial apps.)

  • Way To Go Aaron (Score:5, Insightful)

    by mpapet ( 761907 ) on Wednesday May 21, 2008 @10:03AM (#23492760) Homepage
    Shuttleworth's idea is designed to further Ubuntu at the expense of the projects packaged therein. Specifically, he's trying to shift quite a bit of the release work onto the projects he packages.

    Aaron's post is a must-read for anyone vaguely interested in the topic. In particular,
    "It is not overly dramatic to say that if we make Free software development overly sterile via choice of process, there will be a commensurate diminishment in participation and momentum." I interpret that as Aaron recognizing the corrosive effect that adopting Shuttleworth's scheme would have on the entire dev community.

    Better still, Aaron offers constructive alternatives. It's really nice to read and should be a template for most blogging.

    Someone please explain why Shuttleworth's idea wasn't swatted down the day he posted it.

    Today's lesson: Learn to disagree without personal attacks and offer viable alternatives.
    • Yeah, Aaron did a bang up job with the release of KDE4... oh wait. Considering that major clusterfuck, I almost died laughing reading his rebuttal to Shuttleworth.
    • Re: (Score:2, Insightful)

      Shuttleworth has done a great job with Ubuntu. I think his latest idea about synchronizing releases is a good one myself. But as always, I might not be right. The best thing this has done is start a discussion on how to make release processes better for the improvement of the whole Free Software community.

      As you said, we should learn to disagree without personal attacks and offer viable alternatives. Now that Aaron Seigo has provided an alternative view, we can discuss that as well and
    • Re:Way To Go Aaron (Score:4, Insightful)

      by Eponymous Bastard ( 1143615 ) on Wednesday May 21, 2008 @01:24PM (#23495560)
      Did you even read the articles?

      Shuttleworth is saying that if the distros synchronized, upstream developers would have better information about release cycles and could choose whether to target a particular release with their new features or not (essentially, when to branch for release and focus on stabilization). If it's not ready, then it's not ready and they just shoot for 6 months later.

      This guy Aaron makes a good point in that this shifts work upstream, but I don't agree that this is disruptive. Aaron's great idea? Have the distributors basically go into each and every project and make and manage the release branches themselves! Imagine someone else coming into your project and going "We're branching here because I said so". Gee, not very good with people is he?

      If the distros synchronize, upstream can just ignore it if they feel like it. There isn't really much of a downside. If you do choose to synchronize, you can still have features released when they are ready, but deployments (releases/tarballs) happening on schedule. It's just a matter of which branches you merge.
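For what it's worth, the "which branches you merge" mechanics can be sketched with git (branch and tag names below are hypothetical, not any distro's real scheme): an unfinished feature branch is simply left out of the timed release cut, and lands in the next cycle instead.

```shell
git init -q demo && cd demo
git config user.email dev@example.org
git config user.name "Demo Dev"
git commit -q --allow-empty -m "trunk history"
trunk=$(git symbolic-ref --short HEAD)   # default branch name varies by git version

git checkout -q -b big-feature           # risky work lives on its own branch
git commit -q --allow-empty -m "half-finished feature"

git checkout -q "$trunk"
git checkout -q -b release-2008.10       # cut the timed release from trunk only
git tag v2008.10                         # ships without big-feature; that branch
                                         # merges back to trunk when it's ready
```

The dated release happens on schedule no matter what; the only decision per feature is whether its branch has been merged to trunk by the cut.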

      On Ars' theory that big changes are prevented by a branch-and-merge, timed-release approach: GCC has used a 3-stage (major change, improvement, stabilization) release cycle since GCC 3.1 in 2001. Rather large changes have been done since then, up through the 4.4 branch now in development. Granted, Mark Mitchell has done a superb job at release management (i.e. cat herding) and recently had 3 more people join him in that job.

      Even Linus does this fairly often (change too big, goes in next version so we can push this one out the door)

      At best, distros could help with consulting and advising on this job, but the release planning and management must come from within each community. His point about shifting work is good, and release management for big, flagship projects could be provided by people from each company (as I'm sure Red Hat et al. have people working on each project anyway), but big projects probably have something like it established already.

      I'm still not seeing the downside to synchronized but ignorable schedules downstream.
  • Release synchronization is an issue that I deal with on a daily basis at work. We have multiple huge telecom products that share common code. It's a tough problem.

    Currently, each Linux distribution has a difficult enough time coordinating the contents of its own release. They expend much effort attempting to avoid applying custom patches. They want to be as "stock" as possible and as current as possible without incurring some new dependency that will break another package. Meanwhile, code owners are invar
  • I can understand why you'd want synchronicity when you're working with a project as big as Ubuntu.

    But still, for the overall health of Open Source software, running asynchronous is more beneficial, as errors spotted in one major distro will urge the next major distro in line to pay attention to this particular problem.

    It'd suck big time if all major distros committed the same mistake and then had to wait 5-6 months to correct it, giving companies like MS ample time to astroturf/FUD in their favor
  • It's hard to support a release-timing proposal from either Shuttleworth, whose company released a bug-riddled Long Term Support release, or Seigo, whose project just released a desktop environment that is almost unusable.
