
Is Ubuntu Getting Slower?

An anonymous reader writes "Phoronix has a new article where they provide Ubuntu 7.04, 7.10, 8.04, and 8.10 benchmarks and ran many tests. In that article, using an Intel notebook, they witnessed major slowdowns in different areas and ask the question: is Ubuntu getting slower? From the article: 'A number of significant kernel changes went in between these Ubuntu Linux releases, including the Completely Fair Scheduler, the SLUB allocator, tickless kernel support, etc. We also repeated many of these tests to confirm we were not experiencing a performance fluke or other issue (even though the Phoronix Test Suite carries out each test in a completely automated and repeatable fashion), but nothing had changed. Ubuntu 7.04 was certainly the Feisty Fawn for performance, but based upon these results perhaps it would be better to call Ubuntu 7.10 the Gooey Gibbon, 8.04 the Hungover Heron, and 8.10 the Idling Ibex.'"
This discussion has been archived. No new comments can be posted.

  • What hardware? (Score:5, Insightful)

    by el_chupanegre ( 1052384 ) on Monday October 27, 2008 @07:40AM (#25525481)

    Were they testing each distribution on exactly the same hardware?

    If so, it sounds completely fair to me that it would be slower. Go and (try to) install Vista on a machine that originally came with XP (pre-SP1) and see how much slower it is. Is that a fair test? I think not.

    As software gets more useful (and Ubuntu has, Vista not so much) it gets bigger and thus gets slower on the same hardware. Hardware advances at the same time though, so in real terms they keep about equal. When you test new software on old hardware of course it's going to be slower though.

    • Re:What hardware? (Score:5, Insightful)

      by lolocaust ( 871165 ) on Monday October 27, 2008 @07:50AM (#25525571) Journal
      Read the article, please. The exact same releases of software (such as the LAME encoder) shouldn't have a 2-3x decrease in performance.
      • Re: (Score:3, Insightful)

        by e2d2 ( 115622 )

        That would be true if software was given 100% CPU devotion. But software doesn't operate in a bubble like that and hence other needs are given CPU time, in turn slowing things like the LAME encoder down.

        It's something worth noting though, it's a real performance hit and perhaps something can be done about it in future releases.

    • Completely true. For example, the Ibex has much better support for various hardware, and comes with the software and drivers to suit. By default, beryl etc. is enabled for many graphics cards. Gnome's network manager is there too, to support the GSM connections, etc. etc.

      Apart from the fact the LTS releases mean you get security updates for years for certain older versions, there are a host of flavours explicitly aimed at low-end hardware, such as xubuntu.

    • Re:What hardware? (Score:5, Informative)

      by gbjbaanb ( 229885 ) on Monday October 27, 2008 @08:00AM (#25525689)

      When you test new software on old hardware of course it's going to be slower though.

      That's hardly a given, lots of software gets better as it ages - new features are added, but also performance tweaks get added.

      The problem is that software should be getting quicker on the same hardware; the alternative is bloaty apps that no-one wants to use. See Vista for the ultimate conclusion of that. You don't want Ubuntu to end up the same, so it's good that someone is pointing out performance issues. Hopefully the next release will have a few of these issues looked at and improved.

    • Re:What hardware? (Score:5, Insightful)

      by blind biker ( 1066130 ) on Monday October 27, 2008 @08:07AM (#25525739) Journal

      I've heard this argument a wee bit too often. Maybe software should be more useful while at the same time NOT getting slower? Maybe that would be a good thing, as it would then run well on netbooks too. What do you think?

    • Re:What hardware? (Score:5, Insightful)

      by RAMMS+EIN ( 578166 ) on Monday October 27, 2008 @08:08AM (#25525751) Homepage Journal

      You make it sound like it is inevitable and acceptable that newer software is slower than older software. I disagree. For one thing, one way to improve software is to make it faster. This is actually done sometimes. Secondly, even if you add features to software (which is another way to improve software), that doesn't have to make the software slower. In some cases, this may be inevitable, but in many cases it is not.

      I personally see computers, and software, as tools for making life more efficient. When software becomes slower, efficiency is actually lost. When this isn't offset by providing me with a more efficient work flow, I lose efficiency. Since efficiency is the main reason I use computers in the first place, this is a big deal.

      • Re: (Score:3, Insightful)

        by Hoi Polloi ( 522990 )

        That assumption, that software gets slower as newer versions come out, is the "Windows Effect". People have grown up with Windows bloatware and assume that behavior is the norm. They are more concerned with adding new bells and whistles than with revisiting existing code. When I work on software releases, one of the main things we do is not only add new functionality but improve the performance of existing code, especially by taking advantage of new hardware/db tech features.

    • Re: (Score:3, Insightful)

      Well, so much for the "it runs great on old hardware" argument. A million lonely bloggers' "Top 10 reasons to switch to Ubuntu" lists just became top 9 lists.
  • "That's quick" (Score:2, Informative)

    by G3ckoG33k ( 647276 )

    "That's quick" was the phrase my girlfriend used after an update of Debian Sid to include KDE 4.1 and OpenOffice.org 3.0 from Experimental. "Wish my slow machine at work was this quick."

    You don't have to guess what OS she is using there...

    Anyhow, once you replace 3.5.x with KDE 4.1 you will notice a difference. At least I did. (No, I didn't read the article first... Bad boy.)

    • by iworm ( 132527 ) on Monday October 27, 2008 @07:49AM (#25525559)

      How many geeks have heard such phrases: "'That's quick' was the phrase my girlfriend used after..."

      Alas.

    • Agreed; from my gut-level appraisal I thought KDE 4.1 was a lot faster, but I returned to 3.5 because it seemed more polished, complete and well rounded. Little things like how items were implemented; the widgets all seemed to lack some finesse.

      Having said that in another few months I will definitely make the switch - because I really liked what I saw and the direction it was heading in.

      You expect more functionality to run slightly slower - that is if you assume the original implementation was part way o
  • As we add complexity and layers of abstraction things tend to slow down in general. If hardware keeps up, and actual human productivity increases, do we have an issue?

    I'm all for lean and mean, but it's quite possible to optimize a distro for speed as well. Ubuntu getting slower is not a good thing, but slower is better than harder to use. Netbook distros can be optimized for the hardware in question, after all...

    It would be interesting to see how these tests perform across distros, or with a kernel optimiz

    • by siride ( 974284 )
      They haven't added any layers of abstraction over the past few releases. That isn't the problem anyways. I run Gentoo with the same bits of software, but it's not slow at all. And I'm not just talking about boot time. I mean, the UI is actually snappy and programs start up quickly. I don't know what Ubuntu does to make it so shitty, but they have done a really good job of it. Fedora on our workstations at work also seems to have suffered some severe performance regressions over the past few releases.
    • by samkass ( 174571 )

      I'm glad someone made the point. It really is irrelevant how fast the OS performs microbenchmarks. What matters is how fast the user gets things done. If you spend all day encoding MP3s so be it. But for a lot of people, a kernel that's half as fast but makes some complex things simple is the way to go.

      Anyway, that's Apple's philosophy, and why you see Apple not caring so much about kernel benchmarks. That being said, every version of MacOS X has been faster than the last on the same hardware despite

    • Re: (Score:3, Insightful)

      by RAMMS+EIN ( 578166 )

      ``As we add complexity and layers of abstraction things tend to slow down in general. If hardware keeps up, and actual human productivity increases, do we have an issue?''

      You have that exactly right. Software getting slower is bad, but it's ok if it is offset by other changes, such as faster hardware or new, more efficient ways to perform tasks. In the end, it's our productivity that counts. Now the real question is, how do we measure that, how has it developed over time, and what changes have had the great

  • xubuntu (Score:3, Insightful)

    by kisak ( 524062 ) on Monday October 27, 2008 @07:46AM (#25525535) Homepage Journal
    Would have been interesting to have the same benchmarks for Xubuntu, since that is the distribution targeted at computers where a performance increase/decrease is very noticeable.
    • Re:xubuntu (Score:5, Informative)

      by Mr.Ned ( 79679 ) on Monday October 27, 2008 @08:15AM (#25525823)

      Xubuntu's performance targeting appears limited to choice of desktop environment, which was a small component of what these benchmarks tested. The big performance increases the article talks about were in databases, compilers, encryption, memory access, and audio/video encoding/decoding, none of which really have much to do with the desktop environment.

  • Real men (Score:3, Funny)

    by Anonymous Coward on Monday October 27, 2008 @07:48AM (#25525551)
    Real men don't upgrade their OS for exactly this reason. In fact, we don't even use OSes. To get maximum performance we write all operations directly to RAM in machine code, while the machine is running, using a needle and a car battery.
  • For some of these tests, such as the SQLite test, I wonder whether they used the SQLite shipped with Ubuntu or built and ran their own on each system under test.

    This matters because Ubuntu comes with different versions of SQLite.

    However, if there is a problem then I hope they report it on Launchpad. I haven't noticed any slowness myself.

  • Security Patching? (Score:5, Informative)

    by Ironsides ( 739422 ) on Monday October 27, 2008 @07:51AM (#25525585) Homepage Journal
    Ok, the article completely ignores this (as do most of the above posters it appears). Each version of Ubuntu tested seemed to have different kernel builds. How much of the slowdown is due to the kernel being patched for security and bugs and how much is due to the software that has been added?
  • by Peter Desnoyers ( 11115 ) on Monday October 27, 2008 @07:52AM (#25525597) Homepage

    If you look closely you'll notice that (a) the benchmarks were run on a Thinkpad T60 laptop, and (b) there were significant differences on some benchmarks like RAM bandwidth that should have little or no OS components.

    This sounds to me like the power management was dialing down the CPU on the later releases...

    • by trumplestone ( 820102 ) on Monday October 27, 2008 @08:15AM (#25525829)

      Mod parent up.

      Many of the benchmarks (such as the lame, Java and RAM bandwidth benchmarks) are CPU-bound, and will run the majority of their time in userspace. Since the kernel should only be invoked for timer ticks, interrupts, TLB misses, etc. (which would probably account for less than 1% of the CPU time), any change to the kernel should have minimal impact on these benchmarks.

      The parent's comment that power settings have been misconfigured sounds spot-on.

    • Re: (Score:3, Informative)

      by Cthefuture ( 665326 )

      Yes! In particular, check the "ondemand" CPU frequency governor. That thing just doesn't work very well. It takes too long to trigger the higher clock speed, and if you have multiple CPUs and/or are running lots of quick processes then the clock will constantly be shifting between speeds. This totally kills performance.

      I turned off the CPU scaling on my Ubuntu workstation and I disable it on my laptop when I need maximum performance.

      This can be fixed with two changes to the ondemand profile. First it should bum
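        A quick way to check whether the ondemand governor is in play, as a sketch: the sysfs path below is the standard Linux cpufreq interface, but whether it exists at all depends on the kernel and hardware.

        ```shell
        # Read the active cpufreq governor for cpu0, if the kernel exposes it.
        gov_file=/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
        if [ -r "$gov_file" ]; then
          gov_msg="current governor: $(cat "$gov_file")"
        else
          gov_msg="cpufreq not exposed on this machine"
        fi
        echo "$gov_msg"

        # To pin the CPU at full speed for a benchmark run (needs root):
        #   echo performance | sudo tee "$gov_file"
        # or, with the cpufrequtils package of that era:
        #   sudo cpufreq-set -g performance
        ```

        Re-running a benchmark under the fixed "performance" governor is a cheap way to tell whether a slowdown is the scheduler's fault or the frequency scaler's.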

      • Re: (Score:3, Informative)

        by Bert64 ( 520050 )

        Most CPUs can't allow cores to run at independent speeds...
        On the other hand, AMD quad cores can, and I'm glad to have one core running full speed processing a single-threaded program, and the other 3 cores as slow as possible to handle the background OS tasks..

  • by Scholasticus ( 567646 ) on Monday October 27, 2008 @07:54AM (#25525611) Journal
    "Ubuntu" -- an African word, meaning "Slackware is too hard for me".
    • Re: (Score:2, Funny)

      by Anonymous Coward

      "Slackware" -- an African word, meaning "Gentoo is too hard for me".

    • Re:Ubuntu? No way. (Score:5, Insightful)

      by thatskinnyguy ( 1129515 ) on Monday October 27, 2008 @08:16AM (#25525857)

      "Ubuntu" -- an African word, meaning "I'm sick of fucking with Linux in order to get it to do what I want but I really don't like the alternatives."

      Yeah, I rocked Gentoo for a couple of years. I just want something that is fast, easy to use and gives me as little of a headache as possible. Linux is Linux and most of the knowledge learned in one distro will carry-over to another.

      • Re:Ubuntu? No way. (Score:5, Insightful)

        by FrozenFOXX ( 1048276 ) on Monday October 27, 2008 @08:48AM (#25526391)
        Mod the parent up, I'm in the exact same boat. Ran Gentoo for three years, loved it, but realized that I was spending a lot more time tinkering with the OS than actually USING the OS. That's perfectly fine for a server of some variety where really the tinkering IS the interaction, but for a desktop or something you're going to use more interactively on a daily basis it became too much of a pain.

        Ubuntu gives me some of the strengths I liked (such as a simple, straightforward package manager, wide amount of customization without too much screwing around) without too many of the weaknesses (compiling all software, praying emerging the world doesn't break my desktop, so on and so forth).

        It's not a bad distro at all and it's tiring to hear of people slamming it for not being Slackware or Gentoo. This may come as a revelation, but Linux is about choice.
        • Re:Ubuntu? No way. (Score:5, Insightful)

          by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Monday October 27, 2008 @09:39AM (#25527167) Journal

          I was pretty much exactly the same.

          Turning point for me was realizing that I was compiling more and more in, just in case I needed it, because rebuilding world just to enable that new USE flag was getting kind of old.

          In other words, I was using it like Ubuntu. The only advantage I had was I would compile for -march=i686, and other optimizations which produce binaries which only work on recent CPUs (the '686' class) -- whereas Ubuntu was -mtune=i686, if I remember, so it was possible to run on a 486, but would run best on a 686.

          And, hey, there were other things I would turn on that were Athlon XP specific, and so on... then I realized that, on amd64, the optimizations were basically exactly the same -- merely compiling for x86_64 gave me all the benefits anyway. At which point, what the hell -- Ubuntu would necessarily be at least as optimized as my Gentoo.

          And, more recently, I've realized that since switching to Ubuntu, I spend much more time actually using the OS, rather than tweaking it. Despite having it already much more customized than any version of Windows ever was, I still don't spend as much time tweaking it as it takes to maintain Windows, let alone Gentoo.
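          The -march vs. -mtune distinction described above can be seen with any small program; this is a sketch (the file names are made up, the flag semantics are standard GCC):

          ```shell
          # -mtune optimizes instruction scheduling for a CPU while keeping the
          # binary runnable on older processors; -march may emit instructions
          # that older CPUs simply don't have.
          cat > hello.c <<'EOF'
          #include <stdio.h>
          int main(void) { puts("built"); return 0; }
          EOF

          gcc -O2 -mtune=generic -o hello_tune hello.c   # portable binary, tuned scheduling
          gcc -O2 -march=native  -o hello_arch hello.c   # may only run on this CPU family
          ./hello_tune
          ./hello_arch
          ```

          Both binaries behave identically here; the difference only shows up in which machines the -march build can run on, which is exactly why a distro kernel/userland defaults to the conservative flag.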

    • Re: (Score:3, Funny)

      by elrous0 ( 869638 ) *
      "Linux"--the Finnish word for "Many confusing forks"
  • by mlwmohawk ( 801821 ) on Monday October 27, 2008 @07:54AM (#25525617)

    Some of the benchmarks were hardware tests, and those showed variation. They should not have, unless the compiler changed the algorithms used to compile the code between distros.

    Benchmarking a multi-tasking system like Linux is a tough thing to quantify. The Linux kernel recently had a big scheduler change; this alone could account for shifting benchmark numbers. It may not actually be "slowing down," but rather running multiple programs more evenly. The effective work is the same or better, which would mean "faster," but an almost useless benchmark would look slower.

  • I may be a bit biased here. Full disclosure: I am an Ubuntu Member, User, Abuser. I think that Ubuntu is one hell of a Distro, and GNU/Linux is one hell of an OS. Ubuntu, however is not geared for the market where we squeeze every CPU cycle we can. For that you have to do a _LOT_ of cleaning, replace your kernel to something a bit more fit for a server environment. Ubuntu is, and will always be a distro that is Easy to use first, even if that comes at the expense of some kruft. Each distro is becoming m
  • Yes, absolutely! (Score:2, Informative)

    by Phreakiture ( 547094 )

    Yes, Ubuntu is getting slower, absolutely, without question on my part.

    My single biggest complaint against 8.04 was that it could not get out of its own way to play an MP3 on my somewhat modest hardware (Via MII-12000). It runs fine on my wife's machine, however (AMD Sempron on Via MoBo).

    Now, it is possible that the slowdown is only with 32-bit versions. My wife's machine is running the 64-bit version, and seems to run pretty well. In the mean time, I have reverted to Slackware, which has always been my

  • Don't tell me Ubuntu is getting slower. On my eee PC with a stock hardware configuration, I go from a cold boot to the desktop in around 40 seconds whereas with XP it was taking about 3 minutes on average. Granted the sub-distro (if you would call it that) that I am using is optimized for my setup.
  • by Anonymous Coward on Monday October 27, 2008 @08:16AM (#25525841)

    ... they are.

    Seriously.

    I can see several problems with the testing methodology as is:

        * The test suite itself: The Phoronix test suite runs on PHP. That in itself is a problem-- the slowdowns measured could most likely be *because* of differences in the distributed PHP runtimes. You can't just say "hey, version Y of distro X is slower than version Z! LOLZ" because, WTF. You're pretty much also running different versions of the *test suite* itself (since you have to consider the runtime as part of the test suite). Unless you remove that dependency, then sorry, you can't measure things reliably. Which brings me to my second point...

        * What exactly are they testing? The whole distro? The compiler (since most of the whole of each distro version is compiled with different versions of GCC)? The kernel? If they're testing the released kernel, then they should run static binaries that *test* the above, comparing kernel differences. If they're testing the compiler, then they should build the *same* compiled code on each version and run said compiled code (which is pretty much what I gather they're doing). If they're testing the utilities and apps that came with the distro, then they should have shell scripts and other tools (which run on a single runtime, not depending on the runtime(s) that came with the distro version). Because if you don't, you have no fucking clue what you're testing.

    Honestly, I was unimpressed by the benchmarks. I happen to do performance benchmarking as part of my job, and I can tell you, you have to eliminate all the variables first -- isolate things to be able to say "X is slow". If you rely on a PHP runtime, use *exactly* the same PHP runtime for all your testing; otherwise, you'll get misleading results.
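    One way to remove the toolchain-and-runtime variable described above: compile a single CPU-bound binary once, then carry that exact binary across every release under test. A sketch, with a made-up workload:

    ```shell
    # A tiny fixed workload; build it ONCE and copy the same binary to each
    # Ubuntu release, so compiler and runtime differences drop out of the result.
    cat > bench.c <<'EOF'
    #include <stdio.h>
    int main(void) {
        volatile unsigned long long sum = 0;   /* volatile defeats loop elision */
        unsigned long long i;
        for (i = 0; i < 50000000ULL; i++)
            sum += i;
        printf("%llu\n", sum);
        return 0;
    }
    EOF
    gcc -O2 -o bench bench.c   # ideally also -static, so libc differences drop out too
    time ./bench               # compare the wall-clock time across releases
    ```

    Any remaining difference between releases is then attributable to the kernel and system environment rather than to which GCC or PHP happened to ship with each version.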

  • by siride ( 974284 ) on Monday October 27, 2008 @08:27AM (#25526029)

    The GTK+ statistics are mind-bogglingly slow. That's what I notice most when I use Ubuntu or Fedora. On my non-Ubuntu laptop, I get the following results for GTK performance:

    GtkDrawingArea - Pixbufs: 3.73s (on mine) vs 43-55s (Ubuntu)
    GtkRadioButton: 13s vs 29-60s

    I just think that's ridiculous. What did they do to GTK+ to make it so slow?

    • Xorg, mainly. (Score:4, Informative)

      by ciroknight ( 601098 ) on Monday October 27, 2008 @08:55AM (#25526499)
      Namely, graphics hardware: ATi graphics hardware and the FGLRX driver. FGLRX is known to have crappy 2D performance relative both to its very strong 3D performance and to the 2D performance that the open source driver excels at.

      Meanwhile, 2D performance on Intel's hardware is smoking everyone else's pipe.
  • Comment removed (Score:4, Interesting)

    by account_deleted ( 4530225 ) on Monday October 27, 2008 @08:35AM (#25526173)
    Comment removed based on user account deletion
  • by Blice ( 1208832 ) <Lifes@Alrig.ht> on Monday October 27, 2008 @08:48AM (#25526385)
    Ubuntu isn't really suited for low-end machines anymore, IMO. Has to be at least 2 GHz, with at least 512 MB RAM.. Anything lower and it's going to be pretty slow.

    I'm sure some people here are going to be comparing Ubuntu to Vista in regards of getting slower with release, but because it's Linux we have a few more choices.

    They make distros that are meant to be lightweight- Anyone with a machine that's a little old, I urge you to try one of them. You'll be pleasantly surprised and maybe find a new favorite window manager/desktop environment in the process. Before you start talking about how joe sixpack doesn't want to try another distro or learn anything about Linux- I'm not talking to Joe sixpack here, I'm talking to you, a slashdot lurker.

    Try one of these distros and be amazed at how fast everything is:

    Crunchbang Linux [crunchbang.org]

    KMandla's GTK 1.5 Remix [wordpress.com]

    Or, if you want to be more adventurous, get Arch Linux [archlinux.org] and grab a window manager like Openbox or PekWM. If you go that route, take a look at this Openbox guide that'll show you a nice panel to use, file navigator, and generally hold your hand through the process, here [wordpress.com]. But if you want your hand held even more, someone packaged a panel and file navigator and theme chooser and stuff like that together with Openbox already- Called LXDE. You can just grab that too, should be in any repository.

    I do think it's unfortunate for joe sixpack that it's getting a little slower- But for them it's still faster than Vista and XP, right?

    You know what they should make? They compile pretty much everything in the kernel as a module, and then they probe hardware and load the right modules each time you boot... It would be cool to be able to do a "Speed up my computer" boot where it loads the modules, and then compiles a kernel with the modules for the hardware it finds compiled into it. Disable things that it hasn't seen their computer use, etc., and then just still probe the hardware to fall back on another kernel if things have changed.

    OR, how about loading modules when you actually need them..? And this goes for daemons, too. When you go to listen to something, and it returns that there's no module loaded for sound, how about loading the module then, and then starting the alsa daemon. Have you ever looked at the daemon list for Ubuntu? It's huge. I know I don't need all of those- I know because on the distro I'm on now I only run 3 daemons on boot, and everything works fine.

    I don't know. Maybe that's not the solution. But those guys are clever, I'm sure they can come up with something to get rid of the extra daemons and modules running without sacrificing usability. Anyone here have any good ideas..?
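    Before trimming anything, it helps to see what is actually loaded and started. A minimal look, hedged because tool availability and init layout vary by system (the /etc/rc2.d path is sysvinit-era Ubuntu):

    ```shell
    # Count loaded kernel modules, if lsmod is available in this environment.
    if command -v lsmod >/dev/null 2>&1; then
      mod_msg="modules loaded: $(lsmod | tail -n +2 | wc -l)"
    else
      mod_msg="lsmod not available here"
    fi
    echo "$mod_msg"

    # Daemons started in the default runlevel on sysvinit-era Ubuntu:
    ls /etc/rc2.d/S* 2>/dev/null || echo "no sysvinit runlevel scripts found"
    ```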
  • by calc ( 1463 ) on Monday October 27, 2008 @09:17AM (#25526829)

    I believe that PIE (position independent executable) along with some other security enhancements was turned on in Ubuntu around the time the slowdowns showed up. This would definitely cause at least some slowdown on the 32-bit version, since there aren't enough registers to begin with. I'm not sure if it causes any noticeable slowdown on the 64-bit version, since the amd64 architecture has a lot more available registers, which would correlate with the person mentioning earlier that the 64-bit version seemed fast.
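    Whether a binary was built as a position-independent executable is visible in its ELF header. A sketch (t.c is made up; -no-pie needs a reasonably recent GCC, hence the silent fallback):

    ```shell
    cat > t.c <<'EOF'
    int main(void) { return 0; }
    EOF

    # PIE binaries show ELF type DYN (shared object) instead of EXEC.
    gcc -fPIE -pie -o t_pie t.c
    readelf -h t_pie | grep 'Type:'

    # For comparison, a non-PIE build, skipped if the flag is unsupported:
    { gcc -no-pie -o t_nopie t.c 2>/dev/null && readelf -h t_nopie | grep 'Type:'; } || true
    ```

    On 32-bit x86 the PIE build loses a register to the GOT pointer, which is the register-pressure cost the comment above describes; on amd64 the RIP-relative addressing makes it nearly free.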
