Ubuntu Isn't Becoming Less Open, Says Shuttleworth 98

sfcrazy writes "While the larger Ubuntu community was busy downloading, installing and enjoying the latest edition of Ubuntu yesterday, a post by Ubuntu founder Mark Shuttleworth ruffled some feathers. He gave the impression that from now on only select members of the community will be involved in some development and it will be announced publicly only after completion. There was some criticism of this move, and Shuttleworth responded that they are actually opening up projects being developed internally by Canonical employees instead of closing currently open projects. He also made a new blog post clarifying his previous comments: 'What I offered to do, yesterday, spontaneously, is to invite members of the community in to the things we are working on as personal projects, before we are ready to share them. This would mean that there was even less of Ubuntu that was NOT shaped and polished by folk other than Canonical – a move that one would think would be well received. This would make Canonical even more transparent.'"


  • by MichaelSmith ( 789609 ) on Friday October 19, 2012 @05:48PM (#41710905) Homepage Journal

    I upgraded to 12.10 last night and spent the morning with a non-functional system. Disabling my external monitor has stopped the UI from hanging. At the moment it looks like the window manager (or what passes for one these days) can't cope with multiple monitors, at least configured the way I use them (laptop with a large external monitor, laptop screen configured to sit geometrically below the external monitor). I noticed that windows on the laptop screen go into a mode where the window border pulses, as if something in the window manager is thrashing.
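
    For reference, this is roughly how I force that stacked layout from a terminal with xrandr; the output names LVDS1 and HDMI1 below are placeholders, and `xrandr -q` shows the real ones on any given machine:

    ```shell
    # List connected outputs and their available modes first
    xrandr -q

    # Place the laptop panel (LVDS1) directly below the external
    # monitor (HDMI1); output names vary per machine, so these two
    # are placeholders taken from the `xrandr -q` listing
    xrandr --output HDMI1 --auto \
           --output LVDS1 --auto --below HDMI1
    ```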

    • by MrEricSir ( 398214 ) on Friday October 19, 2012 @05:55PM (#41710961) Homepage

      Are you using a proprietary video driver? I've had much better luck using the open source drivers with dual monitors on Ubuntu.

      (And yes, that goes for both Unity and Gnome 3.)

      • Graphics driver says "INTEL IGD X86 MMX SE2" so yes I suppose I am using the proprietary video driver. I had a quick look around the system but I can't find the app for choosing proprietary drivers. I will look into an alternative. I just noticed that when an application "dims" to indicate that it is running slow, the window border flashes in the way I described above. Maybe that was a different issue but it doesn't look good.

        Thanks for the suggestion.
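
        In case anyone can point me in the right direction, here's what I can check from a terminal; a sketch assuming standard tools are installed:

        ```shell
        # Show the GPU and which kernel driver is actually bound to it
        lspci -nnk | grep -A3 -i 'vga'

        # On Intel graphics this typically reports "Kernel driver in use: i915",
        # which is the open-source driver; check which graphics module is loaded
        lsmod | grep -E '^(i915|nouveau|radeon|nvidia|fglrx)'
        ```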

        • by Anonymous Coward

          Intel's drivers are open-source.

          Try using KDE, rather than Gnome?

          • Or just go back to 12.04

            • I'm still running 10.10 because I can't upgrade to 12.04 until it stops randomly crashing; [from memory] I'm testing the nVidia 295.40 driver and that's still getting the odd crash (as in X crash... and I get logged out).

              Am I just unfortunate in this? I've got a 9800GT video card btw.
              • Geforce 8/9 GPUs are dying, Netcraft confirms it. Well, they suffer from a hardware manufacturing defect that may manifest after years; I sure had a few crashes with my 8400GS, first when testing OpenArena (a game that is worth keeping for testing purposes but is so inferior to Quake 3 that it's not worth playing). Hard to tell if it was just the driver crashing or the hardware.

                • Naah... I'm running this hardware with Ubuntu 10.10 (and WinXP for games) and it's fine. I'm testing 12.04LTS with an external drive, but even with fairly limited use, it crashes and when I look in the Xorg log, it'll be an nvidia driver that caused it.
          • by Anonymous Coward

            Yeah that's the great thing about the free desktop experience a.k.a The Linux Desktop Shuffle. There's a seemingly endless supply of half-assed perpetually 80% done bug-ridden bits of desktop infrastructure you can sift through whenever you have a problem, until you find a combination that offers the best compromise between something that kinda-sorta-works well enough and something you actually want to use.

            • by Anonymous Coward

              You just described every piece of proprietary software on the planet. Or at least anything with a price tag over $400.

            • Yeah that's the great thing about the free desktop experience a.k.a The Linux Desktop Shuffle. There's a seemingly endless supply of half-assed perpetually 80% done bug-ridden bits of desktop infrastructure you can sift through whenever you have a problem, until you find a combination that offers the best compromise between something that kinda-sorta-works well enough and something you actually want to use.

              I think you should get back to arguing which versions of Windows are the good ones, or why first versions of windows are always rubbish, or discuss how awful Mac OS was before BSD came to the rescue.

              The truth is that Linux has a choice of mature desktop environments that are bug-free and feature-complete. Linux may have problems, but your lie isn't one of them.

        • The only graphics drivers on Linux for Intel chips are open source, so if you have an Intel GPU, you can't be using proprietary drivers.

      • Sadly, ATI decided to stop support in its closed-source driver for the FirePro M7740 chip, which Dell sold me in a "workstation-class" laptop less than three years ago. So I'm already using an open-source driver despite its inferior performance.

        But unfortunately, even the open-source driver (or some other part of X or the kernel) leaves the display goofed up if I suspend/resume. If I'm able to somehow get to a terminal, I can run "killall gnome-session", which seems to do the trick. But long-story short,
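
        For anyone else stuck there, the recovery dance looks roughly like this; a sketch of what works for me, not a proper fix, and it does close every running app:

        ```shell
        # When the display is wedged after resume, switch to a text
        # console with Ctrl+Alt+F1 and log in, then restart the session.
        # WARNING: this kills all open applications in the session.
        killall gnome-session

        # Ctrl+Alt+F7 switches back to the graphical VT afterwards
        ```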

        • by MrEricSir ( 398214 ) on Friday October 19, 2012 @06:32PM (#41711233) Homepage

          Sadly, ATI decided to stop support in its closed-source driver for the FirePro M7740 chip, which Dell sold me in a "workstation-class" laptop less than three years ago.

          As someone who's been in the same boat, I don't think it's fair to blame the manufacturer here. Your hardware didn't change -- your software did.

          Blame whoever broke binary compatibility with the existing driver.

          • Sadly, ATI decided to stop support in its closed-source driver for the FirePro M7740 chip, which Dell sold me in a "workstation-class" laptop less than three years ago.

            As someone who's been in the same boat, I don't think it's fair to blame the manufacturer here. Your hardware didn't change -- your software did.

            Blame whoever broke binary compatibility with the existing driver.

            Dell advertised the M6500 laptop (my laptop with the FirePro M7740) chip as having Linux as a "supported" operating system. I realize they made a vague claim in that statement, but when a laptop vendor and graphics chip vendor are asking you to shell out unreasonable money for a "workstation-class" chip, one of their main justifications is top-notch driver support. The M6500 isn't even out of warranty, and Dell+ATI (since they teamed up on that combo) aren't supporting current versions of the Linux kernel

            • Don't you have the original driver that your system came w/? What happened to make it suddenly not work - did you do kernel updates, or change the DE or DE version, or something like it?
              • Dell didn't ship a Linux driver with the laptop.

                • If they promised "Linux support", but failed to supply Linux drivers, I would consider that mis-selling. I'd say you were owed a full refund there.

                  • If they promised "Linux support", but failed to supply Linux drivers, I would consider that mis-selling. I'd say you were owed a full refund there.

                      I wish, but I have to disagree. It wouldn't be accurate for me to say they provided no Linux support at all. For example, I've run into a number of issues, and they never gave me any hassle just because I was running Linux. (All of my issues were hardware-related, so I never tested their willingness to sort out Linux-specific issues.)

                      In this case, I'd say they haven't supported Linux as excellently as I'd have liked. Really good Linux support would have meant ensuring that ATI's proprietary d

          • Blame whoever broke binary compatibility with the existing driver.

            That's not how it works. Windows would have the same problem; upgrade to a new Windows and now your old graphics driver doesn't work. The difference is that ATI supports their old cards longer in their Windows driver than they do in their Linux driver. ATI brings out a new driver for a new version of Windows, they bring support for the old cards to it. ATI brings out a new driver for a new version of Linux, and they don't support the old cards. In this case, it is only appropriate to blame ATI. ATI is actua

    • by HJED ( 1304957 )
      Likewise: for me the upgrader removed every single entry from grub, so I had to work out how to boot using the grub command line (which I had no idea how to do for a Linux kernel). Eventually I managed to get into a busybox shell and chroot in to run grub-install and update-grub, but most users who hit such an error would have been left with an unusable system.
      It also uninstalled my display manager, kdm, but I reinstalled that before I rebooted.
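
      For the record, the recovery went roughly like this. The device, partition, and kernel names below are placeholders from my setup; adjust them from what `ls` shows at the grub prompt:

      ```shell
      # At the bare GRUB prompt: point it at the partition holding /boot,
      # then boot a kernel by hand (paths and devices are placeholders):
      #   grub> set root=(hd0,msdos1)
      #   grub> linux /boot/vmlinuz-3.5.0-17-generic root=/dev/sda1
      #   grub> initrd /boot/initrd.img-3.5.0-17-generic
      #   grub> boot

      # Once booted (or chrooted into the disk from a live USB):
      sudo grub-install /dev/sda   # reinstall GRUB to the MBR
      sudo update-grub             # regenerate /boot/grub/grub.cfg
      ```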
    • by Pausanias ( 681077 ) <pausaniasxNO@SPAMgmail.com> on Friday October 19, 2012 @11:51PM (#41712651)

      I learned the hard way that non-LTS Ubuntu releases are alpha software. LTS releases are beta software on release day. Wait for the .1 release of LTS and you've got a good stable system.

      The biggest problem with installing non-LTS is that any bug reports are fixed in the NEXT version and they don't give a damn about the version you're actually reporting from. THEY treat it as alpha, therefore you should not be surprised.

      -Written from 12.04.1

      • Unfortunately I think that even the LTS releases are alpha-quality software. If a recent version of Windows or OS X were released that buggy, it would be a catastrophe for the company.

        Now, I actually like Ubuntu very much and am glad that they release new versions periodically, even if they aren't perfect. As Steve Jobs said, "real artists ship". But if desktop Linux is some day really going to make it big, there have to be much more robust quality assurance systems in place. We'll see.

      • Just avoid Ubuntu. I have one machine running Ubuntu, which is a little ARM-based laptop. Unlike most other machines, Canonical was paid by the manufacturer to support Ubuntu on this device. So how did that work? Well, on boot udevd jumps to using 100% of the CPU and the only way of getting a vaguely responsive system is to kill it. Unity is painfully slow and uses up so much RAM that even running a single large application (e.g. LibreOffice or FireFox) causes thrashing. Oh, and most of the system dia
    • Ah ok, we'll see next Monday: my story is that I tried upgrading an existing 12.04 and all seemed to go well until the installer started complaining that I had chosen to "hold back broken packages". Once it rebooted, grub barfed and dumped me to its command line.

      My guess is that the new package manager took half an hour to abort the install or run through the b0rked list of packages, broke the previous one in the process, and rebooted, so that now I'm left staring at the old grub deployment.

      It's kind of annoying

    • I gave up on multiple monitors in Linux until the next massive wave of X updates. Which is sad, because I have two 20" displays right next to one another...

    • If you want fewer bugs, then Ubuntu LTS is really the way to go. LTS releases are expected to be relatively stable for 5 years. When you are on the quick release cycle, anything can happen. This is the same principle as stable / testing / unstable in Debian. When you are on the bleeding edge, things break. When you are using the stable version, you should be able to expect that very few things will ever break. I wish more Ubuntu users paid attention to this principle, especially during the ear
      • by rjha94 ( 265433 )

        If you want fewer bugs, then Ubuntu LTS is really the way to go. Those LTS releases are expected to be relatively stable for 5 years.

        I am not sure that 12.04 LTS is all that rock stable. I installed 12.04 on Rackspace (using their image) and mysql refused to start because of an AppArmor bug. If you search Launchpad you can find that bug. Now, mysql is a big and fairly well-known package, and a lot of people would be using it on servers. I understand the rationale of "it will be fixed soon", "someone already has the hack" and "you fix it, you did not pay for it". However, just imagine how surprised you would be if it were an LTS release. I do not t
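
        The workaround I ended up with was putting the MySQL AppArmor profile into complain mode; a sketch, assuming apparmor-utils is installed:

        ```shell
        # Check which AppArmor profiles are loaded and in what mode
        sudo aa-status

        # Switch the mysqld profile to complain mode so AppArmor logs
        # violations instead of blocking them (path may vary by release)
        sudo aa-complain /usr/sbin/mysqld

        sudo service mysql restart
        ```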

    • I'm using 12.04 with dual monitors right now, and it works OK. Did they break something in the new version?

      (In case it helps- if you're using the Nvidia proprietary drivers, you need to use the Nvidia settings menu, rather than the default "Displays" menu, in order to configure dual monitors. Not sure why, but there you go)

  • Mini-mod me (Score:4, Insightful)

    by noobermin ( 1950642 ) on Friday October 19, 2012 @06:02PM (#41710995) Journal

    Before you mention a) your technical problem with 12.10, b) your disgust with Unity, or c) your leet alternative of cinnamon/openbox/awesome/i3/dwm/twm/tmux/screen/tty2, can we save those for the appropriate forums or articles? This article is about Ubuntu becoming more closed, not about Unity specifically or otherwise.

    • I absolutely adore how you actually got the alternatives in some sort of logical order.

    • Re:Mini-mod me (Score:5, Insightful)

      by Anonymous Coward on Friday October 19, 2012 @06:24PM (#41711169)

      Yes, and no.

      Like it or not, the 'off-topic' flood shows what really bugs people about the topics Ubuntu and Shuttleworth.

      Holding too close to 'the topic' can make the forum too much into an audience for press releases.

    • by Anonymous Coward

      Before you mention a) your technical problem with 12.10, b) your disgust with Unity, or c) your leet alternative of cinnamon/openbox/awesome/i3/dwm/twm/tmux/screen/tty2, can we save those for the appropriate forums or articles?

      Techdirt has a concept it calls CwF+RtB: Connect with Fans & give them a Reason to Buy. I think that's a very appropriate prism to view this entire thing through, and also your comments. The underlying issue with Ubuntu is that it is losing this concept and is no longer offering either.
      a) Technical issues => negative reason to buy
      b) disgust with unity => negative connecting with fans
      c) alternatives => fans going elsewhere to get what they need

      This article is about Ubuntu becoming more closed, not about unity specifically or otherwise.

      Actually, I'd argue that this article IS about Ubuntu being cl

    • Re:Mini-mod me (Score:4, Insightful)

      by marcosdumay ( 620877 ) <marcosdumay.gmail@com> on Friday October 19, 2012 @08:18PM (#41711863) Homepage Journal

      Your post only goes to show that Ubuntu clearly has image problems. In a way, it always has, but they were restricted to the same people who care whether it is getting less open now. Nowadays they've annoyed so many people that the old-time haters are outspoken, and can't even have a coherent conversation with the newbie haters.

      Ok, on topic: people have always complained that Ubuntu was too closed, and that it was getting even more closed, except for a small period when they started to cooperate with Debian. In fact, it doesn't seem to be getting more closed: it installs closed software by default (it has always done that), it mixes proprietary software with free software in its repos (again, as always), it installs software with a big risk of patent-infringement lawsuits by default (not new), it gets money from private entities (as always), and it customizes a few things the way its patrons like (that's new; it used to inherit its patrons' customizations).

      Personally, except for installing too much closed software by default, I don't care about any of the above. And even with the proprietary software, I care mostly because it is low quality, and wouldn't care if I could just ignore it.

  • There has to be a way to make this statement more clear. The "less" coupled with the "NOT" is too close to a double negative for my lightning-fast reads: "This would mean that there was even less of Ubuntu that was NOT shaped and polished by folk other than Canonical – a move that one would think would be well received."
  • by Culture20 ( 968837 ) on Friday October 19, 2012 @06:14PM (#41711073)
    ...to closed source software. And incestuous design methods. And to advertising money.
    • And now I must ask: if it is becoming more open to closed source software, is it more or less open?

      Because I can think of some arguments for "more" and some arguments for "less", but I'm tending to answer "the two aren't related at all".

  • You are not less perfect than Lore

  • Ubuntu is not becoming more or less open or closed. It's always been as it is: the SABDFL's distro. Then it was brown, now it is bruise coloured; then it was Warty, now it is cool to find Amazon suggestions when searching for files and applications. Ubuntu's main problem is that Mark says things. He should just do what he does and have another person speak for Ubuntu, one who won't have to "correct misperceptions" because they won't actually know what Mark is doing and so can just say nice things.
    • While I'm not a fan of some of his decisions, I don't think his autonomy is a bad thing. The SABDFL's distro is there if you want to use it. If you prefer a more communal approach to software bundles, there's Debian. The whole point of Ubuntu is to take Debian and give it a different focus, and I think it serves a very valid purpose. Its popularity is a testament to that.

      • by Anzhr ( 1132621 )
        Yes, I think so too. The kerfuffle is beside the point. Mark will do as he wishes, as openly and as not, as he wishes. And even if one doesn't care for it, Ubuntu still is a pretty good version of Debian unstable to roll one's own with Fluxbox or Openbox. Or to use in its Xubuntu form.
  • Unity is now too slow to run inside VirtualBox, even with Hardware Acceleration and the Guest Additions
  • I'm replacing Ubuntu with Debian! ... oh wait, I already did that like a year ago. But I'm even more glad about that decision now.

  • After trying to use the beta, and now the release, and after months of fighting Unity in the prior versions: I got so fed up that I actually started creating my own OS from scratch! Well, from Assembly... Initially anyway.

    First I made a hex editor for RAM (in under 446 bytes) that can call into the edited memory. I wrote that to a USB drive and plugged it into a spare computer, which is now Dev Machine Zero. After booting the MBR hex editor I created a "Save RAM Segment to Disk" routine by manually inputting binary op-codes (machine code). Once I could save my work from RAM to disk, I began work on a simple 2-stage chaining boot loader -- it already lets me multi-boot and supports my extensible hash-based encryption [project-retrograde.com], which I use for signing/decrypting the 2nd stage loader and primordial kernel. As soon as I'm done implementing keyed SHA3 I'll use it to support full drive encryption at boot. It's been a little over a week of evenings and my bootstrap loader now replaces GRUB on all my systems. I'm also about 1/4 of the way through my new assembler (it currently handles a subset of 8086 only); when it's done I'll extend the assembler using itself to support macros and finally begin bootstrapping myself into a compiler for a higher level language, like C (or maybe a C-ish lang of my own design).
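
    For anyone wondering where the 446-byte limit comes from: a 512-byte MBR sector minus the 64-byte partition table minus the 2-byte boot signature. Writing the editor into a sector looks roughly like this, sketched against a scratch image file with a zero-filled stand-in binary so nothing here touches a real disk:

    ```shell
    # Stand-in for the real 446-byte editor binary
    head -c 446 /dev/zero > editor.bin

    # Blank 512-byte sector image (a scratch file, not a real device)
    dd if=/dev/zero of=disk.img bs=512 count=1 2>/dev/null

    # Machine code occupies bytes 0-445; bytes 446-509 hold the partition table
    dd if=editor.bin of=disk.img conv=notrunc 2>/dev/null

    # Bytes 510-511 must be 0x55 0xAA (octal \125 \252) or the BIOS won't boot it
    printf '\125\252' | dd of=disk.img bs=1 seek=510 conv=notrunc 2>/dev/null
    ```

    On a real machine you'd point `of=` at the USB device instead of disk.img, double-checking the device name first.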

    I sometimes do low-level work on custom embedded systems, so I know a bit about OS development and design. I could use a cross compiler and/or a VM in a host OS, but where's the fun in that? Besides, I can PROVE my bootstrap and compiler process didn't inject any back doors (as in Ken Thompson's Trusting Trust [catb.org]). There simply was no room for back doors; I can "trust no one" because every last byte is accounted for.

    It's been forever since I wrote any Real Mode code; Ah fond memories: Outputting MOD files to the PC speaker, low res 320x200 256c graphics, direct disk IO, 640K + "High Memory"... I'll almost be sad to make the switch into Protected Mode and write the device drivers & file systems.

    Well, Thanks Ubuntu! I've had this idea for an Agent oriented OS kicking around for a while -- If it weren't for your usability failures pushing my frustrations over the edge I would still just be thinking, "Any idiot could do better than this!" instead of actually giving it a shot. Also, to all those "why re-invent the wheel" types: When's the last time you saw a wagon wheel on a sports car, eh?

    I'm still a loyal NetBSD & Slackware luser, but screw Ubuntu. I still have to use Ubuntu for testing packaging of my other projects, but instead of fighting the UI or glitches now I just take a deep breath, get a fresh cup of coffee and add a new feature to the only OS developed with my usability in mind.

    • One of the other commenters quoted Linus, "Talk is cheap, Show me the Code". I have that on my coffee mug right now :-)

      Unfortunately most of my work is in machine code and although it works for me, it isn't well tested since that's not my main concern just yet. When I get to the point of having more stable code in a more readable form than raw op-codes or half-implemented ASM, then I'll be sure to release it. Until then, here's the raw memory bootable hex editor I mentioned: Hexabootable [vortexcortex.com]. This one ha

  • After the numerous bungles with Unity, Amazon, and other decisions, maybe Mark has learned something. Having outsiders inside would help reveal mistakes before they become public. That would likely solve the publicity problems facing Ubuntu.

  • So... I understand that some of these practices are not typical of the open source development model, like putting more emphasis on donations and sponsors, and having a closed core developer group.

    But maybe these are just damn practical moves. Maybe the extra cash will help iron out the horrible number of bugs, and improve performance and hardware support. Maybe having a controlled development team will bring a clearer focus on technology and design, without there being a million APIs and UIs fight

  • The approach used with Ubuntu angers a lot of Linux and GPL fans because it's not in the spirit of the GPL. BSD folks accept that someone can use their code for whatever purposes, including making money on closed source software. We're OK with this. From the viewpoint of the project's culture and licensing considerations, it doesn't make sense that Ubuntu is a Linux distro at all.

    If you're a GPL person, I think you should be annoyed at what they've done. Ubuntu is clearly a business and meant to be monetized. T

  • by Anonymous Coward

    Not only Ubuntu, but also e.g. NVIDIA make the same mistake:

    It's _us_, the geeks, that install and recommend software (and hardware) for all our friends, friends of friends and our companies.

    I don't like Ubuntu anymore simply for this statement that they _want_ to abuse my friends' brains for their advertisements, so the next 100 Linux installations won't be Ubuntu anymore but probably plain stable Debian from now on. ... just like all the PCs I recommend to friends don't contain NVIDIA but integrated Intel gr
