Software Ubuntu Linux

Linux Letting Go: 32-bit Builds On the Way Out (theregister.co.uk) 378

An anonymous reader shares a report from The Register: Major Linux distributions are in agreement: it's time to stop developing new versions for 32-bit processors. Simply: it's a waste of time, both to create the 32-bit port, and to keep 32-bit hardware around to test it on. At the end of June, Ubuntu developer Dimitri Ledkov chipped into the debate with this mailing list post, saying bluntly that 32-bit ports are a waste of resources. "Building i386 images is not 'for free', it comes at the cost of utilising our build farm, QA and validation time. Whilst we have scalable build-farms, i386 still requires all packages, autopackage tests, and ISOs to be revalidated across our infrastructure." His proposal is that Ubuntu version 18.10 would be 64-bit-only, and if users desperately need to run 32-bit legacy applications, they'll have to do so in containers or virtual machines. [...] In a forum thread, the openSUSE Chairman account says 32-bit support "doubles our testing burden (actually, more so, do you know how hard it is to find 32-bit hardware these days?). It also doubles our build load on OBS".
  • by Sasayaki ( 1096761 ) on Tuesday July 05, 2016 @01:48PM (#52450469)

    I think that the trouble finding testing hardware is quite telling.

    Can end users even buy a new, off-the-shelf 32-bit system these days, except for specialized devices like embedded systems?

    Is there anything more than a relatively tiny fraction of the user base that is stuck on 32-bit hardware, that can't use virtual machines to run that software on something that's not a potato?

    And I mean, it's not like the old 32-bit versions of OSes are gone. Windows 95 is still around. It didn't go away. I'm willing to bet there are still Windows 95 machines running mission-critical systems somewhere in the world.

    Yes, there are no security updates, but just unplug it from the internet and you're safe from the vast majority of attacks, and if you're worried about local access to your Windows 95 machine... install a thicker door?

    At some point technology has to move on.

    • by bugs2squash ( 1132591 ) on Tuesday July 05, 2016 @01:57PM (#52450577)
      I don't happen to know; maybe someone has figures. But I would have thought that the 32-bit x86 embedded Linux market was quite large.
    • Can end users even buy a new, off-the-shelf 32-bit system these days, except for specialized devices like embedded systems?

      While 32-bit embedded is certainly a huge market, the use case for 32-bit Ubuntu isn't new off-the-shelf systems, it's grandma's old XP machine that needs rescuing. Embedded gear is more likely to run a lightweight Debian.

      But perhaps Ubuntu isn't the right choice for Grandma at this point. Probably more people would go for Mint anyway (and they're more likely to be impacted by Ubuntu's decision).

    • by l2718 ( 514756 ) on Tuesday July 05, 2016 @02:04PM (#52450683)

      The problem is not newly-bought consumer electronics or legacy software. The problem is legacy hardware. I'm still using the Thinkpad I bought in 2006 (4:3 aspect ratio display). Luckily it has a 64-bit processor, but others have older 32-bit machines.

      It's also not about the kernel -- Linux itself will support 32-bit architectures for a long while yet, and most software will compile correctly on both 32-bit and 64-bit, though that will become less and less true as distributions stop their QA and you are left with only the upstream development team.

      Of course, these old machines are pretty few, so it probably does make sense for Ubuntu to drop 32-bit packages. Other more enthusiast-targeted distributions will probably keep 32-bit support. In particular Gentoo compiles everything locally.

      • CERN still have plenty of crusty old hardware. They produce a 32-bit version of Scientific Linux for that reason. With a bit of faffing, you can even get it to run on non-PAE processors.
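
        A quick way to triage such boxes, assuming a Linux shell on them (a sketch):

            # "lm" (long mode) means 64-bit capable; "pae" means PAE is supported
            grep -qw lm /proc/cpuinfo && echo "64-bit capable" || echo "32-bit only"
            grep -qw pae /proc/cpuinfo && echo "PAE supported" || echo "no PAE"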

    • For computers? Quite some time. There was a one-off Atom netbook chip back in 2008, and before that was Core (the predecessor to the more popular, 64-bit capable Core 2) in 2006-2007 and some of the early Pentium 4s up to 2005. On the AMD side, you have to go back to K7, which stopped being made in 2005. So everything that you'd want to run a desktop distro on is at least eight years old.

      Intel did make x86-32-only chips for smartphones until much more recently, but you wouldn't want to run a desktop distro on those either.

    • by jandrese ( 485 )
      Aren't most Atom chips IA32? While this is technically "embedded systems", in practice you generally run a full Linux distro on it.
    • How would running a 32-bit program in a virtual machine on 64-bit hardware help? You would still need a 32-bit OS, right?
    • "I'm willing to bet there are still Windows 95 machines running somewhere in mission critical systems in places around the world."

      As of about 5 years ago there was one in a paper mill I was working at. Since they were only using it because there were no new drivers for the thing it was plugged into, I'm willing to bet it's still there unless it broke.

      The IT guy had a copy of Windows 1.0 in a drawer in the server room too.
    • by Darinbob ( 1142669 ) on Tuesday July 05, 2016 @04:46PM (#52452175)

      What the hell? 32-bit CPUs are everywhere. The article is talking about PC builds -- x86 clones, in other words; only a Wintel person thinks that is the only architecture out there. Meanwhile, the Linux kernel supports 29 different architectures.

  • ... though I suppose one can continue to run an old version. And it's not reasonable to expect open source volunteers to do double work. But it is still a loss.
  • Why not just drop the 32-bit boot part and only keep the compat 32-bit libs? Like how Windows Server 2008 and newer does it on the Windows side.

    Why cut off apps that can run today on a 64-bit system without needing any VM BS?

    • Why not just drop the 32-bit boot part and only keep the compat 32-bit libs? Like how Windows Server 2008 and newer does it on the Windows side.

      Why cut off apps that can run today on a 64-bit system without needing any VM BS?

      If you had read TFA, you might have found out that's exactly what most Linux distros are doing. 32-bit library support isn't going anywhere, just the installer ISOs for i686 and older.

    • That is the plan as described by TFA.

  • 32-bit != i386 (Score:5, Insightful)

    by Anonymous Coward on Tuesday July 05, 2016 @02:07PM (#52450703)

    Posts like this always confuse me. The terms i386 and 32-bit are not interchangeable. AFAIK, they were only talking about getting rid of the i386 architecture (i.e., 20+ year-old 32-bit hardware), but would maintain i686 (more recent 32-bit hardware) support.
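
    As an illustration of the naming, on an installed system you can check both the kernel flavor and a binary's ELF class (a sketch; assumes uname and file are present):

        uname -m       # e.g. "i686" on a modern 32-bit install, "x86_64" on 64-bit
        file /bin/ls   # reports "ELF 32-bit LSB executable, Intel 80386 ..." on any 32-bit build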

  • by Eravnrekaree ( 467752 ) on Tuesday July 05, 2016 @02:18PM (#52450829)

    So much for Linux being "great for old hardware". This is a dubious move by distros that ignores a huge area where Linux can see use: old hardware where Windows won't run. There is another aspect to this as well: you're basically trashing 32-bit app support if you do not include 32-bit libraries or provide a thunk between 32-bit apps and 64-bit libraries.

    Even if 32-bit libraries are not built, you should be able to run a 32-bit app by compiling the libraries yourself, so distros could at least allow people to build 32-bit libraries easily from source packages (with the benefit of automatically building all dependencies).
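
    As a rough sketch of that build-it-yourself route on a 64-bit Debian/Ubuntu host (package names vary by release, and legacy.c is a placeholder):

        sudo apt-get install gcc-multilib   # 32-bit toolchain support on a 64-bit host
        gcc -m32 -o legacy32 legacy.c       # emits an i386 ELF binary
        file legacy32                       # "ELF 32-bit LSB executable, Intel 80386 ..."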

    Another area where this will create problems is VMs, even on recent hardware: Intel chips up to just a year or two ago didn't include VT-x (the "ring -1" hypervisor mode), which means that virtualization of 64-bit OSes will not work.

    • by Groo Wanderer ( 180806 ) <charlie@@@semiaccurate...com> on Tuesday July 05, 2016 @02:29PM (#52450959) Homepage

      You might want to think about what you just said, or read the blurb of the article you are commenting on. It specifically states "Major Linux distributions", which are not the ones that tend to support ancient, embedded, long-life, or related non-consumer/non-traditional server workloads. In short, there are tons of distros, hundreds likely, that will cater to 32-bit and even 8/16-bit hardware, because that is all that is needed for the job they do.

      Go look at Linaro's work; it isn't technically a distro, but it supports some pretty 'craptacular' hardware, at least by modern users' standards. How long do you think your router can live with 'only' 32-bit SoCs? Do you think DD-WRT will get a massive boost from 64-bit code? How about your dishwasher? There are distros that cater to all those markets, and they are not moving to 64-bit only.

      In short, nothing will change for 99.(big number)% of users; those that need 8/16/32-bit code will still have distros to do it. Anyone wanting to run those distros as a modern desktop or server, well, enjoy it, but I am not a masochist so I won't be joining you. For everyone else, carry on; you won't notice anything but better wares and cheaper devices.

  • I don't see the problem here. While it may be tedious, it's not technically difficult to build your own 32-bit packages from source, at least most of the time -- the (64-bit) package maintainers have done most of the hard work already, identifying dependencies and whatnot. You may occasionally have to do some troubleshooting, especially as time goes forward -- but this is the sort of thing where the Linux community really shines.

    Heck, you might even learn something.

    • by guppysap13 ( 1225926 ) on Tuesday July 05, 2016 @02:35PM (#52451029)
      This is certainly a place where *nix excels. I've started mucking around with an old PowerBook G4 because it's easier to carry around than my main workhorse. Debian, Gentoo, and FreeBSD all run on it happily even though it's hard to find new hardware to test on. Gentoo and FreeBSD treat ppc32 as a "second tier" platform - they'll still auto-generate the installers and configure package dependencies, but they won't check for errors during the build, and bugs in ppc32 won't delay a new release. It's up to users to submit bug reports/patches or fix issues as they come up. Transitioning i386 to this level of support is far from the end of the world.
  • by allquixotic ( 1659805 ) on Tuesday July 05, 2016 @02:30PM (#52450975)

    We at least need enough 32-bit packages available in the 64-bit distro (whether by dpkg --add-architecture i386, or by installing "lib32" packages like we used to do) to install and run Wine.

    You see, to run Win32 programs, your Wine binary needs to be a 32-bit Linux/ELF application. I suppose it could emulate across architectures, but Wine prides itself on *not* emulating native code (for performance). Otherwise it would be as slow as a software virtualization solution like Bochs or (non-KVM) QEMU.

    Wine, in turn, depends on a number of system libraries for core services. It then implements common Windows APIs "in terms of" available platform libraries. Direct3D in terms of OpenGL; DirectSound in terms of libasound2 or libpulse; etc. These libraries, linked into a 32-bit binary, must also be 32-bit.

    I agree that there's no point in testing 32-bit *hardware* any longer, but I hope they continue to ship 32-bit *builds* (even if they stop making 32-bit installation CDs). There's just too much software on the Win32 platform that needs to run on Linux (desktop OR server; see game servers) to abandon this segment of the market.
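
    For reference, the multiarch route looks roughly like this on a Debian-derived system (a sketch; the exact 32-bit package set Wine needs varies by distro and release):

        sudo dpkg --add-architecture i386                 # enable the i386 package architecture
        sudo apt-get update
        sudo apt-get install libc6:i386 libstdc++6:i386   # 32-bit runtime libraries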

    • by LichtSpektren ( 4201985 ) on Tuesday July 05, 2016 @02:41PM (#52451077)
      Read TFA. Nobody's killing 32-bit libraries. Only .ISOs for 32-bit CPUs.
      • by fgouget ( 925644 ) on Tuesday July 05, 2016 @05:08PM (#52452343)

        Read TFA. Nobody's killing 32-bit libraries. Only .ISOs for 32-bit CPUs.

        The fine article says

        His proposal is that Ubuntu version 18.10 would be 64-bit-only, and if users desperately need to run 32-bit legacy applications, they'll have to do so in containers or virtual machines.

        This suggests it's not just the ISOs that they plan to get rid of, but also support for 32-bit applications, which includes Wine (for running 32-bit Windows applications). So yes, that's pretty worrying for Wine, as a lot of Windows applications are either still 32-bit only or depend on a 32-bit installer. Furthermore, one of the great advantages of Wine is that you do away with all the annoyances of VMs. So using "containers or virtual machines" is really not much of a solution.

        • by stevied ( 169 )
          To run 32-bit apps in a container, you'd still need 32-bit libs. So it implies some support would be kept.
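
          A minimal sketch of that container approach, assuming debootstrap and systemd-nspawn are available (a 64-bit kernel runs i386 userland natively through its 32-bit syscall compatibility layer):

              sudo debootstrap --arch=i386 stable /srv/i386-root http://deb.debian.org/debian
              sudo systemd-nspawn -D /srv/i386-root   # a shell inside the 32-bit userland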
  • do you know how hard it is to find 32-bit hardware these days?

    What's there to "find"? You can kick off a 32-bit VM under any hypervisor — in the cloud or on your own desktop. You can automate the VM creation and tear-down on your build farm quite easily.
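
    For instance, with QEMU/KVM (a sketch; the ISO name is a placeholder):

        qemu-img create -f qcow2 test32.img 10G
        qemu-system-i386 -enable-kvm -m 1024 \
            -cdrom ubuntu-16.04-desktop-i386.iso -hda test32.img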

    I too strongly prefer to have a system where size_t is equal to off_t (so you could mmap an entire file and not worry about it), but that is not "free". 64-bit pointers are, obviously, twice as wide as 32-bit ones, so "hairy" structures — with lots of pointers in them — nearly double in size. If none of your processes require more than 4GB of virtual memory, there is no reason — other than the developers' laziness — to go 64-bit.
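
    The doubling is easy to verify empirically (a sketch; assumes gcc with multilib support installed):

        printf '%s\n' 'extern int printf(const char *, ...); int main(void) { printf("%d %d\n", (int)sizeof(void *), (int)sizeof(long)); return 0; }' > sizes.c
        gcc -m64 sizes.c -o s64 && ./s64   # prints "8 8": pointers and longs are 8 bytes
        gcc -m32 sizes.c -o s32 && ./s32   # prints "4 4"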

    Whether it is an OS embedded inside a router or a point-of-sale machine, or even a single-user web-and-email desktop, 32-bit is perfectly sufficient and the overhead of 64-bit is not justified.

    And that laziness is what is holding us back... Over the last 18 years, according to Moore's law [wikipedia.org], our computers have become at least 2^12 times more powerful. Now ask yourself: is the user experience — however you choose to measure it — 4096 times better than it was in 1998? And if it is not, where did the gains in hardware go?

    By refusing to set up and use tens or even hundreds of 32-bit test systems, developers force thousands or even millions of users to upgrade. That is not a fair trade-off.

    • by Burdell ( 228580 ) on Tuesday July 05, 2016 @03:28PM (#52451485)

      If none of your processes require more than 4GB of virtual memory, there is no reason — other than the developers' laziness — to go 64-bit.

      First, addresses/pointers aren't normally the largest chunk of code or data memory usage, so the increase in RAM usage is far less than double.

      Also, in the specific case of the Intel x86 architecture (which is what this is about, not general 32 bits vs. 64 bits), there is a significant reason to move from i386 to x86_64. The i386 architecture has a very small CPU register set, compared to most modern CPU architectures (and some instructions can only use certain registers). That means lots more things require memory loads/stores, which is bad for performance. When AMD created x86_64, they added a bunch of registers (and got rid of most of the usage restrictions), so 64 bit code performance is better.
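
      One way to see the register difference in practice (a sketch; assumes gcc with multilib): compile the same function for both targets and compare the generated assembly.

          printf '%s\n' 'long sum(const long *a, int n) { long s = 0; for (int i = 0; i < n; i++) s += a[i]; return s; }' > regs.c
          gcc -O2 -S -m32 regs.c -o regs32.s   # only eax/ebx/ecx/edx/esi/edi/ebp to work with
          gcc -O2 -S -m64 regs.c -o regs64.s   # r8-r15 are also available to the compiler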

  • by kwerle ( 39371 ) <kurt@CircleW.org> on Tuesday July 05, 2016 @02:45PM (#52451123) Homepage Journal

    There are a bunch of desktop distributions that will no longer do 32-bit builds. Makes sense.

    No effect on the kernel or on distros for 32-bit systems/embedded/etc.

  • by watermark ( 913726 ) on Tuesday July 05, 2016 @03:58PM (#52451789)

    I felt a great disturbance in the Force, as if millions of Intel Atom netbooks suddenly cried out in terror and were suddenly silenced. I fear something terrible has happened.

    AKA, my netbook :(
