Should There Be an 'Official' Version of Linux? (zdnet.com)
Why aren't more people using Linux on the desktop? Slashdot reader technology_dude shares one solution:
Jack Wallen at ZDNet says establishing an "official" version of Linux may (or may not) help Linux on the desktop increase the number of users, mostly as someplace to point new users. It makes sense to me. What does Slashdot think and what would be the challenges, other than acceptance of a particular flavor?
Wallen argues this would also create a standard for hardware and software vendors to target, which "could equate to even more software and hardware being made available to Linux." (And an "official" Linux might also be more appealing to business users.) Wallen suggests it be "maintained and controlled by a collective of people from users, developers, and corporations (such as Intel and AMD) with a vested interest in the success of this project... There would also be corporate backing for things like marketing (such as TV commercials)." He also suggests basing it on Debian, and supporting both Snap and Flatpak...
In comments on the original submission, long-time Slashdot reader bobbomo points instead to kernel.org, arguing "There already is an official version of Linux called mainline. Everything else is backports." And jd (Slashdot user #1,658) believes that the official Linux is the Linux Standard Base. "All distributions, more-or-less, conform to the LSB, which gives you a pseudo 'official' Linux. About the one variable is the package manager. And there are ways to work around that."
Unfortunately, according to Wikipedia... The LSB standard stopped being updated in 2015 and current Linux distributions do not adhere to or offer it; however, the lsb_release command is sometimes still available. On February 7, 2023, a former maintainer of the LSB wrote, "The LSB project is essentially abandoned."
That post (on the lsb-discuss mailing list) argues the LSB approach was "partially superseded" by Snaps and Flatpaks (for application portability and stability). And of course, long-time Slashdot user menkhaura shares the obligatory XKCD comic...
It's not exactly the same thing, but days after ZDNet's article, CIQ, Oracle, and SUSE announced the Open Enterprise Linux Association, a new collaborative trade association to foster "the development of distributions compatible with Red Hat Enterprise Linux."
So where does that leave us? Share your own thoughts in the comments.
And should there be an "official" version of Linux?
Relevant XKCD comic (Score:5, Interesting)
You're already picturing it.
It'd be wonderful to see some forward progress in the Linux community in the form of unification. In a way, distros are already converging, with the move towards rolling releases spearheaded by Arch and, in a way, now by Valve.
It's an exciting time. Dare I say... the year of the Linux desktop?
Re: Relevant XKCD comic (Score:4, Insightful)
Unified with corporate overseers, it would quickly become locked down and turn into everything we left the other OSes to get away from. Can't have it all, but if it becomes unified we won't have anything.
And it would be a bloated version with all sorts of ways to do things, because no one could agree on one way and so everyone's wishes got added in order to get theirs in. Changes, BTW, would take years because no one could agree on what changes were needed and how to implement them. At least naming it would be easy: "Linux by Committee; Everything you want and so much more..."
Re: (Score:3)
A government, any government, should have enough resources to take a version of Linux, fork it and call it $government Linux. There you go.
Not quite. I'm expecting the government to have an OpenBSD sort of approach. Continuous security surveys and improvements. Proactive, not reactive. Something more involved than SELinux.
Re:Relevant XKCD comic (Score:5, Interesting)
It's an exciting time. Dare I say... the year of the Linux desktop?
When I look around me at those using the internet, I see an endless sea of portable devices. It actually annoys me when I see so many trying to navigate a website with a 5" screen. Some even own laptops and desktops, but the excuse is always a matter of convenience. And almost every one of those devices is running some form of Not-Windows.
Ironically enough, I feel we're already in the "Year" of the Linux Desktop. Just took so damn long we got rid of the desk and replaced it with a pocket.
Re: Relevant XKCD comic (Score:4, Insightful)
Most leisure usage of computing devices is just consuming content - scrolling a browser, watching videos etc. PCs are losing ground because you can do that just fine on phones and tablets. Unless you're into productive activities or gaming, you can probably live without a PC.
I think you're missing the point. The massive popularity of not-a-PC OSes right now is demonstrated in almost every portable "smart" device out there.
Ironically enough, *NIX/BSD spent 30 years trying to make it to the (corporate) desktop, and instead made it into damn near everything else. Maybe it was never truly meant for the desktop, except to develop more versions to shove in every not-a-desktop...
This just illustrates what the real problem is. (Score:3, Insightful)
and instead made it into damn near everything else
Why did it make it into everything else? Because that everything else consists of a) Android, which is controlled by a single entity which enforces compliance to standards, or b) one-off single or special purpose devices that don't need to conform with each other - things like embedded controllers, smart devices, printers, etc.
Maybe it was never truly meant for the desktop
It's not a question of meant to. It's a question of doing what is necessary to make it happen.
The problem is the diversity in user and software API. A diversity in user interface...
The problems are inherent to the model (Score:2, Insightful)
> That set us back fifteen years of desktop adoption.
The actual things keeping linux off most desktops are:
1: Lack of a stable, consistent, open, free OS API. USB; sound; inter-application messaging; windowing and all its components such as desktop widgets, toolbars, menus, line drawing, fonts, images, icons, sprites, multimedia, networking, statistics, password management, etc. for quite a long list. Some of this is there and consistently supplied with every Linux, but a lot of it isn't and resides...
Re: (Score:3)
The GPL
... has not been the least impediment to making the Linux kernel the most used kernel on the planet. The Linux kernel is probably on your phone, and in one or more of your TVs. There's a good chance it's in your printer. It's almost certainly in your router. These days it might be in your microwave and/or washing machine.
Any suggestion that the GPL promotes commercial adoption of the kernel and yet somehow impedes desktop is just nonsense.
API stability and consistency are absolutely impediments though...
Re: (Score:2)
I'd be willing to bet that consuming content is all a lot of desktops are used for as well. I know personally this weekend I've made a few forum/discussion posts (including this one) and that is about it. I've also watched over an hour of YouTube, doom-scrolled my Facebook feed, read a book (PDF file of one anyway) and it is still pre-dawn (barely) on Sunday morning. Still got almost 24 hours of weekend to go, though I may be going fishing all day.
Granted, producing content on a "real" computer with a...
Re: (Score:2)
A Linux that fully cooperates with phones is a must. Most of the people I have helped only need internet and an office pack, so often a Chromebook is enough, otherwise an old MacBook.
The reasons to avoid Linux for those I have helped have always been confusion about the different interfaces and terrible problems with printers. Using a Mac solves this by having a coherent design standard that people can relax with; Windows changes too much, and Linux is a sea of variable-interface distros that causes headaches.
Re: (Score:2)
The reasons to avoid Linux for those I have helped have always been confusion about the different interfaces and terrible problems with printers. Using a Mac solves this by having a coherent design standard that people can relax with; Windows changes too much, and Linux is a sea of variable-interface distros that causes headaches
Few normal people need to customize, they just need to work
Few normal people need to get beyond a LaserJet 4 driver to print. Especially to work. How old is CUPS now? Printers are still Linux kryptonite after 30 years? Larger MFPs are standalone systems that can shit output in any network hole you want, and in many common formats (e.g. PDF).
If it's 2023 and the excuses are still "problems with printers", then I'd say you have your answer as to why Linux hasn't quite made it to the desktop. Especially the corporate desktop.
Re: (Score:3)
If it's 2023 and the excuses are still "problems with printers", then I'd say you have your answer as to why Linux hasn't quite made it to the desktop. Especially the corporate desktop.
Considering the amount of problems we're having getting printers to work after we moved from a distributed model to a centralized model, where one agency controls all print servers and print drivers, Linux can't be much worse.
In case you're wondering, this is all because someone had to justify their existence under ITIL.
Re: (Score:3)
Printers suck on everything except Linux. lpr is a gem.
Honestly it is sad I hardly have to print.
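For anyone who has forgotten how little there is to it, the day-to-day CUPS commands are basically these (the queue name below is made up):

    lpstat -p -d                      # list print queues and the default destination
    lpr -P office_laser report.pdf    # send a PDF to a named queue
    lpq -P office_laser               # see what is sitting in the queue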
Re: (Score:3)
Printing issues are usually confined to the cheap home-user printers, which are proprietary and entirely software-driven to reduce costs.
The higher-end printers used by most corporates tend to support PostScript or PCL, and usually have no problem working with Linux.
Re: (Score:2)
The reasons to avoid linux for those i have helped has always been confusion about the different interfaces and terrible problems with printers
Either buy a printer which is not manufactured by Satan himself (i.e. Brother) and have a life which doesn't suck on any OS or buy a different printer and be miserable trying to get the motherfucker to not work like shit on any OS.
Few normal people need to customize, they just need to work
This is one of the most bizarrely persistent memes in the world of computing...
Re: (Score:2)
And if you can't win...move the goalposts! :D
Re: (Score:3, Insightful)
*Rolling releases*. The *worst* thing for stability on the server side, and the *worst thing* for end users!
The "average" end user gets confused if menu items change. If options move, or disappear. If a ctrl-key combo becomes different, or does something different.
Yet with rolling releases, you're in an endless sea of 'new'. New options, new look, new theming, all thrust upon the end user with nary a warning! Rolling releases would be a NIGHTMARE for the average end user, the WORST THING EVER for a desktop...
Re: Relevant XKCD comic (Score:2)
Android already exists too.
"Official linux(tm)" would also need drivers and a build for the hw.
Basically android ecosystem is exactly how it would go and its already going.
Auto Analogy (Score:2)
Should there be an official, standard, reference model 'car'?
Seriously? (Score:4, Insightful)
Wallen argues this would also create a standard for hardware and software vendors to target, which "could equate to even more software and hardware being made available to Linux"
Most drivers are already in the kernel, which is standard across all Linux installations. I don't think it's too much to ask vendors to create 2-3 distributables for their software. After all, they usually provide software for Mac, Windows 32-bit, Windows 64-bit, and sometimes ARM. Providing an .rpm, a .deb, and a self-installer should be pretty standard (and it usually is).
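As a rough sketch of how little is involved (the package name and layout here are hypothetical): a .deb is a staged directory with a DEBIAN/control file, an .rpm is driven by a spec file, and the rest is one command each:

    dpkg-deb --build mytool-1.0/ mytool_1.0_amd64.deb    # build a Debian package from a staged tree
    rpmbuild -bb mytool.spec                             # build an RPM from a spec describing the same payload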
Beyond that, I really don't understand the purpose of "One Linux" or an "Official Linux". Canonical, IBM, and SUSE all have different priorities and behave differently. None of them are going to adhere to an "Official Linux", because then there won't be anything that sets them apart. Would they even modify their software packaging to adhere to the same standard? I doubt it. And rightly so. Because people have different needs and priorities as well. And that's what's great about Linux. You're not locked into one "Windows 11" that's the same for all (sans home vs. professional). You get to pick what fits you best. To me the idea of standardizing Linux is a step backward. Let the flavors come, and the users will crown the winners, just like they do right now.
The reason vendors don't provide their software for Linux is not because it's more work to do or that it's hard. It's because there's not enough bang for buck. Increase the installation base of Linux and Adobe will release a version of Photoshop for Linux, just like vendors eventually decide to release a MacOS version of their software because of Apple's market share.
Re:Seriously? (Score:5, Interesting)
Hardware support is solved: upstream your driver, and you're good. Don't want to do that? Tough luck, stable internal driver API/ABI will not happen. Upstreaming is better for the end-users, and the world as a whole. The countless devices stuck (especially routers) on archaic versions of the kernel are a security nightmare.
I'll probably get downmodded to oblivion for this, but here goes:
Systemd did a lot to make Linux "standardised". When packaging, you create a unit/mount/network/timer file, ship it, and it works on all major distros. I co-maintain a proprietary package, and systemd allowed us to strip a lot of distro-specific cruft.
And it still allows the end user to be very much in control of the system.
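To illustrate the point, a stripped-down unit of the sort we ship (the names and paths here are invented, not our actual product):

    [Unit]
    Description=Example vendor daemon (hypothetical)
    After=network-online.target
    Wants=network-online.target

    [Service]
    Type=simple
    ExecStart=/opt/example/bin/exampled --config /etc/example/exampled.conf
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target

Drop that in, run "systemctl enable --now exampled", and it behaves the same on Debian, Fedora, SUSE and friends.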
The rest will never happen. That would require basically locking APIs and especially ABIs (that includes compile flags!) forever. And when breakage occurs, maintain a compatibility layer. MS does that. Hats off.
Docker, Snaps and Flatpaks "solve" this by vendoring almost all libraries. That's an admin's nightmare: basically no way to scan the machine for known broken versions of binaries (.so + exe). Language package managers are similar, although not as dire.
Another option is targeting Wine compatibility. Same problems as above apply.
This is why it's called an ecosystem: stuff keeps moving, and you need to keep up.* If you collaborate with others, the ecosystem will help you. Locking something to a single point in time kills evolution.
*Note: Games. I'm highly conflicted about games in this regard, as those are the prime candidates to be written, released, forgotten. All kinds of emulators fill the compatibility layer gap for consoles, but native games will quickly get lost to history. Kudos to Valve's wine-proton project.
Re: (Score:3)
Games: PS5.
I'm currently playing Skyrim, which is a 10+ year old game that has received several updates, fixes, DLC, etc., not compile, ship, forget.
Playing on console means all developers have the exact same known target, good for many years. Between the PS5 and mobile I haven't even turned on my PC in 18 months.
Re: (Score:2)
Heroes of Might and Magic III, Wizardry 8, Star Wars: Knights of the Old Republic. On PS5; go! I'm waiting. (Thank GOG for keeping them alive)
Tongue in cheek, of course. I just finished The Witcher 3 on PS5, and it was glorious, so I do get your point.
There isn't much value in running photoshop from 1998, but games keep their "value" the same way movies do.
> all developers have the exact same known target good for many years.
Which is exactly why good emulators are possible, and are easier than "regular"...
Re: (Score:2)
It's Windows 7. Are they still making updates? The scary thing would be if I turned it on and there weren't any...
There already is (Score:4, Insightful)
Here's what you get with Linux distros (sans RHEL which is not advertised as a desktop distro, read below):
Now the question is why can't RHEL become a desktop distro? For multiple reasons:
It's kinda astonishing that tech journalists can't get the basics right. It's not about "a new Linux distro", it's about fixing stuff which no one wants to do or does. This article [altervista.org] written in 2009 is as relevant as ever.
Re:There already is (Score:5, Informative)
Also it doesn't help that Linux is in a constant state of flux:
Graphics: We had Xorg, now Wayland is all the rage, but it is completely incompatible with Xorg (XWayland basically allows showing a rectangle with an X11 app and sending input events to it, that's it) and forces weird design decisions [freedesktop.org] which raise many eyebrows. And then there's the drama of Wayland being super fresh, new and forward-looking while ... using implicit sync, which was deprecated over 15 years ago and which is a major source of pain for graphics drivers in Linux.
Audio: OSS? Thrown away. ALSA? Turned out to be unmanageable, superseded with PulseAudio and now PipeWire (which thankfully is fully compatible with PA).
Hardware enumeration and access in userspace? HAL then devfs, now udev.
There's nothing similar or close to Win32 in Linux. You targeted GTK2? Well, it's not maintained, go port to GTK3 and then GTK4. Qt? Qt2/3/4 are all deprecated/unmaintained, and Qt5 is on life support.
Re:There already is (Score:5, Insightful)
I will agree, Wayland is not very good.
Somehow, despite how allegedly hard X sucks, Wayland has taken over 15 years to fail to displace it, and in the meantime they managed to miss many of the bigger problems.
They also "simplified" by pushing out the complexity elsewhere and calling it "out of scope". That does not, of course, simplify the system; at best it slightly simplifies some of the components at the cost of adding complexity elsewhere.
The combination of Wayland + PipeWire + a whole bunch of other libraries is certainly not simpler than X11.
Hardware enumeration and access in userspace? HAL then devfs, now udev.
udev dates from 2003. I don't feel that a 20 year old system represents too much churn.
Re: (Score:2)
The combination of Wayland + PipeWire + a whole bunch of other libraries is certainly not simpler than X11.
I'm using PipeWire with X11, but otherwise I agree with your assessment. (PipeWire replaces both PulseAudio and JACK, and also supports ALSA clients!)
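A quick way to confirm what is actually serving audio on a setup like that (the version string will differ on your system):

    pactl info | grep 'Server Name'
    # Server Name: PulseAudio (on PipeWire 1.0.0)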
Re:There already is (Score:4, Insightful)
Pushing complexity elsewhere by definition creates simplicity. Your use case should not require me to implement something I won't use in my system. Not only is it simpler to compartmentalise functionality, it is literally "The UNIX Way" (TM).
The combination of things I don't have installed for my use case is very much simpler. If you prefer the X11 approach then maybe Windows, an OS full of things you don't need, is more preferable for you.
Re: (Score:3)
Pushing complexity elsewhere by definition creates simplicity.
No it doesn't: by definition it maintains the complexity.
Your use case should not require me to implement something I won't use in my system.
It doesn't. X11 implements the infrastructure for tooling, screen capture etc once and forever. GNOME needs no integrations. KDE needs no integrations. XFCE needs none, and neither do FVWM, TWM, ratpoison, etc etc etc.
None of those need to implement anything for all of that to just work.
If you prefer the X11 approach...
Re:There already is (Score:4, Interesting)
And yes, before someone complains, Wayland does network transparency fine with waypipe: waypipe ssh user@someserver
PipeWire is fully backwards compatible with PulseAudio and ALSA. You can even use an ALSA mixer to set the volume on a PipeWire system.
Microsoft creates and abandons GUI libraries all the time. Win32 was replaced by WinForms, which was replaced by WPF, which was replaced by UWP, which was replaced by MAUI, etc. GTK3 was released 12 years ago and is still supported.
Re: (Score:2)
Should waypipe mature from being a hack into a standardized protocol, with proper forward and backward compatibility, perfect integration with Wayland (right now this seems not to be the case *), perfect interoperability of clients on the display side, and once it is so widely deployed that support is universal, it will basically be like X. Except now an incompatible version.
*) https://github.com/deepin-comm... [github.com]
Re: (Score:2, Funny)
That's why I don't own a car. Until cars are compatible with all my saddles and harness (trust me I TRIED) they aren't true horse replacements.
Re: (Score:3)
"Most" !~= "All", and there's always a long tail. Sometimes the most critical software for users is some niche application that can't be bothered to update to deal with new compatibility issues.
And sure, Microsoft might create-and-abandon libraries all the time. But all the old libraries still work.
Only Apple has the luxury of actually breaking compatibility with new versions of their OSes, and that still causes problems for people who use software from stodgier companies that aren't part of the Apple RD...
Re: (Score:3)
Win32 is there and perfectly usable in Windows 11. It's not gone anywhere. Most Win32 applications from the Windows 95 era work perfectly in Windows 11.
GTK3 is on life support (version 3.24.38 should tell you everything) and can be dropped at any time, and when distros drop it, a ton of GTK3 software will become inaccessible for Linux users...
Re:There already is (Score:5, Interesting)
Next to zero backward compatibility: most distros insist all software must be recompiled for the current version of a distro.
No they don't.
Regressions and bugs galore, kernel updates and GRUB (boot loader) updates rendering your system unbootable or malfunctioning.
What the fuck are you doing? There are, like, three main Linux machines in my household (my laptop, SO's laptop and a shared desktop) plus some Raspberry Pis. These get daily use. This is not a problem I encounter.
Spotty hardware support. It's a ton better than in the late 90s but it's still far from perfect since OEMs are only concerned with their HW working with Windows.
Hardware vendors are notorious for not providing drivers for newer versions of Windows. And these days standards have replaced drivers in many common cases. IPP printers "just work", UVC webcams just work, Bluetooth speakers just work, etc. Plus wifi just works these days, and the three main vendors' graphics cards work (Nvidia probably sells more Linux hardware than Windows hardware these days, given they're the only choice in the cloud).
Linux doesn't support some stuff, but these days neither does Windows. The days where it's a struggle to find enough working hardware for a basic setup are long, long gone.
And if you buy shitty hardware and expect it to "just work", well, back when I used to do friends and family tech support, that stuff is a nightmare on Windows as well. Shit hardware is shit. If you're spending a lot of money and rely on it for daily work, do your research before you buy regardless of which OS you run.
Go visit Arch Wiki
Oh, now you're just being a knob. Arch is supposed to be a bleeding-edge rolling-release distro for people who like fucking with things. It's fun. I dip in and out every few years. The machine I run for work used to run Ubuntu 18.04, now runs Mint 21.1.
Re: (Score:2)
> Now the question is why can't RHEL become a desktop distro?
How to spot a troll:
https://www.redhat.com/en/stor... [redhat.com]
Re: (Score:2, Troll)
For general user desktop? No.
I shouldn't have to be an expert at car engine design and maintenance to drive my car down the street.
Re: There already is (Score:4, Informative)
That's not it. The problem with the command line for the general user is that s/he isn't using it day in, day out. It's full of arcane commands which they will never remember from the last time they used it a year ago.
Apple had a decent take on this years ago with MPW. It gave you a list of common commands from which you could choose or find more. If you selected a command, up would pop a modal dialog. All you needed to do was check boxes, click radio buttons, enter file names, etc. and it would build the command line for you on the fly in a small frame. Each modal dialog was tuned to that specific command, so you didn't get inundated with crap that had nothing to do with the command. You could then elect to execute the command right there with but a button press, or you could copy/paste it into a script.
Until the developers of GUIs for Linux/Unix spend the time and effort to get something that useful, no regular user is going to give a flying rat's ass.
No. (Score:2)
There is an official version of Linux (Score:4, Interesting)
It lives at https://git.kernel.org/pub/scm... [kernel.org]
Re: (Score:3)
And before you complain that that's already in the story: the story starts with the question of whether there should be an official version of Linux and ends with the question of whether there should be an official version of Linux, despite having a mention of the official version of Linux smack in the middle of the story.
Re: (Score:2)
I'm rather certain that when they say "should there be an official version of" they're talking about a distribution rather than a kernel. I've picked Debian as my "official distribution" for the last decade or so, without too many problems. (I'm not really pleased with systemd, but it hasn't caused too many problems. ... Well, not enough that I've more than considered moving to another standard. Like Devuan or Gentoo. [OTOH, I don't have a lot of test machines to try them out on.])
Re: There is an official version of Linux (Score:2)
That's what I was gonna say: the "official" version of Linux resides at kernel.org. It's an old truism that people still roll their eyes over. Linux is a kernel, not an operating system. Linus had chosen GNU as the operating system to support his kernel; the time was right, and GNU was already there, available for "any" purpose. Well, having a complete POSIX-compliant operating system with source available gratis when you are developing a kernel is a win. People fail to remember this distinction, and some even...
Re: (Score:2)
And this is the problem. You point to the "official link" that isn't an OS. You can't go there, download something and have it work.
The short answer is yes... (Score:5, Interesting)
The longer answer is that none of the current distros should be it.
Any official distro should be better than what we currently have. The good news is that the parts are available. The real question is, do we have the will?
An official distro MUST:
- Be fully integrated by default. All apps, all packages.
- Be designed for the home user -> small business user.
- Have use models that fit home, power users, and small businesses.
The distro should support a single NIC and a dual NIC design (dual NIC to isolate a useable network)
As I see it, a SAMBA4 domain would need to be at the center. Be it on a stand-alone WS, or on the home server.
All apps and packages would need to be packaged so as to automatically integrate them with the domain. This means that all SQL users would be added as SAMBA domain accounts. The CUPS server would need to be configured to use the SAMBA domain by default. The Apache web server would, by default, install with SSO. Drupal/Wordpress/whatever would, by default, install with SAMBA domain integration by default.
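As a sketch of the kind of provisioning step the installer would have to automate behind the scenes (the realm, domain name and password below are placeholders, not a recommendation):

    samba-tool domain provision \
        --realm=HOME.EXAMPLE.ORG \
        --domain=HOME \
        --server-role=dc \
        --dns-backend=SAMBA_INTERNAL \
        --adminpass='ChangeMe.12345'

Every packaged service (CUPS, Apache, the SQL server) would then be pointed at that directory during its own post-install step, with no questions asked of the user.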
Users should be able to select their industry and get sane defaults for it. For example, if Bobby is setting up a system for a restaurant, Chromis (and I just picked a random OSS POS system) should be automatically set up for it. Chromis should then use the SAMBA directory for user log-in.
File servers, Remote access servers, VM server, web servers, LTSP cluster, applications servers, (you get the idea), everything that normally ships with this distro should support this integration.
By default, this system should have standard ways of doing things. It should have use models for everything. This makes it a lot easier for users and power users to actually get things done.
Android/iPhone integration is a must.
And when I say by default, I mean no extra steps need to be taken by the end user to do it.
Once a server is installed, any additional systems should be able to just link to that one to get all basic installation and integration done. /home should be shared or cached.
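For example, a hypothetical fstab entry on a client box, assuming the server exports /home over NFSv4:

    homeserver:/home   /home   nfs4   defaults,_netdev   0   0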
OK, now that we have the supported packages taken care of, we let users that have a need get at the unsupported stuff. An intermediate repo with packages that have the ability to integrate with the directory, but that the user will have to follow a tutorial for. This lets users build skills in a manageable way.
Next you have the Universe repo. This is not enabled by default, but has everything, including the kitchen sink. But it's up to the end user to get it working.
Do that, and we have a system that would actually start taking share from, well, everyone.
Re:The short answer is yes... (Score:5, Insightful)
- Be fully integrated by default. All apps, all packages.
Do we get to sue the official distro for antitrust violations? What if my package isn't included? Do I fork it and release my own official distro?
- Be designed for the home user -> small business user.
Who is the home user / small business user? You've just described every use scenario other than a datacentre or standalone server. There's a big difference between someone using a system designed for productivity, vs gaming, vs creative design.
The distro should support a single NIC and a dual NIC design (dual NIC to isolate a useable network)
That doesn't sound like a home or small business user. Now you're talking about network expert / server territory. If your second NIC was a WiFi network you may have a point, but you've already broken your own rules. Just look at the number of home PCs or small business PCs that ship with multiple NICs; it's very much an edge case.
As I see it, a SAMBA4 domain would need to be at the center.
A domain is not the mainstay of home or small business users.
... SQL ... Apache ... Drupal ... SSO
You don't understand what a small business is, do you?
Re: (Score:3)
I do agree Samba is the way
Have a reference distro that everyone forks off of (Score:3)
We can all go our own way, and we can all navigate relative to each other.
Anyone who is just starting out, or who doesn't want paid tech support, can install the reference distro.
If and when this fails to meet your needs, go shopping for a fork.
If something goes wrong, try it again with the reference distro before posting questions online. This way you can narrow down the source of the problem. The reference distro will end up getting...
Re: (Score:3)
Tell us you've never maintained any linux systems, without telling us you've never maintained any linux systems.
Integration (Score:3)
Seamless integration is the secret sauce that Windows and, to some extent, macOS have.
When you log into a Windows server, you get a nice screen listing every service running, every feature, relevant logs and settings, all on a nice dashboard. Everything is designed to run against this dashboard. Even most third-party servers will show up and play nice with it. It's only one example, but almost every service in Windows just works with every other. When you install Exchange, the relevant groups and permissions...
Weird wish (Score:4, Insightful)
That is an interesting question: (Score:3)
However first we need to discuss the understanding of that term:
"Linux" is mostly refered too as being the kernel, and well there is the "mainline" (kernel.org) Linux Kernel.
But I think the post meant "Standard Linux" in the sense of at least a userland.
The term userland could be extended to a plethora of wayland, X, KDE, Gnome, Mate, and what not .. but that discussion would take away the focus, so I for my self will define Standard Linux now
define:
A Kernel compiled with default config with a basic minimal set of userland that enable a person or an automation system to build or install all other software needed, from there on.
And basically we have that with LFS:
https://www.linuxfromscratch.o... [linuxfromscratch.org]
However, with the basic requirement of first building the build environment. The advent of LFS was nearly revolutionary, but that is now forgotten.
My take is: whoever has gotten through the manual process of downloading all packages, building LFS, and operating it will gain a deeper understanding of the basic problem with the "basic Standard Linux Distribution" ("b.S.L.D."), along with the different licenses the packages carry.
But I would like to take the time to have you take a look at the FreeBSD src tree (user land in this case)
https://github.com/freebsd/fre... [github.com]
What might interest you is the naming of the sub-directories, named for example "cat", "cp", "dd", ... actually the names of your favourite userland tools that I think would also define "Linux" to you.
And when you enter one of such directories you will spot:
1.) make-file
2.) make-dependency-file
3.) a ".1" file, which is the man-page for the tool
4.) a ".c" file
FreeBSD has every userland tool that for Linux is divided into many sub-src "packages" (read the LFS book for a better term) inside this one source tree.
And when you want to learn about the operation of a command, you can just open it up in "vi", even edit it, and just call make / make install inside the corresponding directory, and then you will have your own version of the corresponding tool.
"How does vi-work" - look here:
https://github.com/freebsd/fre... [github.com]
btw. "freebsd-update" for example can also update the src-components, not only the binaries
And when you'd want to take a look at the package manager:
https://github.com/freebsd/fre... [github.com]
which handles everything aside from the kernel and basic userland.
Re: (Score:2)
The advent of LFS was nearly revolutionary, but that is now forgotten.
It was forgotten because it didn't work. You could follow the instructions exactly and the build would break, and if you weren't an expert already then you were definitely not going to be able to fix it.
But I would like to take the time to have you take a look at the FreeBSD src tree
We're talking about Linux, not FreeBSD. There are lots of reasons to run Linux over FreeBSD, and they do not bear rehashing here.
Re: (Score:2)
When I did an LFS system to "success" (could play QuakeWorld/Q2/Q3 online in 3D, watch all the common movie container formats, do Java coding for homework, use my fav mail client and browser, and print to our print server) way back in 1999/2000, most of my issues were with getting the host system properly set up (I used RH 6.0; something in 6.1 and 6.2 fscked it all up when building the first bootable environment).
Of course, once I had it working, keeping up to date with all the needed updates/patches for security...
Isn't that Debian? Sort of? (Score:5, Interesting)
Let's face the facts: apt has won the package wars. rpm is an old-school relic in terms of usage. And for apt, Debian is the single point of truth.
So by and large, I personally basically consider Debian the "official Linux" with Ubuntu and Co. being its "corporate" offspring.
Declaring Debian the "official" Linux would be little more than a formality at this point. That's how I see it anyway.
Re: Isn't that Debian? Sort of? (Score:2)
I was going to say this.
Debian provides a great core.
If you want your own flavour, just add your own apt source.
Also, being free, it is easy to create derivatives; Debian derivatives seem to be quite popular.
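To illustrate how thin that layer can be, a derivative really only needs one extra source entry (the repository URL here is hypothetical) plus its own packages on top of the Debian core:

    # /etc/apt/sources.list.d/myflavour.list   (hypothetical repo)
    deb https://repo.myflavour.example/debian stable main

Run apt update and you are layering on top of stock Debian.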
Re: (Score:3)
Both apt and rpm suck. I run Devuan, but there is one important thing that rpm has that apt doesn't, and that's permissions repair.
SunOS 4's package manager could repair permissions in 1990. It's ludicrous that apt can't do it 33 years later.
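For the record, this is what the rpm side looks like (the package name is just an example); dpkg did gain a --verify option, but it only checks file checksums, not modes:

    rpm -Va                  # report files whose mode/owner/checksum differ from the rpm database
    rpm --setperms httpd     # restore the packaged permissions for one package
    rpm --setugids httpd     # restore the packaged owner and group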
obv troll, 2/10 (Score:2)
I'm sure plenty of /. readers have gone into newsgroups back in the day pretending to be a noob and asked "what's the best distro?". There was always some sucker that would get all wound up answering that question, but even then most people knew it was a troll.
I won't even call it clickbait, because this exact same troll has been going on since long before the clicking started.
The article even primes the reader for an enraged response by saying things like "cover your ears (or your eyes)... the open-source community...
Re: (Score:2)
Ok, yes, but what do you think is the best distro?
Re: (Score:2)
In the late 90s, when dialup and slow connections and such were common, as well as "well, OK, I've installed Linux but how do I set up dialup?" type things, I always recommended "whatever distribution comes with the big fat book you go out and buy".
It wasn't until 1999 (a year after I started working at a college), when our department got a CD burner, that I started downloading distributions instead of buying a book with a CD or two in the back with whatever version/flavor I wanted. Tried checking them out of the library...
Distribution Installer (Score:3, Insightful)
Re: (Score:3)
I think that is a great idea.
yes and the answer is clear - document-based (Score:4, Interesting)
Linux From Scratch should be the official version of GNU/Linux, or something like it. A documented process for putting together a Linux is the right fit for every situation because it is the most adaptable and the most agile, both in adding new features and in moving to new versions of software. With LFS as the standard, that addresses multiple concerns. First, we'd have a distro that we can also go to as our neutral model, while also being community driven. Second, this does imply that vendors must work through an upstreaming process, not only to get features into applications but also for the runtime environment.
Standardizing the runtime environment is helpful for application developers. They don't need one distro that is standard (despite what I suggested above) but instead for all distros in a particular sector to offer a level of compatibility with each other. This was partially achieved with LSB (Linux Standard Base) ISO/IEC 23360 in 2006 and 2021 [wikipedia.org]. It is problematic in some areas; for example, you have to ignore the package management format (a subset of RPM), because essentially nobody outside of the embedded space complies with it in a full stack (distro + apps). Luckily that part isn't important, and there isn't much push to fix it because it's not causing too many hardships. I think LSB mainly tries to nail down too much, but also simple things like building apps to run on an old system are still complicated by how glibc and gcc work, and standardized versions would formalize what toolchain setup entails instead of the current ad hoc process.
App developers that care about binary compatibility go through some pretty painstaking work to put together a toolchain that correctly targets a runtime that is common among all the distros they wish to support. Typically a certain version of Ubuntu and a certain version of RHEL are simultaneous targets, generally going back a little further than their respective EOL dates. For example, Ubuntu 16.04 has been EOL'd for over a year, but I know some vendors still build binaries for it. They will drop them soon, but until then they have held back on the library versions used in their internal toolchain, since they don't use different builds for DEB vs RPM, because who wants to QA a massive matrix of builds against multiple toolchains. This isn't to say .DEB and .RPM don't exist from these third-party developers, but the non-packaged version is the source of truth, and the packaging itself is either done by the distro maintainer if there is some particular partnership, or by the original application developer as an additional release step.
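To make that concrete: the practical ceiling is usually the glibc symbol versions a binary pulls in, which you can inspect directly (the binary name here is hypothetical):

    objdump -T ./myapp | grep -o 'GLIBC_[0-9.]*' | sort -uV | tail -n 1

If that prints, say, GLIBC_2.27, the binary will not load on any distro shipping an older glibc, which is exactly why vendors keep their toolchains pinned to the oldest release they still support.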
In my opinion, the better approach versus what we currently do is for Red Hat, Canonical, etc. to formally cooperate on industry standards, either by reviving some subset of LSB or by forming a new standards body. I think I prefer the former, but this can be complicated by an unwillingness by some to redirect LSB into following new objectives for post-5.0 versions (plus it being nearly dead as a project). Possibly allow continuing LSB 5.0 into a series of minor 5.x updates and a new LSB 6.0 following new mission objectives focused on the areas I mentioned above.
Re: (Score:2)
Linux From Scratch should be the official version of GNU/Linux, or something like it.
Problem: Why aren't more people using Linux?
OrangeTide Solution: Let's adopt the single most difficult and cumbersome way of putting together Linux as the official solution.
Is your post a very big brained shill for Microsoft? Because you couldn't have possibly picked a worse solution to the question postulated in TFS.
Yet another microsoft (Score:3)
Re: (Score:2)
I will take this a step further. If there is an "official desktop Linux", it will be a Microsoft Windows clone. That is because that is what most people want: walled gardens and hand-holding when it comes to tech.
And I am sad to say, Linux is already well on its way to being a Microsoft clone. Between systemd, Wayland, freedesktop.org and WSL, many distros are just one step away from being a Windows/Apple clone. Already we are seeing KDE and GNOME stores; right now they are fairly open, but that...
Need to standardise Linux servers first (Score:3)
Linux's big market share isn't in the desktop, it's in servers and embedded devices, where there's no traditional desktop UI at all. Hence, you have to standardise server/embedded distros first before you even look at desktops.
If you look at the three big server distro families - Debian, Red Hat and SuSE - the biggest difference tends to be the package manager. I don't even care which package manager is used - just make it the same one across all server/embedded distros. Once that's done, the next step is to standardise the installation locations and ABI compatibility across families (yes, LSB exists, but it's not adhered to). The eventual aim is that cross-family repos can be used interchangeably, but with a lot of money involved in server distros, I'm not sure the companies involved would be willing to co-operate to this level (the new SUSE/CIQ/Oracle initiative is a start, but is it enough?). I would also look to implementing BitTorrent for package downloads/updates (next paragraph explains why).
If all of the above comes true, then we have a "pick'n'mix" platform for desktop UIs, where the original desktop UI maintainers (GNOME, KDE, MATE, Cinnamon, XFCE etc.) can build just one binary target per architecture they support and either host it themselves or have some independent central hoster carry it (I'm thinking the Linux Foundation or some equivalent). This is why BitTorrent should be used for package distribution - it cuts down on hosting bandwidth costs. You can get many distro ISOs via official BitTorrent links, so why aren't package downloads/updates using BitTorrent too?
As for distro ISOs, you could have a minimal one that gives the choice of text-only or, if you have a Net connection, downloads a list of pick'n'mix UIs available and lets you optionally choose one for download. A fuller ISO alternative would include one of those pick'n'mix UIs of course. In an ideal world, distros would try to standardise which UI is on that fuller ISO, but that's like herding cats at this point...
Once package management, install locations, ABI compatibility and - toughest of all - the desktop UI have been standardised, then both developers and OEM have a "standard" Linux they can develop and test against. This might happen on servers/embedded in my lifetime, but I'm not sure about the desktop...
Re: (Score:3)
We've probably already got to the point where all the major package managers are "good enough" and a new entrant is unlikely to supplant them. What's more likely to happen is new features (like BitTorrent support, which I'm quite surprised hasn't been implemented yet) will be added to existing implementations rather than a totally new package manager entering the scene and replacing one or more existing implementations.
Having said all that, one course could be a multi-format package manager that takes the...
At the risk of sounding like a douchecanoe (again) (Score:5, Insightful)
Let me put this as genteelly and carefully as I can.
The desire to make an "Official" Linux is a very bad desire. It should always be shouted down.
Please allow me to explain why.
The desire is intrinsically incompatible with the philosophy of open source software, and intrinsically incompatible with the *nix philosophy, of doing one thing, and doing it well. The standardization of *nix software is not at the OS level. It is at the application and interface level. This allows maximal flexibility for configuration and deployment, for specific use-case scenarios, which is precisely when, where, and how Linux deployments excel.
It is this conflict with philosophy and use-case that drives the conflict about systemd, Snap vs Flatpak (vs APT / package managers), and numerous others.
Slashdot is infamous for the dreaded car analogy, so here it is, pre-emptively.
"Should there be a standardized automobile? Imagine if all cars on the highway were exactly the same, with exactly the same parts, and nobody had to look up makes, model, year or manufacture-- and all vehicles came stock with automatic transmission! Getting your vehicle serviced would be so much easier!"
For basically all of the same reasons that the answer for Linux is, and should always be, a resounding NO, the answer to the above is, and should always be, a resounding NO.
There are different kinds of consumer vehicle for a reason, even though all of them (for the most part) are internal combustion engines, and operate on the same basic principles. Some people live out in rural environments, and greatly benefit from having manual transmissions. Having one, requires you to learn how to drive with one, and when and how one should shift gears for maximum utility to gain that advantage on those kinds of road surfaces. Enforcing this kind of ideology basically takes this kind of control away from people that have a legitimate need for it, and makes the world worse, not better.
The same is true in the computing world with bespoke servers that need to have very finely tuned performance parameters, in which small things like the logging system can have small but meaningful performance penalties (and where one might not want to have everything managed all at once by something like systemd, no matter how much the distro wants to force it).
The concept of "OFFICIAL!", implies that those situations that are not adherent to "WHAT WE SAY", are de-facto "Un-Official." This is wrong-headed, and foolish. That's like saying a manual transmission vehicle is "NON-OFFICIAL."
This is then when the "God, I cannot believe how big of a pretentious asshole you are!" rhetoric starts.
Just like there are people that select for manual transmission vehicles, and do so for very specific reasons, which know about, and care about things like spark-advance, or fuel-mix ratios that most other drivers would not care about at all, (as long as the vehicle drives), there are people that profit, make use of, and are reliant on, the ability to take that level of control over a computer for such bespoke server applications.
I refer to the taking away, or obfuscation of such controls, as "Putting mittens on the user," because that is essentially what it is.
There are already operating systems that do this, and do so to a very shameful degree, such as Windows (which goes to great lengths to redact which programs you have associated with which tasks, just because Microsoft REALLY wants you to use Edge and not Chrome or Firefox, etc.) or OSX (which goes to great lengths to prevent you from running it on non-blessed hardware.)
Both of those operating systems exist with the "Official!" mindset, and as a consequence, put the end user's hands inside mittens. (and then enforce the mittens with duct tape.) This makes them "Easier to use" for people that do not want to or have no need of such deep or fine control, but makes life very difficult indeed for the people that DO.
Linux is about the only remaining holdout for...
Sigh... (Score:2)
Didn't we have this argument 25 years ago?
And 20 years ago?
And 15 years ago?
And 10 years ago?
Finally, everyone wised up (or got bored) and stopped talking about it.
ZD is only writing about this since they need to fill "space" on their website.
There are TWO popular Linux standards already (Score:5, Funny)
Flame me, but why would people want Linux ?? (Score:2)
Re: (Score:2)
Flame me, but why would people want to use Linux when compared to Windows and Mac OS? What can it do for the 99% of the general population with no computer skills?
It offers superior security to Windows, with faster fixes for security problems. That alone should be reason enough. The threat level is perpetually rising. Windows is spyware and adware, but also just terrible at security. I wouldn't hesitate to install pretty much any current Linux distribution on a machine exposed to the open internet with no firewall. Would you even consider that with Windows? And if so, can I watch?
Let's create a distribution called Official (Score:2)
GNU/Linux (Score:3, Insightful)
And should there be an "official" version of Linux?
There is. It's published at kernel.org
Oh right. You're one of the people who don't get that "Linux" per se is only the kernel. What is commonly called "Linux" (but in all official documents is always something like "SUSE Linux" or "Debian" or "Red Hat Linux", etc.) is the set of the kernel plus whatever tools some distro wants to add.
You're confused by Windows being a monolithic mess of shit and think all other operating systems must be the same.
You are missing that Linux is strong exactly because it didn't go that route. My Debian servers don't have an X server installed because it's not needed. Unlike all the Windows servers out there that always come with a GUI for no good reason. Linux runs on anything from an ancient iPaq (anyone remember the age of PDAs?) to a supercomputer exactly because it didn't try to make a "one size fits all" attempt.
You don't like it? Put it on the list of other things you don't like, between lemon icecream and Mormons.
Now go away and leave the people actually contributing something alone.
Re: (Score:2)
My Debian servers don't have an X server installed because it's not needed. Unlike all the Windows servers out there that always come with a GUI for no good reason.
There's actually an option to run Windows Server without a GUI: you install it using the installation option to install as Server Core rather than Server with Desktop Experience. But thanks for letting us know that, thanks to your "m1c0$shaft" and "winblow$" Linux loonie mentality, what you know about Windows would fit on a postage stamp with room to spare.
Re: (Score:2)
> You're confused by Windows being a monolithic mess of shit and think all other operating systems must be the same.
>
> You are missing that Linux is strong exactly because it didn't go that route. My Debian servers don't have an X server installed because it's not needed.
Dude, tell us you have no clue without telling us. Windows is pretty much built the same way a typical Linux distribution is: a core set of libraries, a kernel, a GUI server and a bunch of utilities, both GUI and command line...
I don't think the OP understands the problem (Score:2)
Well, first of all, there is an "official" version of Linux, the standard kernel, but that's just the kernel; the userland is different. Here we have different solutions for different problems. An embedded system might for example boot off a RAM disk into a very limited system, while for a system used by a developer you will want to have a writable root file system. Much of the widespread adoption of Linux is because it's so adaptable.
Then again, having an "official version" of an operating system has nothing to...
Linux is the Kernel (Score:2)
What do you see when you walk in the computer shop (Score:2)
Each and every single computer being shown runs Windows, preferably the latest version.
Thing is, Mr. Average is not going to care what his computer runs, as long as he can do what he wants on it.
This usually means:
- Facebook
- Bit of browsing
- Filling out taxes
- Maybe some gaming (no, most people are not interested in the latest AAA titles).
- Perhaps some other things, which are not that demanding.
Now, if we manage to convince the likes of MSI, Acer, and Lenovo to ditch Windows for Linux, the situation will...
There already IS one.... (Score:2)
It's called Debian. ;-)
Distributions choice don't matter much anymore (Score:2)
Things like systemd, freedesktop.org, and Flatpak have unified the Linux distributions already. NetworkManager functions the same way regardless of whether you are using RHEL or Pop OS.
The issue of a software developer freaking out over which Linux distribution to support has been solved by Flatpak (even Snap...
Yeah (Score:2)
Oh yeah, I read that when it came out.
Who knew it was so simple? So simple! We'll solve the diversity problem by just picking one! SMH why didn't I think of that????
Linux Mainline is not good enough. (Score:2)
Re: (Score:3)
You can't use Linux mainline and call it job done, because the problem is the range of combinations of different desktop environments, graphical servers, window managers, sound servers etc. piled on top of it, which make it much more difficult to write software for than Mac OS or Windows. What it needs is a unanimous decision on using Wayland or X11, using GNOME, KDE, Cinnamon or MATE, using aRts or PulseAudio, ALSA or OSS, etc. etc. etc.
Sorry, we aren't going to turn it into Windows, and have no need to either. If you want simplicity, you can use Mint. That's as close as you are going to get to your demand of one and only one software monoculture to rule them all.
And considering the versions and sub-versions of Windows that are still used today, y'all don't have much room to complain about Linux.
Slackware (Score:2)
Again, with the snap and flatpaks... (Score:2)
AppImages are a little better in that they don't completely replace the underlying system's libraries and are self-contained executables. (Much like Windows Portable Apps, you can run them off of a thumb drive...
Hopefully not (Score:2)
Re: SystemD (Score:3)
Re: (Score:2)
Why is your startup sequencer concerned with your IP stack?