Linux Is the Largest Software Development Project On the Planet: Greg K-H (cio.com) 178
sfcrazy writes: Greg Kroah-Hartmant, the Linux superstar, delivered a keynote at CoreOS Fest where he gave some impressive details on just how massive the Linux project is. Kroah-Hartman said the latest release (4.5), made two months ago, contains over 21 million lines of code. More impressive than the amount of code, and what truly makes Linux the world's largest software project, is the fact that last year around 4,000 developers from at least 440 different companies contributed to the kernel. Kroah-Hartman said, "It's the largest software development project ever, in the history of computing -- by the number of people using it, developing it, and now using it, and the number of companies involved. It's a huge number of people."
Superstar? (Score:5, Funny)
Uh, what? There are only two superstars in Linux: Linus and the guy who came up with systemd.
Re: (Score:3, Insightful)
"Greg Kroah-Hartmant, the Linux superstar,"
Uh, what? There are only two superstars in Linux: Linus and the guy who came up with systemd.
You mean there is one superstar, and one cackling super villain. I'll let you figure out which is which.
Re:Superstar? (Score:4, Interesting)
It turned out that the villain was Reiser
Re: (Score:2)
Re: (Score:3)
Who's "Linus"?
Re: (Score:2)
Re: Superstar? (Score:3, Funny)
You're going to Helsinki for that one.
Re: (Score:3, Informative)
You're right, "Greg Kroah-Hartmant" is not a Linux superstar; a person by that name doesn't exist. The correct spelling is "Greg Kroah-Hartman".
Re:Superstar? (Score:5, Funny)
> I downvote all systemd rants. Alas... I ran out already.
Well, you HAD more mod points, but they are in some binary log format, and good luck finding them.
Re:Superstar? (Score:5, Insightful)
Well, you HAD more mod points, but they are in some binary log format, and good luck finding them.
Sigh. I get it, no one likes systemd, blah blah. But believe it or not, as time has moved on, the binary logs have become quite resilient and the format is fairly simple in nature. Likewise, most emergency boot environments now include journalctl. A simple journalctl --file some_log_file will let you browse ad hoc any file you toss at it, so long as it is uncompressed (as an aside, if you had compressed syslog you'd be no better off at this point), even the ones that journalctl --verify says are corrupt. Most of the time, post version 205, the corruption is just that an index was not written. And even if that's not the case, you can force it to display what it can read.
I think the binary argument is a hollow argument at this point. The logs are pretty good at not becoming corrupt and tools are pretty much included now in most recovery tool sets. I see it as no different as say when a PostgreSQL database becomes corrupt. And if you really just hate the idea of binary, you can configure journald to use syslog, and no that doesn't require a recompile.
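For what it's worth, the ad-hoc browsing workflow described above looks something like this (the journal path is a made-up example; only the --file, --verify and --no-pager options are assumed):

```shell
# Browse a copied-out journal file ad hoc (path is a made-up example):
journalctl --file /mnt/rescue/var/log/journal/system.journal --no-pager

# Check that same file for corruption; post-205, "corrupt" usually just
# means a missing index, and journalctl still displays what it can read:
journalctl --verify --file /mnt/rescue/var/log/journal/system.journal
```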
If you want my anti-systemd arguments, they would be that the team that develops it is one of the worst teams to work with, and that the amount of scope creep is frightening. I think we can all agree that those two things pose a bigger headache for systemd than this notion of "oh the files are in binary, you boned brah!" Is systemd an ideal solution? Nope. But as much as everyone tried, upstart and the like of init replacements were going nowhere fast. At some point the haters are going to have to realize that they're still working on systemd, still making changes to make the tools better and the formats more resilient. Does that mean there will be 0 corruption? No. But you do have to realize that the project is still being actively maintained, and they have addressed, or are addressing, many concerns the enterprise customers have had with it. There's real money being tossed at the project to work the things people don't like out of it.
TL;DR - I'm not a huge fan of systemd mostly because of the crazy scope creep, but c'mon the binary argument is so behind us now.
Circular... (Score:4, Funny)
"by the number of people using it, developing it, and now using it" ...I was using it before BUT now, after developing it, I'm *really using it now... :)
wonderful? (Score:2)
At first, I think this must make it more robust to long term changes. It raises a follow up question, though. How do those developers/companies group up by contribution? I'm sure most are working on server/enterprise applications, but any changes there might be equally interesting.
For comparison, I found articles citing 1,000-2,000 developers for Windows 7. I had no luck finding estimates for Windows Server.
Only LUDDITES use Linux. (Score:1)
Modern app appers app apps on Appdows 10, not LUDDITE software on LUDDITE Linux!
Apps!
Re: (Score:2)
Would you like a chair to throw, Ballmer?
21 million ain't all that big... (Score:3, Informative)
This is just the kernel (Score:5, Informative)
The VMS operating system was estimated to contain over 25 million lines of code, and that was measured over 10 years ago - I'm sure it's quite a bit more by now.
This is just the kernel. But most of it is arguably "not" kernel code... it's drivers. This is directly addressed in TFA:
All versions of VMS and OpenVMS together come nowhere near to running on as many different hardware platforms as Linux, so it would be shocking if Linux's drivers weren't massive in comparison.
Re: (Score:3)
Re:21 million ain't all that big... (Score:4, Funny)
Yeah and you'd never know it... SAP being such a lightweight and straightforward product...
There is one project that is larger than Linux (Score:2)
Re: (Score:3)
Taking the numbers at face value (Score:5, Interesting)
Taking the numbers at face value you get the following stats:
- with 4000 developers
- 2.7 lines of code added per day per developer
- 1.3 lines of code removed per day per developer
- 0.47 lines of code changed per day per developer
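A quick sanity check of that arithmetic (a sketch; the aggregate totals below are assumptions back-computed to reproduce the quoted rates, not numbers from the keynote):

```python
# Back-of-the-envelope per-developer rates. The aggregate diff totals are
# hypothetical, chosen only to illustrate how the quoted figures fall out.
DEVELOPERS = 4000
CYCLE_DAYS = 63            # roughly the two months between releases
LINES_ADDED = 680_000      # assumed aggregate for one release cycle
LINES_REMOVED = 328_000    # assumed
LINES_CHANGED = 118_000    # assumed

def per_dev_per_day(total):
    """Spread an aggregate line count evenly over devs and days."""
    return total / DEVELOPERS / CYCLE_DAYS

print(f"added:   {per_dev_per_day(LINES_ADDED):.1f} lines/dev/day")    # ~2.7
print(f"removed: {per_dev_per_day(LINES_REMOVED):.1f} lines/dev/day")  # ~1.3
print(f"changed: {per_dev_per_day(LINES_CHANGED):.2f} lines/dev/day")  # ~0.47
```

The point of the exercise: the per-developer averages look tiny only because the totals are divided over thousands of contributors, most of whom are not full-time.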
Re: (Score:3, Insightful)
that's about right for mature software predominated by small bug fixes, which take a long time to track down and verify
Re: (Score:3)
Taking the numbers at face value you get the following stats:
- with 4000 developers
- 2.7 lines of code added per day per developer
- 1.3 lines of code removed per day per developer
- 0.47 lines of code changed per day per developer
Well he didn't say anything about FTEs. If I had been a little bit quicker on the draw once, I might have had a one-liner patch in the kernel because a -rc1 happened to kernel panic on my particular graphics card because a device descriptor string was missing. By the time I'd figured it out, found the right devel-list and made a patch, the same fix had just been posted and approved for -rc2. I was actually a bit sad I missed it, just for the nerd points. So don't expect all 4000 to be people ordinarily work
Re: (Score:2, Informative)
Averages really don't work for that kind of project. My contribution to the Linux kernel: I found a bug and it was an easy fix, so I submitted a patch. I changed 8 lines of code in a driver. Haven't contributed before, and haven't since. If your name and email address shows up in the kernel change log, you may get an email from Greg asking in what capacity your contribution was made. The data is compiled into reports [linuxfoundation.org] that give more details than the article. For example, about one third of the contributors,
Well, tell that to the Scandinavian Gov. (Score:3)
In fact - BankID, which is the most used login method, just came out and said they'll drop Linux support (which they did) as the userbase was too small.
I also use Telia ID card login - with a card reader, even this is "hackishly" supported (officially unsupported, but I've gotten it to sorta kinda work), so people with Linux can't officially even register at the unemployment offices unless they have windows or a smartphone.
But hey, hopefully we're heading in the right direction - I've been a registered Linux user since 1998, and now exclusively use Linux for my desktop. Every time I go to work and use Windows, I'm constantly reminded of an inferior system with endless updates, endless disk-thrashing and endless limitation as a user.
Re: (Score:2)
Re: (Score:2)
endless limitation as a user.
I'm really curious, care to give some examples that one cannot do on Windows 10 for example?
Re: (Score:2)
Can you configure a NIC into vlan trunk mode and use multiple tagged vlans yet (and completely block untagged vlan)? That's something I've banged my head against the desk over many times with Windows XP and 7. Haven't tried on anything higher yet. I've just given up on using tagged vlans on Windows.
Re: (Score:2)
Re: (Score:2)
No, that only allows you to set a single vlan tag on an interface. Big whoop. In that case, you may as well just set the vlan ID for the switch port you are plugged into. At least that will work with any and every NIC out there (and is completely OS agnostic).
But, that's not what I asked for. I want to know how to assign multiple vlan tags to a single NIC in Windows. I have yet to find a way to do that.
My Linux station at work right now has 3 vlans tagged onto the single physical NIC, allowing my stat
Re: (Score:2)
Re: (Score:2)
Which requires a specific server NIC (and 10 Gbps or faster at that) and a server version of Windows. We're talking about desktop Windows and Linux here.
Basically, you can't do multiple tagged vlans on an interface with desktop Windows, which is something that can be easily done with desktop Linux.
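For the record, the desktop-Linux side of this is a few iproute2 commands (a sketch; interface names, VLAN IDs and the address are made-up examples, and the commands need root):

```shell
# Create three tagged VLAN sub-interfaces on one physical NIC:
ip link add link eth0 name eth0.10 type vlan id 10
ip link add link eth0 name eth0.20 type vlan id 20
ip link add link eth0 name eth0.30 type vlan id 30

# Bring them up:
ip link set eth0.10 up
ip link set eth0.20 up
ip link set eth0.30 up

# Each tagged interface can then carry its own addressing:
ip addr add 192.0.2.2/24 dev eth0.10
```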
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
It died briefly, but got resurrected again, still - I registered back in 1998 when the page was: counter.li.org
Re: (Score:2)
Interesting, you're implying you're one of them? Well - every time you decide to ditch Linux support - it really is a nail in the coffin for the company's future.
Need a better metric (Score:2)
Funny how an entire printout fits in a briefcase (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re:The greatest software project on Earth (Score:5, Informative)
I nearly fell out of my chair when I read your LKML post claiming that the 2.2.x/2.4.x kernels were more stable than they are now. I'm guessing you either weren't involved with Linux back then or have a short memory.
Here is a short history lesson:
The development ended up dragging out to the point where the base kernel features needed for new devices didn't exist... so you could use the stable kernel, which didn't support much; or you could try the development kernel, which tended to be wildly unstable; or you could use the distro kernels (Red Hat etc.), which attempted to patch drivers from the unstable branch into the stable branch, resulting in something almost as unstable as the development branch.
I still have nightmares of the day someone presented me with a brand new IBM server on which the old kernel couldn't see the disks, the dev kernel crashed on boot, and the distro kernel would crash several minutes after starting. (I ended up going with a hand-built custom kernel + patches.)
It doesn't matter how many times you whine on slashdot or troll the kernel list. No one who remembers the old stable/unstable branches wants to go back to that nightmare.
Also FYI: There are several projects that QA the Linux kernel testing branches and report the results back to the kernel devs.
Re: (Score:2)
...Said the troll who has six or seven different versions of MSVC++ Runtime installed alongside one another. And apparently twelve different versions of Not-OpenGL over the last 20 years is perfectly okay.
Re: (Score:2)
And apparently twelve different versions of Not-OpenGL over the last 20 years is perfectly okay.
It's either eleven versions if you count major releases, or 16 if you also count point releases...
Re: (Score:2)
And yet it's eaten everyone's lunch with its informal but successful QA. Just because it doesn't have a manager reading up on the latest fad and implementing it everywhere, badly, doesn't mean it isn't getting the job done.
Re: (Score:2)
Re: (Score:2)
In the case of Linux, you don't have to argue whether unit tests or QA/QC are useless. What matters is simple: what developers want to spend their time contributing. They have historically not done unit tests and the QA/QC is probably as varied as the developer. Linux is a community effort by a self selected community. It's not a corporate driven, profit-seeking entity with a singular management structure.
Re:The greatest software project on Earth (Score:5, Interesting)
This is not entirely true when the software in question is being directly converted into hardware. This happens for VHDL, Verilog, and SystemVerilog. People call these hardware design languages, but the reality is that they are pure software. Other tools (Synopsys Design Compiler, for example) turn this high-level software into low-level netlists which in turn are used to produce silicon. A mistake in the high-level software can cost millions of dollars to fix and spin a new piece of silicon.
The definition of 'unit', however, is not necessarily always agreed upon. What is the 'smallest possible' unit that can be tested? I would argue that it is the smallest possible unit that makes sense to test. For example, a PCI-Express root-port has at least one very well defined interface (the actual PCIe bus), but often also has very well defined internal-interfaces for how the root-port 'unit' plugs into the rest of the chip. Since all interfaces are very well defined, I can write unit level tests to fully exercise and prove that the root-port works correctly, and there is very little need to write full-chip tests which would end up taking days to run vs the hour or two it takes to run against just the unit.
The full-system tests are much harder to develop, as they require writing assembler to get the CPUs to send traffic down to PCIe. Instead, I can directly send write/read commands, which is a much simpler test case to develop. It also gives me finer-grained control of my test and the timings, as I don't have nonsense like cache hits, misses, etc. getting in the way and perturbing my stimulus timings.
The logic portion of developing hardware is almost entirely a software problem these days, with the exception that this software has to meet some real-world constraints of physical setup and hold timings. But software for real-time embedded systems has similar timing requirements that must be met, so it's not a completely foreign concept to software development.
Unit testing is critical to certain sectors of software development, and large corporations spend hundreds of millions of dollars a year doing unit testing because in the long run it saves a ton of money and time.
Re: (Score:3)
Unit testing of software is a great way to prove that a given software routine performs the wrong function with perfection.
Re: (Score:2)
Re:The greatest software project on Earth (Score:4, Insightful)
In electrical engineering circles we call these two aspects of testing 'verification' and 'validation'. Verification is verifying that the code matches the specification. Validation is making sure that the specification is really what we want in the first place.
Fortunately, us real engineers demand complete specifications, and when we find mistakes in those specifications we make sure they get fixed. Doing things in your haphazard land of writing code by the seat of your pants must be stressful at times.
Re: (Score:2, Insightful)
I have never, and will never, work for the government. Real engineers use specifications and test against them.
Re: (Score:3)
Re:The greatest software project on Earth (Score:4, Informative)
I think you confuse "the art of programming" with "the craft of programming." If it were the former, you'd be allowed to put the noses on sideways, like Picasso, and call it "art". But as a craft, you need to produce the best work possible, even though it certainly contains an element of art.
I work on safety-critical software. We are required by The Authorities to do rigorous unit testing on every line of code. And I regularly find bugs in my code in the process. In fact, since I will be developing unit tests anyway, I (try to) make it a practice to develop a module and its test simultaneously, which is more efficient than doing the unit test at the end, as is the common instinct of those who regard unit tests as "useless."
I also think you confuse "useless" with "inefficient." Depending on the use of the software, unit tests may be an inefficient use of development time, as the Linux folks evidently believe. So I don't do unit tests for every category of software; for example, there's no need for a simple utility script. But I wouldn't want to fly in an airplane whose autopilot code was never unit-tested, and whose developers instead simply assumed that all the code they write is bug-free.
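To make the "module and its test developed together" idea concrete, here is a toy sketch (the rate limiter and its behavior are hypothetical examples, not anything from a real avionics codebase):

```python
def clamp_rate(previous, requested, max_delta):
    """Limit how far a commanded value may move in one control step."""
    delta = requested - previous
    if delta > max_delta:
        return previous + max_delta
    if delta < -max_delta:
        return previous - max_delta
    return requested

# The unit test, written alongside the module, exercises all three branches:
assert clamp_rate(100.0, 101.0, 5.0) == 101.0   # within limits: passes through
assert clamp_rate(100.0, 120.0, 5.0) == 105.0   # clamped upward
assert clamp_rate(100.0, 80.0, 5.0) == 95.0     # clamped downward
```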
Re: (Score:2)
Re: (Score:2)
In the safety-critical world I work in, we have to do both unit tests and what we call "system tests," aka black-box tests.
As an analogy, you could test test that a mechanical clock "works" because it tells time accurately, but if you really need to be certain that it will be reliable under all conditions, you're also obligated to individually test all of its gears and other various parts.
I'm glad the people who write code for things like airplanes and medical devices do that, even if the Linux folks don't.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
You know, it's just about the basic things. For example, I've heard that there are driver regressions, with features of certain devices ceasing to work. How hard is it to wire up some computer to a test rig that pulls from mainline each day, builds it, and then tests the resulting kernel on various devices, checking that all features of the devices still work? Then it would be easy to find the bug, as you'd only have to sift through a day's commits.
No, unit testing is no silver bullet, but a type system isn
Re:The greatest software project on Earth (Score:5, Interesting)
That is because unit tests are useless and so is most QA/QC.
In your first sentence you demonstrate your total and complete ignorance of the subject matter at hand.
Let me guess, you think Agile development improves quality too.
No, it improves the productivity of shitty prima donna developers who think they can spend 8 years writing 4 lines of code to get it just right. It allows middle managers to reel in asshole developers who think their shit doesn't stink while actually producing nothing of use to the business.
It's a really shitty development pattern for a good developer, but 99.99999% of developers are shitty, and Agile is far better than free roam for them. Your average run-of-the-mill developer in most software shops is actually less useful and efficient than most janitors.
For every Linus there are 7 billion people who aren't.
By the time you have defined all your "unit tests" we have already built the entire system and have it running in the real world.
Of course mine will work and do what its supposed to do, yours will be buggy, not meet the design requirements, not function as expected in random ways.
You are the definition of ignorant on this matter. You really should just keep your opinion to yourself because the only people who are going to agree with you are going to be the same people in the unemployment line with you.
Software development is an engineering practice when done right, which you clearly have no fucking clue how real engineers work. Fortunately people like you will never be allowed to do anything that actually matters like build bridges or buildings, and your ignorance and arrogance will prevent you from even understanding why you'll never be able to do it.
So congratulations there snarky ... you've shown us you're nothing more than a completely ignorant, smug asshole who has no fucking clue when to shut his stupid mouth :)
Re: (Score:3)
That bit about shitty programmers... that's gold. But only a corporation would have to resort to such methods to rein in developers. In Linux land, if your fu is weak then your code goes nowhere (hopefully, at least).
Re: (Score:3)
Re:The greatest software project on Earth (Score:5, Informative)
The problem is that many managers *think* some magical process can let shitty programmers do great work, and then lay off the great programmers for cheap-o developers, because Agile means quality no matter how bad your workforce is.
Of course this isn't how most of the well-liked software is done, but it's done *all the damn time* in enterprise software land. Leading to a paradox of free and cheap 'consumer' software generally being a *ton* better than equivalent enterprise software packages which generally cost a whole lot more (when equivalents exist).
Re: (Score:3, Insightful)
Re:The greatest software project on Earth (Score:4, Insightful)
I've been writing software for decades
I'm the new RPGLE programmer in an AS/400 shop and I'm surrounded by developers like you. Long story short, the 20+-year-old code base is crap, none of the neckbeards know what the hell is going on in the system, and they blame it on bad data. Everything in the system from a user's perspective is confusing. And after a few years of using the system these folks wrote, our clients scream for SAP or RedPrairie. And these folks keep telling themselves, "They don't know what they're missing! We can roll out changes in days, not months! They'll be sorry!" All the while I sit there and watch our company bleed money because people like you think you can do no wrong, that you're artists, and that the code you write is somehow magical.
I'm not saying unit tests and agile development fix bad programmers, but what I am saying is that the free-roam model of programming leads to a lot of crap and a loss of money. And the self-righteous nature means they'll never understand why our clients go elsewhere and never return. It's just so frustrating to work with people like you, because you all believe you can do no wrong. People want software that works. Sometimes that means unit testing, sometimes it doesn't, but if you just go around yelling "code is art and the users exist only to ruin it!" you're blowing your company's money out the window. Just bloody hell, and you wonder why they keep outsourcing? Well, just look in the mirror one day, there you go. Please never work within a 500 mile radius of Atlanta.
I'll take my troll mod now. For this person it was worth it. And trust me bud, I know that everything I've said has fallen on deaf ears.
Re: (Score:2)
Re: (Score:2)
Re:The greatest software project on Earth (Score:4, Insightful)
You have a way over optimistic view of Agile as generally implemented.
I contribute to a project that has CI set up to run unit tests on every commit. Those unit tests take hours to complete. I have not seen those unit tests catch a single bug (other than style formatting issues, which are non-functional). The human testers have. The problem is that people equate hours of testing with quality, as if it were the sole metric. The quantity of tests does not by itself assure the quality of tests. The other facet of this is the phrase '100% coverage', which I've seen *many* project managers jump up and down about, all excited. I have seen QA teams *fired* at that milestone, because it's a misleading phrase. When management sees '100% automated test coverage' they think 'human QA is no longer needed.' They don't understand that those unit tests are just as likely to have problems, and almost certainly will not be all-encompassing (in fact, unit tests tend to hit the corner cases that the developer *recognized* would be tricky, and had those edge cases in mind as they wrote the code anyway).
My problem is less that unit tests are a waste of time or that they can't be good, it's that too many advocates of the process oversell the benefit and lead to poor decisions with respect to quality based on overconfidence in the process. The conundrum is that in practice, either you oversell it and cause these bad consequences, or you try to describe it accurately and project management will express frustration on 'wasted effort' in writing unit tests since they can't really replace human testing.
Incidentally, the sentiment that the project burdened by unit tests will be beaten to market is a likely consequence. It's not necessarily a *good* thing, but if 'good enough' hits the market before 'truly good', most will ignore 'truly good'.
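The "100% coverage, still buggy" point above can be shown with a contrived sketch (a hypothetical example, not code from the project in question): every line of the function executes under its tests, yet the bug survives.

```python
def leap_year(y):
    """Naive leap-year check. BUG: ignores the 100/400 century rules."""
    if y % 4 == 0:
        return True
    return False

# These two asserts execute every line: 100% *line* coverage.
assert leap_year(2004) is True    # covers the True branch
assert leap_year(2003) is False   # covers the False branch
# Yet leap_year(1900) wrongly returns True. The uncovered thing is a
# *behavior*, not a line, so the coverage metric never sees it.
```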
Re: (Score:2)
Then those are low-signal tests and you should stop running them...
Re: (Score:2)
I'm not disputing that there is something wrong with that scenario. I'm just frustrated that the project *thinks* those tests alleviate need for human testing. This is not an anomaly, in most places I see unit tests prominently on display I see the pattern of making too much out of the unit tests.
Re: (Score:2)
The CI jobs on this thing take hours. I'm not going to run the checks on my system before letting the system take a crack at them. It's not like my push is even a commit to master, it's staged in a code review for humans to have a chance to look it over while the CI tests are running. Why in the world would I delay getting human feedback in code review to make sure I'd pass unit tests first, when the only potential downside would be it catching a bug in public?
Re:The greatest software project on Earth (Score:5, Interesting)
Software development is an engineering practice when done right, which you clearly have no fucking clue how real engineers work. Fortunately people like you will never be allowed to do anything that actually matters like build bridges or buildings
I just had to laugh at this one. You praise Agile but it is everything but the way you'd build a bridge or building. You absolutely don't create a building one room at a time, you have architects and construction engineers and blueprints all ready to go before you start implementing. If you did it the Agile way you'd rewire the house ten times over before you're done. And you don't in the middle of construction find out you want to add another floor and an upstairs bathroom, not unless you want a huge replanning. Agile is more like Star Trek, full power to shields/weapons/engines/life support and things magically rewire themselves to serve the most pressing business need.
To be honest, I feel like waterfall overplans and overthinks things, while Agile underplans and underthinks them. I like waterfall projects that act "agilish": small initial scope and iterative releases, clear deliverables, priorities and rapid prototyping. Or Agile projects that act "waterfallish": someone with a clear long-term vision of what the system will eventually look like and how the components we build will scale and fit the big picture. Bad waterfall is just mental masturbation over plans that'll never work in reality. Bad Agile is just timeboxed cowboy coding, making up shit as you go along. The problem is some project managers think you can make uncertainty go away by sitting around a table and discussing it further. Some things you just won't know until you've tried and seen what kind of progress you actually make.
Re:The greatest software project on Earth (Score:5, Insightful)
Re: (Score:3)
You're making one big mistake, and that's equating software development with construction. It's not. Software has achieved automation of construction of final artifacts based upon detailed specs: it's called compilation. What programmers do is much closer to creating specs than to construction according to specs. People writing recipes are not cooking.
No. In the waterfall model, first you write the spec, then you implement the spec. The developer breaks down the spec to code the same way the compiler breaks down code to machine instructions. Implementing the spec is not supposed to be a creative process; in fact, figuring out everything you're going to do first, so you know how to best implement it and don't go down dead ends and waste time rewriting, refactoring or switching tools/libraries/architecture, is pretty much the cornerstone of waterfall metho
Re: (Score:2)
So, your advice is to put the best guys in straitjackets so you can keep the idiots you hired from poking themselves in the eye with a fork when they go to lunch?
Linus chose to let the best guys do their best work, and if anyone comes back from lunch with a fork in their eye, send them to the doctor and invite them not to return.
Re: (Score:2)
Re:The greatest software project on Earth (Score:4, Insightful)
Of course mine will work and do what its supposed to do, yours will be buggy, not meet the design requirements, not function as expected in random ways.
Exactly true. I've found over many years of engineering that there is a vast difference between something that nominally works and something that works perfectly under all conditions. Once you've achieved the former, you've done perhaps 1/3 or less of the work required to get to the latter.
Re: (Score:3)
Re: (Score:3)
Maybe I'm not as smart as you, since it took me more years to figure it out. Even so, based on our figures, it sounds like it takes you about three times as long to get something working well after the point where it nominally works. That might explain why you noticed this effect almost immediately. ;-)
Re: (Score:2, Insightful)
That is because unit tests are useless and so is most QA/QC. Let me guess, you think Agile development improves quality too. By the time you have defined all your "unit tests" we have already built the entire system and have it running in the real world.
Only if you think nine 5's is acceptable reliability.
Some of us have higher standards than that.
Because when you build systems of systems, it doesn't take many constituent systems built with the attitude that "unit tests are useless and so is most QA/QC" to turn the entire assemblage into a steaming pile of unreliable shit.
But hey, if you like producing steaming piles of unreliable shit, go right ahead.
Just get of your damn high horse - because it's stinking up everything with horse shit.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Without comment on the "merits" of all these arguments, I can make the interesting observation that Linux haters seem just as rabid as Microsoft haters.
All of which, ultimately, is unnecessary. If you hate Linux and love Microsoft, run Windows. If you love Linux and hate Microsoft, run Linux. There's no need to "prove" that one or the other is "better." Run whatever works for you. Heck, run Apple stuff if you want.
The point is: what you do as an individual doesn't affect me much, if at all. And vice-versa.
N
Re: (Score:2)
Re: (Score:2)
The core parts of Windows don't behave like that
As someone who uses a Windows desktop a lot, I can say Windows is no saint when it comes to performance going to shit. My Linux system has not degraded into unbearable performance garbage under my particular workload.
Different platforms are currently weak to different things. For you, Linux had a key weakness for your workload. For me, Windows has some weaknesses. The difference being my ability to actually characterize what the OS is doing under Windows is much reduced.
Re: (Score:2)
I read your link. The part I find surprising is that the author finds it surprising that modifying mmap-ed memory might involve file I/O. Why wouldn't it? So the answer is to NOT safeguard from data loss by letting it hang around uncommitted forever? I can make it WAY faster, I'll just make all calls that write to a file be a NOP, that'll be really fast as long as you don't care about your data.
The solution is to use the APIs correctly. Either put the file in tmpfs or use SYSV shared memory.
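The "writes through a mapping are file I/O" point is easy to demonstrate (a minimal sketch using Python's mmap module against a throwaway temp file):

```python
import mmap
import os
import tempfile

# Create a 4 KiB scratch file to map (path is whatever tempfile gives us).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * 4096)
    path = f.name

with open(path, "r+b") as f:
    mm = mmap.mmap(f.fileno(), 4096)
    mm[:5] = b"hello"   # a plain memory store, but it dirties a file-backed page
    mm.flush()          # msync: force the kernel to write the page back now
    mm.close()

with open(path, "rb") as f:
    data = f.read(5)    # the store through the mapping reached the file
os.unlink(path)
```

The dirtied page has to hit the disk eventually whether you flush or not; mapping a file in tmpfs (or using SysV/POSIX shared memory) is what removes the writeback cost, as the parent says.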
TL;DR: When you
Re: (Score:2)
The problem is that a lot of efforts bearing the banner of 'QA/QC' are not really that good, and they embolden bad decisions. At the core of this is a commonly held belief that 100% coverage with unit tests == quality job done. If you just have unit tests, the system and user experience will inherently be taken care of. This phenomenon has led a lot of justified folks to get pissed at the concept of unit tests altogether, even though the practice isn't really bad in and of itself. Some people find it a
idiots abound on slashdot (Score:1)
My impression is
WRONG
it has many fewer users
a billion android devices will disagree with you
Re: (Score:2)
Android anyone? Not to mention all the embedded variants.
My thinking is that Linux has at least as many users as Windows, if not more.
Re:The greatest software project on Earth (Score:4, Insightful)
I notice you didn't include things like servers, routers, automobiles, industrial machinery, televisions (esp. the "smart" TVs), all the IoT shit (e.g. thermostats), avionics systems, etc.
The embedded stuff outnumbers smartphones and 'PCs' by at least a factor of two... and nearly 50-60% of those run some embedded variant of Linux... embedded Windows ekes out maybe a sliver of that market.
Re: (Score:2)
and nearly 50-60% of those run some embedded variant of Linux
Have some numbers to back up that?
Talking out of my ass, I'd bet that more than 90% of embedded devices produced since '00 have no OS at all, and if you count only the segment of non phone stuff with an OS, more than 50% of that wouldn't run linux but some variant of RTOS.
Re: (Score:2)
I notice you didn't include things like servers, routers
You mean "Linux has its embedded home in small-form-factor PCs, such as home routers and set-top boxes" doesn't include routers?
If we want to bring servers into the mix, can we talk about how there are fewer servers than *people* involved with any given business, and how we're reducing the number of actual server devices by doing combined services in "The Cloud"? I.e. you need 1/10 a server, I need 1/10 a server, they all need 1/10 a server, let's all rent 1 server from some other guy. Meanwhile, at th
Re: (Score:1)
I don't see anyone making that assumption. Can you quote the specific claim that you're reacting to?
Re: (Score:2)
Bigger projects mean more competing requests. Which means more sacrifices to meet all the requests.
Small apps that do what you want are far better than one big one.
Re:Put on some fresh pants already (Score:5, Insightful)
I'd say having a university programming project become one of the biggest operating systems in history, and with the vast number of contributors and collaborators, all for a project that you can freely download, yeah, that's a pretty impressive badge of honor.
If Linus died tomorrow, he'd be viewed as one of the pantheon of great computer innovators, not so much because he produced anything in and of itself unique, but because he transformed the *nix ecosystem and led to greater penetration than I suspect Unix's original creators could ever have imagined.
Re: (Score:3)
Re: (Score:2)
> Kroah-Hartman said the latest release (4.5) made two months ago contains over 21 million lines of code.
Yeah, I found that 21 million lines of code figure strange too.
Microsoft claims [wikipedia.org] Windows XP has ~45 million lines of code.
Re: (Score:2)
Wikipedia isn't a software project. It's a data project. By that measure, it probably isn't even the largest of those, except possibly by contributor numbers (but then you could say that Facebook is the same kind of thing: people contributing to a collection of data).
MediaWiki is the software project behind Wikipedia. And it has nowhere near the same number of contributors, testers, etc. If we stretch the definition so that every possible user of it, etc. comes into the statistics, then you can easily