
Linus Torvalds Receives IEEE Computer Pioneer Award 141

mikejuk (1801200) writes "Linus Torvalds, the 'man who invented Linux,' is the 2014 recipient of the IEEE Computer Society's Computer Pioneer Award, '[f]or pioneering development of the Linux kernel using the open-source approach.' According to Wikipedia, Torvalds had wanted to call the kernel he developed Freax (a combination of 'free,' 'freak,' and the letter X to indicate that it is a Unix-like system), but his friend Ari Lemmke, who administered the FTP server where it was first hosted for download, named Torvalds' directory linux. In some ways Git can be seen as his more important contribution — but as it dates from 2005 it is outside the remit of the IEEE Computer Pioneer Award."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Sunday May 04, 2014 @02:36PM (#46914137)

    Linus Torvalds did not "invent" Linux. He implemented a POSIX kernel, working from basic UNIX standards and preexisting hardware (the 80386 MMU). UNIX was an invention. Linux was "just" an implementation. As it grew, there were various inventions going into it. But Linux "as such" was not an invention.

    In contrast, Torvalds did basically invent Git. Its shape and functionality, as opposed to what Linux started with, were not predetermined.

  • by excelsior_gr ( 969383 ) on Sunday May 04, 2014 @02:57PM (#46914277)

Not to mention that, had Torvalds not developed the Linux kernel, we would still be waiting for the Hurd to take off. One could argue that Linux ties up resources (volunteer coders) that could otherwise be engaged in developing the Hurd, but I simply doubt that developers would follow Stallman the way they follow Torvalds.

  • by Thomasje ( 709120 ) on Sunday May 04, 2014 @03:26PM (#46914459)

    I think you're greatly overstating the importance of Linux there. Not to take away from the great work Linus did and continues to do, but he himself said: "If 386BSD had been available when I started on Linux, Linux would probably never had happened."

    Source: http://gondwanaland.com/meta/h... [gondwanaland.com]

  • by Dutch Gun ( 899105 ) on Sunday May 04, 2014 @04:13PM (#46914793)

    just far less visibility.

The Internet runs on Linux. The number of routers, firewalls/filters, networking devices, and network-connected appliances of all kinds that are Linux-based is staggering. Android is Linux. Every major commercial operating system has either learned from or outright borrowed code from Linux. The supercomputing world is totally pwned by Linux in every way. The practical work of virtually all of science these days relies on Linux.

    Linux is freaking HUGE for our world.

    On the desktop, however, Linux has been neglected, because designing consumer UX is a very different skill from the skillset that most of the OSS developer world brings to bear. It's too bad—when KDE 1.0 was released, it was obvious to anyone looking that Linux was the future of desktop computing—and yet in many ways the Linux desktop is worse than it has ever been from a consumer usability standpoint.

But don't mistake "not visible on desktops at home or at work" for "not relevant."

There's also a matter of sheer inertia in terms of consumer software availability. That's less of a concern with, say, internet infrastructure. Like it or not, DOS captured a large portion of the home and business market early, and Windows leveraged that success and built up a massive amount of inertia among home users. There was a critical period where commercial operating systems, for all their technical shortcomings, were vastly simpler to use than Linux was. I remember experimenting with Linux around '95 or so, and it didn't compare all that favorably to Windows 95. To me, it seemed like it was really only a benefit for people who already knew and were comfortable with Unix and wanted that environment for their PCs.

    Modern Linux desktops are pretty solid (better than Windows 8, certainly), but I'm not certain the real problem is usability. Windows runs nearly all computer games, most business software, and a massive assortment of other commercial products. For people who don't have particular Windows compatibility needs, they can choose the premium Mac hardware/software package, and it provides nearly everything a typical home user would want to start with, and is generally a bit friendlier to use than both Windows and Linux.

That leaves Linux in an uncomfortable position on the desktop, which is unfortunate, because it's come so far and has a lot to offer. It just never got critical mass like DOS/Windows, or had the financial backing of a company like Apple to push it as an alternative OS with its own ecosystem. At this point, for the average user, Linux really has little to offer other than being free and more secure.

An OS's only real purpose in life is to run software. If the software you want to run is only available on Windows, then it's really only a question of whether the price is enough to drive users to another market (assuming no ideological reasons), and for a few hundred dollars spent every five years or so, the answer is pretty obvious. I think the reason for Linux's lackluster desktop adoption is probably as simple as that. And of course, the fact that its already small share is splintered into dozens of distros probably isn't doing its overall adoption any favors, even if it's great for the enthusiasts.

  • by mrvan ( 973822 ) on Sunday May 04, 2014 @04:25PM (#46914879)

    I'll bite :-)

I used cvs and Subversion back in the day, switched to hg, and have now switched to git. I manage a smallish project with 5 or so contributors and contribute to some other projects.

Git/hg vs. cvs/svn is all about distributed vs. centralized. With git/hg, you learn to love branching and merging, and commit as often as needed.
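
    A minimal sketch of that cheap local branch-and-merge workflow (repo, identity, and branch names are all illustrative; --allow-empty stands in for real changes):

```shell
# Throwaway repo to show cheap local branching and merging
cd "$(mktemp -d)"
git init -q .
git config user.email "demo@example.com"   # identity is needed to commit
git config user.name  "Demo"

git commit -q --allow-empty -m "initial commit"

# Branch for a feature and commit as often as needed -- it's all local
git checkout -q -b feature
git commit -q --allow-empty -m "feature: work in progress"
git commit -q --allow-empty -m "feature: done"

# Merge back into the branch we started from
git checkout -q -
git merge -q --no-edit feature
```

    No network round-trips are involved at any step, which is what makes "commit as often as needed" practical in the first place.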

    Git vs hg is more subtle, but I am strongly in the git camp now.

    In my perception, hg et al are about lines of code. You contribute code and the code is checked in. git is all about commits. Your work is in commits, and commits can be rebased, squashed, amended, etc until they are just right to express your contribution. Git is not so much about communicating with yourself about how you got to your code; git is about communicating to the rest of the team what you are contributing. In a sense, you are not (just) writing code, you are writing a commit history.
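
    The commit-polishing described above can be sketched as follows. git rebase -i is the usual interactive tool for squashing, but this sketch uses scriptable equivalents (--amend and a soft reset) so it runs unattended; the repo and commit messages are illustrative:

```shell
# Throwaway repo with a messy history to clean up before sharing
cd "$(mktemp -d)"
git init -q .
git config user.email "demo@example.com"   # identity is needed to commit
git config user.name  "Demo"

git commit -q --allow-empty -m "add parser"
git commit -q --allow-empty -m "wip"
git commit -q --allow-empty -m "fix silly typo"

# Amend: rewrite the most recent commit in place
git commit -q --amend --allow-empty -m "fix typo in parser"

# Squash: fold the last two commits into one polished commit
# (the non-interactive equivalent of 'squash' in 'git rebase -i')
git reset -q --soft HEAD~2
git commit -q --allow-empty -m "add parser error handling"

git log --oneline   # history now shows two clean commits
```

    The end result is a history that expresses the contribution, not the stumbling path taken to get there.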

That said, what I miss in git is the "version history" of commits. I would like to see some sort of "is-based-on" link between the 'final' commit and the commits it was amended, rebased, and/or squashed from. I would love to be able to 'expand' a final commit to see the history that went into it, because right now you are sometimes choosing between commit elegance and keeping track of development history (e.g. in choosing to amend a silly typo you choose elegance; in choosing to merge a branch with --no-ff you choose history).
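
    The history-preserving side of that trade-off can be seen with --no-ff, which forces a merge commit so the branch's individual commits stay visible in the graph instead of being fast-forwarded away (repo and branch names are illustrative):

```shell
# Throwaway repo to show how --no-ff keeps branch history visible
cd "$(mktemp -d)"
git init -q .
git config user.email "demo@example.com"   # identity is needed to commit
git config user.name  "Demo"

git commit -q --allow-empty -m "initial"

# A topic branch with a couple of commits
git checkout -q -b topic
git commit -q --allow-empty -m "topic: step 1"
git commit -q --allow-empty -m "topic: step 2"

# --no-ff forces a merge commit even though a fast-forward was possible
git checkout -q -
git merge -q --no-ff --no-edit topic

git log --oneline --graph   # the topic branch remains a visible bubble
```

    Without --no-ff the merge would fast-forward, leaving a straight line with no trace that "topic" ever existed as a branch.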

  • by Arker ( 91948 ) on Sunday May 04, 2014 @08:21PM (#46915821) Homepage
You're showing SCO at $1295 for 'base,' and that's in the right neighborhood, but you could not actually do anything useful with that. And the other x86 systems? Univel could offer their system for whatever price they wanted; it's an academic concern when your sales closely approximate zero. All of these systems were owned by companies that wanted maximum return on minimum investment, and they were withering away from lack of development even before Linux came along for the coup de grace.

A/UX sounded great, but it does not belong on this list because it did NOT run on x86 hardware. It ran on a narrow subset of the 68k architecture, which was more expensive and much less common; it was never really well supported, and Apple abandoned it completely in '95. I've only seen it running on a computer once in my life.

    "Many of the free and open tools, such as the GNU collection, could run on lots of the commercial releases as well."

    Of course, before Linux that was the only way to run them. But these were not x86 systems that individuals could afford - we are talking about Apollo and Sun and SGI and DEC machines, specialized high performance hardware that was priced accordingly. With few exceptions, people did not own these things - institutions did, and individuals were lucky to get a shell account that would allow them to compile.
  • by Arker ( 91948 ) on Sunday May 04, 2014 @10:17PM (#46916155) Homepage
    "All of those systems were commercially available at the time for the price indicated, so yes there was inexpensive PC Unix out there at the time."

Fine, I can see how you think you are technically correct here, but this was true in name only. Those systems all sucked very badly; they were 'unix' by some definition, but they were not acceptable substitutes for big-iron unix in the way that Linux quickly became.

    "As to "maximum return on minimum investment," why do you think people went after Linux?"

    Everyone wants maximum return on minimum investment, of course, but not everyone takes it to unworkable extremes. The other x86 unix vendors did. They got to call it unix by virtue of paying for a license and being authorised forks of the AT&T code, but never invested the resources necessary to get the whole system ported and working properly. Honestly, even SCO was not a passable substitute for proper Unix, it was so rough and full of holes that every day was an adventure, and the other vendors were even worse.

    "A/UX ran on hardware from what was the major competitor for X86 hardware"

    No, just no. 68k was an entirely different architecture, in a higher price bracket, running entirely different code and competing at a very different tier to the x86 hardware.

    "NextStep was also available for X86 at the time."

    Spoken like someone that never used it.

I had the immense pleasure of working on a cube at about that time, side by side with HP/UX. Both ran on the big iron that we lowly mortals could not afford, and time-shares were precious. Yes, I know there was an x86 port before NeXT went kaput, but how many people actually got a chance to see it run? And just how short was the supported hardware list, hmmm?

    Any of these systems, with some time and resources dedicated to them, could have provided a real unix on x86 experience. But none of them did. Not until Linux.
  • Exactly. (Score:5, Interesting)

    by aussersterne ( 212916 ) on Monday May 05, 2014 @01:10AM (#46916781) Homepage

    You can tell whether or not someone was actually there by whether or not they mention things like "Minix" in a list of viable operating systems.

    I was part of a project at the time that needed real networking and a real Unix development environment. We spent four months working to find an alternative, then shelled out for a series of early Sparc pizza boxes. SS2 boxes maybe? As I recall, we got four at nearly $15k each that ate up a huge chunk of our budget.

    Two years later, we had liquidated them and were doing all of the same stuff on Linux with cheap 486 boxes and commodity hardware, and using the GNU userland and toolchain. People here talk about GNU as predating Linux while forgetting that prior to Linux, the only place to run it was on your freaking Sparcstation (or equivalent—but certainly not under Minix), which already came with a vendor-supported userland. GNU starts to be interesting exactly when Linux becomes viable.

    All in all, the change was bizarrely cool and amazing. We were like kids in a candy store—computing was suddenly so cheap as to almost be free, rather than the single most expensive non-labor cost in a project.
