Interview: a New Linux Year with Jon 'maddog' Hall

Posted by Roblimo
from the onward-and-upward-as-always dept.
Jon "maddog" Hall is the head honcho at Linux International. Monday we asked him questions about the future of Linux, his beer preferences, and other burning issues. Today, as our premier interview guest in the last year of the 20th century, maddog answers with his usual wit and candor.

1) What NEW directions do you see Linux going in?
by coreman

There have been lots of articles on what is the future of the current Linux projects... What do you see as the NEW, non-current directions that Linux will embark on in the near future/next century?


Holy cow!! Linux is already being used in ubiquitous computing, embedded computing, turnkey systems, desktop systems, server systems, and Beowulf clusters. How many more directions do you think it should turn?

True, I have been vocal about not telling people to use Linux as a "highly available" system, due to its current lack of things like a log-based file system, hot-swap hardware, better hardware error detection and other things that a highly available system would need for 366-day-a-year uptime.

And true, many people have noted a lack of NUMA memory capabilities, which would allow processor counts to expand past the cost-versus-performance limits of a single ultra-high-speed memory bus.

And a lot of different countries would LOVE to have true internationalization and localization done, so just by changing a message catalog (or adding to it) an operating system or application could be localized for a particular culture.

And then there is the perennial lack of applications, device drivers, etc to make Linux a truly viable desktop.

All of these are bad news, but the good news is that each of them is being addressed at "Linux speed".

In the past, and as short a time as ten years ago, these were all seen as lacking in the commercial Unix operating system space. I heard these same issues as reasons to use systems such as MPE, MVS and VMS, and that Unix systems "could never" do these things. Now most of these needs can be met through these same commercial Unix systems. The Linux community can now take the best of these solutions and designs for this work and re-implement them to meet the needs of the Linux community.

Since visualization of solutions is the hardest thing, and architecting a solution is the second hardest thing, if we pick the best implementation and re-implement it, we should have solutions to these (and other) issues very quickly.

2) Linux vs. HURD
by Gurlia

Do you think GNU/HURD might one day take over Linux's place? It certainly has a more modern design, although it is currently still in the works. Do you think it's a plausible alternative to Linux when it is ready for general consumption?

Or does Linux have a drive in the Open Source community that HURD doesn't? Linux seems to have generated a lot of enthusiasm, fandom (and zealotry?). Could it be this drive that made Linux so successful, and the lack thereof that has made HURD take such a long time to develop?

(Disclaimer: I am NOT trying to start a flamewar between Linux and HURD supporters.)


I think that if the technical goodness of a kernel was the defining issue, then one of the *BSDs would certainly be where Linux is today.

It is hard to say why Linux caught on, and the *BSDs did not, but I definitely think it has to do with the community spirit that the Linux community has managed to garner.

I have known of the FSF for years. I have fought since 1984 to have free software delivered with Digital's Unix systems. I have tried to donate equipment to the FSF. Unfortunately, for reasons dealing with Richard Stallman's ideals of Free Software, and Digital's inability (in those days) to create a licensing agreement that he could sign, I was limited in what I could do. After a while, it was JUST TOO HARD.

With the Linux community I have not yet run into an issue that was JUST TOO HARD. Most of the time people agree on what has to be done, and their goals and ideals are much the same as mine.

I will also say that I am REALLY GLAD that Richard Stallman is the way he is, and has the ideals he has, because he continues to show me the path that I SHOULD be going, even though I may only get halfway there. Without his lead, I would have no epitome to reach.

Now, will HURD take over from where Linux is when HURD is ready? Perhaps. If it really is a better-performing kernel than Linux. Remember that a "more modern design" does not guarantee better performance, or even easier maintenance.

I have not looked at the HURD's design, but I understand that it is based on a microkernel, and from this you may get the idea that it is a "more modern design". But OSF's code was based on a microkernel, and so was MkLinux and a variety of other systems that have come out. None of these have shown a performance improvement over what can be done with a well-structured monolithic kernel with loadable device drivers and loadable kernel modules.

Since my customers sort of sit around with stopwatches, measuring performance in SPECmarks and SPECrates, TPC numbers, etc., it is hard to give up performance for other issues.

Still, if the HURD comes out, and if it is system-call compatible with Linux, there would be a good case for substituting the HURD kernel for the Linux kernel. I make the one stipulation because I think that ISVs are TIRED of porting applications from one system to another, and really want to see ONE binary interface from the Linux/HURD community for each hardware architecture. This is why I think that one of the most important aspects of the Linux community is the Linux Standard Base (LSB) project.

3) Chasing the taillights?
by Wiktor Kochanowski

Linux, and in general the Open Source development model, has been accused in the past of "chasing the taillights" -- of always catching up to features that other commercial programs have, because they are results of vision rather than of a creeping evolution.

Myself, I think there may be something in this view, when I look e.g. at the emerging UI input methods like voice recognition and pen input/handwriting recognition on the client side, and various goodies on the server side.

Do you agree with this? If so, is Linux condemned to always be a few steps behind the current state of the art of OS design, at least as far as features go?

If not, what examples of vision and features unique to Linux would you provide as examples?


You are talking to someone who has been in the computer field for thirty years, so for the most part all I have seen is "chasing the taillights". Sure, there have been some innovations in networking, but most of the operating systems have taken a lot of their "innovation" from systems such as the Whirlwind, SAIL, Xerox STAR and "re-implemented" them.

[I am sure I pissed off a lot of people with that last statement, but I purposely made it that way to get people to really take a look at what these older systems have done, and to marvel at what they did so long ago.]

I think that a lot of the features of an "operating system" you are talking about (i.e. input/handwriting, voice recognition, etc.) are things that were developed as layered product projects, and "integrated" into a certain operating system by a certain company we all know as part of "THAT COMPANY's innovations" (DON'T GET ME STARTED ON THAT TOPIC).

As the Linux worldwide market grows, I think you will find that more and more of this innovation will happen on Linux, due to both the Open Source concept and the worldwide virtual groups of minds who will work on it. The difference will be that the Open Source model will show where the innovation actually came from, and not where it was bought from.

4) certification
by Zurk

Regarding the recent community Linux certification efforts, etc., can we expect to see LI take part in this? Are we going to get free community certification for Linux? Especially since all PHBs now seem to want certification...



It was members of LI that started the LSB effort, and a lot of our members are very active in the pursuit of this standard.

LI members encouraged both Sair and the LPI in their standards efforts, feeling that (particularly in the early stages) two or more open certification efforts would be useful, since the community would decide (in the end) which of the certification efforts was best. The voting on this would be by how many people signed up for that certification, and which certification was judged best by the hiring managers of the certified people.

In the case of hardware certification, LI has been encouraging an emerging distribution- and hardware-manufacturer-neutral certification group, whose goal is to determine what the steps for certification should be, rather than to set any rules of certification itself.

I do not believe that there will be a "Free Certification", simply because there is a lot of boring, mundane work in marking answer forms, administering certificates, etc., but it can be made as inexpensive as possible by making it open, with openly published standards that have to be met.

5) Linux feature growth
by ajs

As I mentioned in a recent article thread, the Linux kernel is braving new waters in several areas which UNIX has traditionally shunned in the kernel (graphics support, http server, game support for network management, etc). These features raise the eyebrows of many people, but is this the way you see operating system design moving in the future? Are we so bound by the dreaded user-mode context switch that we have to plow every service as deep into the kernel as it will go?

Mind you, I'm all for the khttpd idea as a single example, but it seems like the beginning of a trend that will end up making the original Linux kernel look like a wristwatch driver, and leave a lot of low-end users in a bind....


I still remember the time we placed the X window server inside the Ultrix kernel. This created a few problems. Not the least of them was that a mistake in code that, with a user-space X server, would normally only cause the X server to crash, the person to be forced off the system, and the login prompt to re-appear, would NOW cause the whole system to crash with some type of "kernel panic". We also noted that the X server (which managed its own memory) would grow without bound, using all of the available kernel memory in a few hours under specific graphics loads. With a user-space X server, this was (at least) tolerable, but with a kernel-based X server, it caused the whole system to hang.

All of this was to save a few microseconds of system context switch time and to give better "feel" to the X-server. Then an engineer got almost exactly the same "feel" with the user based X-server by raising the scheduling priority on both the X-server and the window manager.

And I might point out that since that time kernel-based scheduling of lightweight threads has made this type of issue even less of an argument.

My personal belief is that there are certain things an operating system kernel should do, namely schedule resources (including memory and CPU time) among hardware and processes. Everything else should be put out in user space. But the last time I wrote kernel code was twenty-five years ago, for a PDP-8....

[And speaking of time: as I typed this last part of the answer, the clock on my Linux system turned to:

[maddog@localhost maddog]$ date
Sat Jan 1 00:00:01 EST 2000

so you can see that Linux is Y2K compliant]

6) How can you afford development?
by joshkerr

I don't understand how Linux can compete in the upper-end server market, especially against competitors like Microsoft and Sun.

Microsoft is about to release Windows 2000 Datacenter, which will allow up to 64 GB of RAM and 32 processors. How can any one company afford that kind of equipment for the development of Linux?

Do you have any plans to recruit companies like Compaq and Dell so that they are major players in the development efforts of Linux? It seems to me that it would be beneficial to have companies like this helping to direct the future development of Linux in terms of large-scale applications. I realize that these companies are developing drivers and such, but that isn't really what I'm talking about...

Apache running on Linux on a machine with 32 processors and 64 GB of RAM, able to outperform anything MS can throw at it. That is what I'm talking about...


First of all, let me point out that Sun has two major divisions, SunSoft and Sun Microsystems. While SunSoft MAY see Linux as a competitor, Sun Microsystems sees it as another operating system to help it sell the SPARCs and Intel PCs which Sun makes. Even SunSoft can look at Linux as helping to maintain the Unix marketplace, and perhaps re-creating the Unix desktop. This will, in the long run, benefit SunSoft.

As to the other large vendors such as SGI, IBM, HP and Compaq, each of these companies has engineers working on internal projects to help Linux expand to the larger types of hardware platforms. Unfortunately, as you get into these very large systems, there are several basic differences that can occur just in the support of NUMA memory alone. Different internal bus structures and architectures might make it very hard for one kernel to be delivered across all these platforms.

At the last Comdex show, in his keynote speech, Linus acknowledged the fact that these larger systems might have a radically different kernel (or kernels) developed for them, but that the kernel programming interfaces would probably remain the same.

Ergo, when you buy your distribution of Linux on a CD-ROM, for certain machines you may have to get your "boot diskettes" from the people who ship you the machine. Or perhaps certain kernel functions might be included in the boot ROMs that come with the machine, and linked into the Linux kernel as it boots.

I know a lot of you think of Compaq as a "Microsoft Shop", but they also sell about 1 Billion USD of Unixware every year, and support 11 different operating systems on their Intel platforms. As long as Linux helps them sell a significant number of hardware platforms, they will make the investment in supporting Linux on them.

7) How about the software no one wants to write?
by moonboy

How about the software no one wants to write? By this, I mean the software that most programmers would consider "boring", yet is truly essential to the further growth of Linux as a desktop and server OS. It's great that we have so many window managers, office suites, browsers, etc. both existing and coming down the pike, but what about the other stuff that's just not as exciting? The stuff you really have to pay people to write? Maybe third party vendors with paid employees are the answer, but will all of those companies want to make their software truly Open Source?


There are several projects underway which are looking at the funding of "boring" Open Source tasks. Some of them are quite interesting in their approach. One might be to fund scholarships for college students based on how much documentation they write, or have written. Or perhaps making it a co-op assignment. On the other hand, perhaps we have to be more prudent in how we make these "boring" tasks attractive. We all like to listen to the person who wrote the code, but what about listening to the person who wrote the documentation? Perhaps more people would write good documentation, if they were the ones invited to the many conferences and trade events occurring around the world, freeing up the developers to spend more time at home writing code (or even at workshops writing code).

8) Beer?
by Mike Hall

I have had the chance to meet you at several LinuxWorld Expos, USENIX conferences, etc. At each of these events, you were always present at the parties with a large glass of beer.

My question: What is your favorite beer? and why?


I am not a great fan of the darker beers such as porter or stout, although this is not a hard and fast rule. For instance, I do like Guinness draught, particularly when it comes from Temple Bar in Dublin.

As to the lighter beers and ales, I admit to being a beer snob, and I like few of the "national brands". Anchor Steam was my first "micro-brew", and still retains its unique flavor. Pete's Wicked Ale was a long-time favorite, but I feel my tastes have drifted away from it (or vice versa).

There are a lot of micro-brews that I like, and a lot that I really hate. Do not put any fruit in it, or strange spices, or "non-beer" things (such as hot Mexican peppers). And PLEASE don't hand me a beer that you feel has to be improved by sticking a lemon or lime into it. Save it for watering the plants.

9) How did you get the nickname?
by Kamelion

However did you come to get the nickname "maddog"? There must be a pretty interesting story behind that, eh?


(SIGH) I have told this story so many times.....but perhaps this will curtail telling it a few thousand times more....

Once upon a time I taught at a small two-year technical college. The Dean of Instruction was a fine gentleman, but we did not see eye-to-eye on teaching students. Often we would have arguments, and often the arguments would get heated. During these arguments, the entire school would often hear both the Dean's and my opinions on many topics. And sometimes these arguments would get REALLY heated (like the time the Dean hired and fired me four times on the same day). It was during these times that the arguments were too hot for maddogs and Englishmen. The Dean was British....

Finally, my own question and answer:

Q: Why do you like Linux so much? Why do you spend time evangelizing it?

The computer industry has been very good to me over the past thirty years. I have seen computers go from room-sized monsters to things that would fit on your wrist, or at least in a small pouch on your side. Yet I feel that there are still a lot of answers that have to be found and even tougher problems that have to be solved by users.

I am fond of talking about the applications that I have seen running on Linux as I have traveled around the world. People working on understanding how the Universe works in places like Fermilab or Brookhaven National Labs, people working to find new paints, new sources of energy, and other research projects using Beowulf systems. I am interested in seeing people reduce the cost of embedded systems projects by using a well-written operating system that is scalable and free.

And finally I really enjoy seeing people working in the Health-care space, trying to disseminate information that can help cure diseases such as AIDS or cancer.

As I saw Linux spreading over the planet, and being used in places like Sao Paulo, Brazil, or Korea, or China, I knew that the planet earth had to take every chance to find the next great mind in computer science, and that it was less likely to find this mind in a closed-source environment that had all of computer development funneled through Redmond, Washington. We had to have a mechanism for finding the next "Albert Einstein" of computer science, and I see Linux as a magnifying glass, waiting to help us locate that person.

And so to you, Mr. (or Ms.) Einstein, wherever you are... become involved with Open Source projects, and give the world a hand. It desperately needs you.

md - at the beginning of the new century

Monday: Steve Wozniak. Tuesday: two special "surprise" interview guests.

Comments:
  • by Anonymous Coward
    Hear, hear! My thanks as well to maddog, for the same reasons. I met maddog at a NYLUG meeting (October '99, I think). He is one of the nicest people I have ever met, and treated me like I might be a "young Einstein" (which I am not).

    When the evening of fine discussion and drinking of almost as fine beer was over, I realized that although I may not be the world's greatest programmer, I could strive to improve in that area... and I have!

    Thanks, maddog! I hope to run into you again some day.
  • by Anonymous Coward
    Perhaps if you're only serving static pages from that webserver you could saturate a T1 with a single P133, but in the real world the web has become quite a bit more dynamic and database driven. I can think of quite a few very large sites and applications that would benefit from a more scalable, NUMA-based, Linux.
  • by Anonymous Coward
    I can think of quite a few very large sites and applications that would benefit from a more scalable, NUMA-based, Linux.

    Most large sites that require extremely complex dynamic content usually pass off processing like credit card or purchasing transactions to other servers. It is not done on the web server itself.

    There are no dynamic-page generation needs out there that require a "32 processor/64 GB machine", and even if you had the money to buy one for a site, you'd be better off buying dozens of less powerful machines to spread the load over.

    Web serving does not require overly beefy servers. This is why Linux and FreeBSD are pushing Sun out of many server racks - most Sun boxes are overkill. Current rackmount intel boxes with a decent amount of memory are more than sufficient to saturate a line, even with dynamic page generation.

    Remember, you want redundancy in web serving. If you have one pentium rackmount that is too slow, you don't replace it with one beefier box, but supplement it with another pentium rackmount.

  • If the FSF truly believed that software should not have owners, then they would not copyright their software. They should make up their minds one way or the other. Either software should not be owned and all of GNU becomes public domain, or that they keep the GPL/copyrights and merely remove that article.

    The FSF believes that software should not have owners; however, the law says that software has owners, and allows the software to be restricted. Making a work public domain gives up all rights to a work-- it can become part of lawfully restricted code, which (I'm assuming) strict GNU adherents see as an unacceptable abuse. In the FSF's perfect world, the law would not allow any ownership. Until that happens, however, the only way to ensure that software you've written cannot be "owned" is to use your original authorship legal rights to place a covenant on the work, forcing all later authors to adhere to the same openness you created your software under. The GNU General Public License is basically a well-designed hack: It uses the law against itself, preventing copyright from being used for anything except preventing copyright. GNU software doesn't morally have an owner; the GPL, drawn up within the realm of copyright law, just makes sure that anyone attempting to fight software freedom with copyright law will find himself holding the wrong end of the weapon.

    From a strategic standpoint, the GPL is a stopgap measure until the laws become more "enlightened". That is another beauty of the hack: The only practical law that could take the punch out of the GPL would be one that simultaneously made the GPL unnecessary.

    In summary: The GNU philosophy is that software has no owners, and the GPL ensures that those who operate otherwise will fall victim to their own ways.

    On the other hand, maybe I'm completely wrong...

  • Let's make the others follow us through the mine field! The open source/free software community and the (computer) science community already have a lot in common and heavily overlap. We should strengthen this trend and make them blend. This way the front line of research and OSS will be the light that industry follows through the mine field.


  • I did not spring "chasing the taillights" as FUD. I seriously believe it is a potential problem that Linux does not have (and I can't see how it CAN have, considering its open-source nature) at least ONE compelling feature unique to it, with which it enjoys a lead over other OSes. A gimmick to lead the unwashed masses to Linux on the desktop. Reliability might be considered such a feature, but I doubt it's important enough (and it's NOT good enough on the high end, as Jon himself admits). Perhaps portability is in fact another such feature, as no other OS has been demonstrably so moldable to fit any architecture in existence.

    Jon is correct, of course, that a lot of today's progress is just implementing the ideas for which the minefield had been cleared by earlier systems. But I don't think he addresses my main point (and now flame me for spreading FUD again): there's no conscious direction in Linux development; it's rather the attitude "we'll have all the cool stuff that all other OSes have". Technically, it's very sound. But one compelling feature (or program layer, or whatever you religiously believe is the appropriate name for fancy stuff accompanying the kernel) would help in converting the masses, if not the PHBs.
  • ok, so this freaked me out... i have been joking with everyone that upon the beginning of y2k, the age of man would end, and the age of ED would begin (my name being ed) and then I see this...

    i don't know if I know you, or if this is just one of those funny things that makes me laugh yoohoo out my nose, but, i have to give you props on that little chunk of humor ;)
  • > but watch a former MacOS or Windows
    > user try to install an X server(heh).

    You mean an ex win95 user (like me)...

    bash-2.02# apt-get install xbase

    Yeah, that'd be a _real_ trial. :P
  • ummm, moderators too drunk to notice this is flamebait? insightful my ass.
  • He's actually saying that we'll change our minds when there's another chance for the world to end.
  • I would like to sharpen my claws on some of that boring code to be honest. I finally have enough time outside work to begin seriously learning to code on Linux boxes. (As opposed to wading through - shoot me now - Active Server Page code.)
    Digital Wokan, Tribal mage of the electronics age
  • I was blown away. Compaq set up a bunch of Alpha servers for people to test-drive on the Internet, and for the first time one of them was running BSD... so why is Maddog pushing Linux so hard at Digital and ignoring BSD, when if it wasn't for BSD there probably wouldn't still *be* an Alpha market?

    I mean, look at it: NT on the Alpha has always been a joke (let's spend a lot more money to run wintel apps in an emulator... slower and less reliably)... outside of a few maverick products like Lightwave (which started on the Amiga, for heaven's sake) there haven't ever been any significant Alpha NT solutions.

    VMS? Well, there's a solid legacy market there, but they have never shown any interest in growing it.

    No, they've been growing their old Ultrix (hey, that was BSD) and OSF-1/Digital Unix/Tru64 (that's BSD too) market. That's where they've been pushing things.

    So is that it? Maddog could push Linux at DEC and get away with it because it wasn't seen as a threat to Tru64? Well, damn, I wouldn't be proud of that...
  • "Could someone name a comparable product with all of the features as emacs? I doubt it."

    Hah! I have no idea why Richard even continued on with his GNU quest, since emacs is an operating system all to itself. I hear the next version will even do my dishes for me! But features are not quality, and I could name a thousand people who claim that emacs is a pile of dingo droppings!

    Only the Free Software Foundation has the cojones to claim that tcl is evil and lisp morally superior.
  • by serialk (22614)
    not too interesting compared to the other ones :(
  • by Vryl (31994)
    browse in lynx friendly mode and write a post processor to filter out anything not scored -1.

    the text is marked up regularly and systematically so this is theoretically easy.

  • Reiserfs is that one compelling feature that is *totally* unique to Linux.

    No other operating system had the openness needed to do that development, IIRC.

  • ...growing their old Ultrix (hey, that was BSD) and OSF-1/Digital Unix/Tru64 (that's BSD too) market. That's where they've been pushing things.

    As someone who has started working on an Alpha cluster running Tru64, I have to say that, while it may have had its origins in BSD, it is about as close to BSD now as AIX is to either BSD or SV. Tru64 is as much its own *NIX as anything else.

    So is that it? Maddog could push Linux at DEC and get away with it because it wasn't seen as a threat to Tru64? Well, damn, I wouldn't be proud of that...

    With Compaq creating/porting native C and FORTRAN compilers to Alpha Linux, you can bet they see Linux as something to make money on (which is not an evil thing, BTW). Linux doesn't have to be a competitor to Tru64; it is and has been a very good complement to it.


  • So where do you find information about these ancient astronauts?
  • You might be interested in Corel's distribution of Linux. It is a modified Debian distribution and is being designed to be user-friendly enough for someone to switch fairly easily from Windows or Mac to Linux. Their goal (as quoted from one of the guys at Corel I was talking to) is "To make linux easy enough for my mother to use it."
  • In some senses it's not that impressive to get an interview with Maddog. After all, Slashdot is a major site visited by Linux folk, and Maddog is "major Linux folk." That's a "natural" interview.

    An interview with Woz is "more impressive" in that it's something that you might not have expected. There's not as much reason for him to want to be associated with the denizens of Slashdot.

  • A console / terminal text editor that works like the more popular GUI based editors. For an example of what that means look at

    Pico, Joe and Jed are all nice and simple. All of us Unix admins already know vi, and most handle Emacs just fine. However, the people who will supply the large user base that causes hardware manufacturers to see Linux support as a survival issue don't.

    There you have it. Boring work. Unwanted work. Yet still stuff that people want to get done. I don't program, but I can take on some of the even more boring non-code jobs.

    There are large bodies of code that do most of this already. E.g., 'mcedit' is around 90% there, but it's wrapped up in 'mc' and as such runs from a huge binary. (It would be nice if this thing fit on a rescue diskette.)

  • FSF and GNU are about exclusive freedom. Linux is about inclusive freedom; freedom for all developers and users, not just the chosen few.

    This doesn't really make any sense to me. I would say that FSF and GNU are about a vision, while Linux is about being the coolest hack in the world. (Really, what is Linux other than the emacs of operating systems? Totally configurable, each person's instance has so many personal mods as to make it impossible for anyone else to use it, funky key bindings, etc., etc.) The Linux community seems to be trying to be everything to everyone, and filling in the gaping chasm in the information universe that is Microsoft's moral vacuum.

    Linux is a cool hack -- nothing more. GNU is pure freedom -- the kind that's kind of annoying and impractical. Open Source is the glue that binds us all together....

  • "You are against granting users freedom with their software?"

    Their philosophy is about much more than free software. It's also about the moral correctness of copyright violations (it would be wrong to deny your friend...), denying *use* of software to proprietary concerns (why your next library should be GPL), and even proposals for software taxes (GNU Manifesto).
  • I can only imagine what goes on inside the mind of the FSF, so here are two suppositions on what I think they want the "perfect world" to be.

    1) Software is held in common by the government or government-like or appointed trusts. Private ownership of software is not allowed. But RMS has emphatically maintained that he is not communist. So I'll reject this one right off.

    2) Copyright laws are repealed and no one may hold legal monopoly on any information. So what happens here? Will everything be Free Software? Hardly! Nothing would stop closed-source software. "Here's my binary, but I don't have to give my source code." I would suggest that such a world would see a sharp rise in onerous copy-protection schemes and NDAs.

    Software ownership is the key to the GPL. Whether the FSF likes it or not, the only way to reach their goal of a free software world without totalitarian means is to allow software ownership.

    By saying that it's wrong for me to copyright software and put restrictions on it, but that it's okay for them to do the same, the FSF is acting incredibly hypocritical and arrogant.
  • "No. The FSF is about free software. Period."

    If this were true, then why do they go to such pains to disparage and denigrate *ALL* other Free Software licenses?

    "Wrong. I suggest to read the essay on the FSF website entitled "Why Software Should Not Have Owners.""

    If the FSF truly believed that software should not have owners, then they would not copyright their software. They should make up their minds one way or the other: either software should not be owned and all of GNU becomes public domain, or they keep the GPL/copyrights and merely remove that article.

    "Why are you intent on holding the FSF in such a derogatory manner when its purpose is to write software and give it to YOU."

    I am not against any individual belonging to the FSF or GNU Project. However, I am against their philosophy. This is a much different thing.
    ... is precisely why OpenSource (as a pure information stream) works. The biggest issue with learning is not technical skills, but having enough savings to create "leisure time" to spend exploring the hacker culture. I find it interesting that as we evolve from Agrarian, to Industrial, to Service, to Knowledge economies, the age at which one becomes marginally employable increases. Thus while a 12 year old can herd cows, you need quite advanced knowledge (usually only available in unis) to do stuff like bioinformatics. Because of its low cost of acquisition, anyone can learn, and thus avoid the caste effect of private or Ivy League schools. What mankind needs is help identifying the next Einstein, the next Leonardo da Vinci, and (dare I say it) the next Messiah (or appropriate moral/philosophical teacher). Linux won't solve everything, but at least it gives people with an open mind and a will to learn a chance to prove themselves rather than staring at the idiot box all day.


  • interests can actually influence acceptance of an OS purely on country of origin, as long as their language is supported as well as any other.
    I started to use Linux because it worked, not because it was multi-national in origin.
    I would be quite surprised if people didn't use it had it been created in the US.
  • All they care about is hiding, eating, and reproduction (or the failed attempts thereof).
  • There are actually several software houses that I know of that do operate outside of the US.
    The problem comes in getting enough capital and the specific barriers to entry closed. I would more than happily use a piece of software made in Zaire if:
    1. It has various UI things and such written in english.
    2. It works well.
    As it stands now, for various console games and such, there are games that limit an English-speaking audience's ability to play (written in Japanese characters). I don't see a problem with people writing software, or with myself using the software that was made.
  • I think that there would not have been any comparable utilities of better quality than what is out there.
    Could someone name a comparable product with all of the features of emacs? I doubt it.
  • I'd like to take a moment and thank /., its readers/posters, and Jon Hall for taking the time out to post such a wonderful interview. It's inspiring to hear from a leader in the community.

    While looking around for other information on Maddog, I stumbled across this interview @ IBM : developerWorks : Linux overview : Library - papers [] of Jon-da-Maddog Hall, done back in October of last year.

    A quote from the article:
    maddog: Linux is the only operating system in major usage today that started outside the United States. While it is true that a lot of operating systems were worked on multi-nationally, Linux started outside the US. This flavors the way that other countries look at it.


  • For being out there every day in the Linux trenches. Thanks for treating ordinary Linux users as if each one may be the next "Einstein" (I'm not the next Einstein, but when I had the pleasure of meeting you at LinuxWorld, you treated me like I could be). Thanks for being one of the most practical, even-handed and enthusiastic ambassadors that this community has. And thanks for taking the time to brave the /. flames to thoughtfully answer the questions here.

    Besides, you have good taste in beer.

    Happy 2000!

  • by auntfloyd (18527) on Saturday January 01, 2000 @11:19AM (#1421994) Journal
    It's been trendy for years, and you're *just now* getting into it?

    Get with the program. If you're not a troll on Slashdot, you're Nothing. *NOTHING*. People will skip your posts unless they've been moderated down to -1. Who wants to read anything else? If I wanted something informative or insightful, I'd read the original article or interview that the comments are attached to. If I want something funny, I'll watch comedy central or something, not read a few lame attempts at humor by people who find Bill Gates being hit by a pie the funniest thing ever.

    But, out there in the Real World, can I find hot grits being poured down people's pants, or Natalie Portman being turned to stone, or MEEPTing, or people rushing like drunk cheetahs to be the first to say something?

    The answer, simply, is no. Trolling is a true art form, and when it is done well, the result is something much better than reading some geek's take on the latest MS news or something. Trolls reflect people's real opinions; anonymity allows them to express themselves without fear and be totally uninhibited. It is a raw look at humanity in its most basic form. Sometimes it is funny, sometimes boring, and often graphically, disgustingly perverse.

    I wish there were a way that I could read only the -1 posts and skip the trash, but that has yet to happen :(

  • by bkeeler (29897) on Saturday January 01, 2000 @07:51PM (#1421995)
    I always hated the "chasing taillights" analogy. I suggest that whenever someone springs that particular piece of FUD on you, you rebut it as I do, that is by explaining that it's more like following someone else through a mine-field.

  • by Inoshiro (71693) on Saturday January 01, 2000 @01:59PM (#1421996) Homepage
    Microsoft is about to release Windows 2000 datacenter which will allow up to 64gig of ram and 32 processors. How can any one company afford that kind of equipment for the development of Linux?

    I'd just like to say to that poster that the 2.3.x tree supports scaling to many processors; has initial NUMA support; and has 1, 2, 4, and 64 GB memory support.

    Of course, you need a Xeon and the special Intel motherboards that support extended addressing, but I'm sure that if you can afford 64gb of ram, a few extra k$ on a mobo is nothing :-)
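    For anyone wondering what enabling that looks like, here is a rough sketch of the kernel build options involved in the 2.3.x/2.4-era tree (option names from that series; exact spellings varied between snapshots, so treat these as illustrative):

    ```
    # x86 SMP plus PAE-based large-memory support
    CONFIG_SMP=y            # scale across multiple processors
    CONFIG_HIGHMEM64G=y     # up to 64 GB via Intel PAE (36-bit physical addressing)
    CONFIG_X86_PAE=y        # PAE page tables; implied by HIGHMEM64G
    ```

    The CPU and chipset still have to support 36-bit addressing, which is why the Xeon and those special motherboards come into it.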
  • by dsplat (73054) on Saturday January 01, 2000 @05:32PM (#1421997)
    And a lot of different countries would LOVE to have true internationalization and localization done, so just by changing a message catalog (or adding to it) an operating system or application could be localized for a particular culture.

    Jon, thanks for mentioning this. I'm not surprised, since Linux International is hosting the mailing lists for the Free Translation Project [] teams. I wanted to mention that there are several projects going on to try to achieve internationalization of Linux and free software in general. If I have left any out of this list, please speak up.

    There are also pages for internationalization and localization of several projects and distributions (URLs welcome).
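    The message-catalog approach maddog describes is what GNU gettext implements: the program looks up each user-visible string in a per-locale catalog and falls back to the original text when no translation exists, so localizing an app means shipping a new catalog, not new code. A minimal sketch in Python (the domain name "demo" and the locale directory are hypothetical):

    ```python
    import gettext

    # Look for a compiled catalog at locale/de/LC_MESSAGES/demo.mo.
    # fallback=True means: if no catalog is found, return the original
    # (untranslated) strings instead of raising an error.
    t = gettext.translation("demo", localedir="locale",
                            languages=["de"], fallback=True)
    _ = t.gettext

    # With a German catalog installed this prints the translation;
    # without one, it prints the original English string.
    print(_("Hello, world"))
    ```

    The point is that the source code never changes; only the catalog on disk does.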
  • (by the way, I was the AC you are responding to. I couldn't get slashdot to log me.)

    "If this were true, then why do they go to such pains to disparage and denigrate *ALL* other Free Software licenses?"

    They don't. See .html [] for their official position. Note in particular what they say about the modified BSD license, the BSD license without the advertising clause. "It is a simple, permissive non-copyleft free software license with no particular problem."

    "If the FSF truly believed that software should not have owners, then they would not copyright their software. They should make up their minds one way or the other. Either software should not be owned and all of GNU becomes public domain, or that they keep the GPL/copyrights and merely remove that article."

    I think they are talking about a different kind of ownership. With free software, the author no longer has control of his/her software, while conventionally the owner has exclusive control over his property. In this sense, the FSF does not own GNU because they do not control it. This makes a lot of sense. Most people do not think of the FSF owning GNU or of Linus owning Linux, simply because they grant so much freedom by licensing with the GPL.

    "I am not against any individual belonging to the FSF or GNU Project. However, I am against their philosophy. This is a much different thing."

    You are against granting users freedom with their software?

  • by Arandir (19206) on Saturday January 01, 2000 @11:29AM (#1421999) Homepage Journal
    Maddog said that trying to get Compaq to work with the FSF was just too hard. He also said that nothing's too hard with Linux. Why is this? Simple.

    FSF operates under a philosophy that only allows one way of doing things. They (ostensibly) relinquish ownership of their software, but keep an incredibly tight leash upon it. They are obsessed with MIT's "Right Thing".

    Linux, on the other hand, allows any philosophy at all. Linus kept ownership of his software, but shared it with the world. And the world shared with him. Linux is about coding, not preaching.

    FSF and GNU are about exclusive freedom. Linux is about inclusive freedom; freedom for all developers and users, not just the chosen few.

"Stupidity, like virtue, is its own reward" -- William E. Davidsen