Linux and the New Computing Order

Chris Siegler writes "An Op-Ed over at Dr. Dobb's, on whether Linux can change to meet the needs of the mainstream user, and the consequences if we don't." Interesting piece; it talks about the potential for fragmentation, and suggests that all of the big-name companies coming to play are probably the greatest potential cause of it. I'm not incredibly worried about fragmentation, but more about how these changes can alter the community. What do you folks think?
  • These types of things will need to be changed at the core level (I don't mean core as in kernel, just deep-core). No GUI program can stick an input box for LP switches. That's silly. I love printing in Windows. There is nothing to it! These types of things need to be changed completely. But like I said, all control STILL remains because of the openness. :)
  • Why does the Linux community need to organize? Haven't the computer industry and computer business interests yet learned that Linux users and the Linux 'movement' don't necessarily work like 'conventional,' or 'traditional' business models or even share the same interests? I'm getting tired of reading pontifications by editors and journalists who don't seem to understand what Linux really means: freedom from software tyranny! a rebirth of the magic of computing and sharing of ideas! viable software alternatives in the face of bloatware, expensive upgrade treadmills, and usurious (sp?) licensing. Besides, what will happen if we don't organize? Will Linux go away? Do we all want to see Linux take over the software industry? Do we really care about Microsoft?
  • I've noticed that Caldera and the Macmillan distribution of Mandrake are forcing the user to install and use Boot Magic (proprietary closed-source software), even if you want a 100% Linux box. This essentially prevents one from legally installing it on more than one PC. Seems kind of sneaky, and seems to violate what Linux is all about.

    With Boot Magic and the custom install that Caldera uses, how would that affect changing over to Red Hat or some other version if you didn't want to go through the process of completely wiping out your hard drive and starting from scratch?

    Not wanting to get flamed, but I hope someone can answer this with a reasonable explanation. No trolls please.
  • Well, the rate of kernel release isn't as egregious as you make it out to be. Kernels x.n where n is odd (1, 3, 5, 7, 9) are development kernels, and aren't meant for general consumption. Kernels where n is even (0, 2, 4, 6, 8) are the release kernels, which are usually based on development series n-1.

    As usual, someone smack me down if I'm wrong.
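    The even/odd convention described above can be sketched in a couple of lines; a minimal illustration of the numbering rule, not any official tool:

```python
# Sketch of the even/odd minor-version convention for 1990s Linux kernels:
# an odd minor number marks a development series, an even one a stable series.

def kernel_branch(version: str) -> str:
    """Classify a kernel version string like '2.2.10' by its minor number."""
    major, minor, *rest = version.split(".")
    return "development" if int(minor) % 2 else "stable"

print(kernel_branch("2.2.10"))  # 2.2 is an even (stable) series
print(kernel_branch("2.3.12"))  # 2.3 is an odd (development) series
```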

  • This would be a good method. Remember how MSFT killed off other DOS versions by that sneaky "error" message if you didn't use their DOS?

    I strongly encourage this, should MSFT try to pollute/extend Linux.

  • Where are all the applications? In nearly all cases, same place you find Linux applications. FreeBSD has excellent Linux emulation. As for source-based applications, as long as the developer hasn't been sucked into writing Linux-specific trash (the ever popular 'linux' include directory, or /proc) it should compile just fine. Why do rabid Linux users insist on bringing this up all the time?
  • You're absolutely right. The problem is we don't like to think about that; luckily, however, neither does MS. For MS to do that successfully they would need to drop Windows soon so the mainstream would move to Linux, which is taking an awfully big risk. If they wait till the mainstream is already moving to Linux, then people will remember how shafted they were with previous MS products and stay with something else.
  • Except for hackers and tinkerers, hardly anyone spends any great percentage of their computer time installing operating systems, applications, or new hardware. Usability and stability are what count for most users.

    Example: my wife and I recently had a 7-year-old granddaughter staying with us, and to Kionna the only *usability* difference between Windows and Linux was that when she was looking at Java-intensive "kiddie" Web sites and they crashed Netscape on my wife's Windows computer, it took several minutes to reboot, but when Netscape crashed on one of my Linux PCs, it only took a second to restart Netscape.

    Within a week, given a choice, Kionna headed for one of the Linux boxes and ignored our one remaining Windows computer.

    This is not a scientifically valid survey, but if "our" little girl is any indication, Linux is now at least as easy to *use* as Windows for someone who has no prior experience with either OS. :)

    BTW, Kionna likes Gnome better than KDE. While I use KDE as my default desktop, it was no big deal to set Kionna up with her own Gnome-as-default user account. Try *that* in Windows. Not only that, but she had played hell with some settings on my wife's Windows PC by pushing random keys, and had to be told what and what not to touch, while on the Linux PCs she could do anything she wanted without doing any harm.

    New Linux slogan: "Best OS for grandchildren!"
  • This is sort of random and not quite on topic but here we go:

    Why does everybody follow Microsoft's ideology when it comes to the upgrade cycle? For some reason, people seem to think that if a new kernel comes out, they MUST upgrade, and if a new Red Hat comes out, they MUST upgrade, and if a new Debian comes out, they MUST upgrade... but this is not at all true!

    MS likes to push the idea that if they upgrade their OS, everybody should upgrade to that version because it is "necessary". People are taking this thinking and saying, with regard to Linux, things like "what if the C libraries are upgraded and it breaks something? When I upgrade, I'll be screwed!" or "what if a new kernel version doesn't work with my hardware that the previous version worked with? When I upgrade, I'll be screwed!". The thing is... YOU DON'T HAVE TO UPGRADE. If you have a server running firewalling and file sharing and mail and web servers, and it works fine, DON'T UPGRADE IT. EVER. There is NO NEED TO.

    Sorry this has turned into such a rant, but it gets really annoying when people talk about a new version of a piece of software or an OS as if it is mandatory to upgrade to it the moment it is released. If it ain't broke, DON'T FIX IT!! Just because MS wants you to buy another $200 version of their OS, or Adobe wants you to buy another $800 version of Photoshop, doesn't mean you need to do it. If you don't need the features of the 2.2 kernels, and you're running 2.0.37, stick with 2.0.37. You have no need to upgrade.

    Well... end rant for now. Sorry about that! :)
  • Personally, all of my hardware is hand picked to run Linux, but what about the new user who buys a PC, then hears about how much better this "Linux" OS is over Windows, goes out and pays $50 for Red Hat, and finds that he can't get his USB mouse to work, his PNP sound card etc? Should Red Hat put a message on the screen that says:

    "Screw you buddy! You bought hardware to run Windows? So run your Windows and shut up!"?

    Among all criticism directed toward Solaris I never heard anything about its inability to run on HP-PA -- if you want HP-PA, you buy HP-UX, and if you want Solaris, you buy Sparc.

    That's a totally different argument. Solaris doesn't claim to run on HP-PA, but Linux DOES claim to run on Intel PCs

  • One distro.

    With sysinstall as the installer.

    Sane release schedule.

    With random version numbers, and -STABLE and -CURRENT also known as -STALE and -BROKEN.

    Rock solid.

    Especially -RELEASE versions.

    Unpredictable dependencies between everything and everything -- you don't upgrade applications and libraries, you upgrade the system.

    Wild claims about network performance being significantly better than that of Linux (last tested against version 1.2.13).

    A lot of noise about the pain of Linux changes, while FreeBSD just went through an extremely painful change between versions (simultaneous conversion to ELF, OpenBoot, CAM, etc.), not doable with the standard upgrade procedure, at a rate known to Linux users only in 1994.

    No efforts to develop anything desktop-user-oriented that will install out of the box.

    The solution to your problems is right under your nose.

    Don't get me wrong, I use FreeBSD and like it, but most of FreeBSD "advocacy" is as arrogant as most of Linux "advocacy" is immature.

  • Who says that Linux has to do a big U-turn to keep it from having to play second fiddle to Micro$oft? As I see it, it's already on its way:

    KDE, GNOME and GNUstep/WindowMaker are all projects trying to bring Linux into the 'point-and-click' realm.

    The kernel is being worked to support all of the new standards for computer peripherals.

    And distros such as Caldera are trying to eliminate the hassles of installing Linux.

    At some point there will of course be a cutting of corners (I'm afraid that GNUstep will have a hard time making it through to the mainstream, which really is a shame) to make things fit together, but as far as I know, most hackers are smart enough to make this work pretty smoothly.
    And if M$ would release a Linux distro, so fscking what? They would probably do a really good job of giving it a good UI and making it user-friendly for people coming from a Windows environment. But I'd say that they would have a pretty damn hard time switching to Linux, not to mention actually developing their own proprietary version at the same speed as the current kernel. I just don't see that happening.

    Wow, my longest /. post so far :)

    Anders.




  • Actually, RedHat's /etc/rc.d hierarchy thing is a standard SysV way of doing configuration scripts, unlike Slackware's BSDish way (which basically amounts to little more than autoexec.bat).

    FreeBSD has also moved away from this 'BSDish' way of doing things - as a system administrator I find it makes it a lot easier to customise the startup procedure on a machine, and move startup scripts between multiple machines for common packages.

    I know it's been in FreeBSD release versions for at least 6 months, though I think it's quite a bit longer.

    Felius


    --
    make clean; make love --without-war
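    The SysV-style ordering being discussed here can be sketched briefly: scripts in a runlevel directory are run in lexical order, K##name to stop services, then S##name to start them. The filenames below are made up for illustration:

```python
# Illustrative sketch of SysV-style rc.d ordering (not any distribution's
# actual init implementation): kill scripts run first, then start scripts,
# each in lexical (number) order.

def rc_actions(entries):
    """Return (action, service) pairs in the order init would run them."""
    kills = sorted(e for e in entries if e.startswith("K"))
    starts = sorted(e for e in entries if e.startswith("S"))
    # strip the leading letter and two-digit sequence number, e.g. "S10network"
    return [("stop" if e[0] == "K" else "start", e[3:]) for e in kills + starts]

for action, svc in rc_actions(["S10network", "S85httpd", "K20nfs", "S30syslog"]):
    print(action, svc)
```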
  • Well, I for one don't miss having to modify my /etc/rc.d files myself and worry about breaking things. Granted, Debian packages tend to be much better-behaved, and I only use official packages (which AFAIK have to fit *all* of the packaging standards, as well as come from a trusted, signed source) so I've yet to be burned by a rogue package inserting a bad startup script. Ahh, peer review...
    ---
    "'Is not a quine' is not a quine" is a quine.
  • I think you're absolutely right about what the average computer user wants. I don't know if Linux CAN deliver the experience that they desire, though I see no reason why not - it's really just a matter of packaging and pretty (software) wrappers.

    Where I really can't agree with you is the suggestion that low-(computer)knowledge users get the experience that they desire from MS (gee, I've never noticed before that their initials suggest a nasty disease :-) products. For instance, my wife does just about all her work on her computer at home. She is a pure user, without a trace of geek (though maybe a little wannabe). Mostly she uses FileMaker Pro to keep track of her client's business for him, but she also does some web page creation/editing using FrontPage. Before her recent upgrade to FP 2000, she experienced continual flaky behaviour, forcing her to repeatedly reload FP. Just before the upgrade, the only way she could open a page to edit was by first opening a page she hadn't touched in several months and then opening the one she wanted to edit. Things were looking grim, as she was running out of old, untouched pages. Even now, when she alters a gif, she has to go through a ritual dance of saving the gif, reloading and refreshing pages, closing and re-opening applications to see whether a simple change has had the desired effect.

    I assure you she does not enjoy the windows "experience" and totally fails to understand why software companies think they can release non-working programs then charge a fortune for "upgrades" with "features" that she has no use for and that rarely work any better than the program that she wants to use, if only it would work.

    Microsoft knows what the average user wants and slickly promises him/her just that, but they *don't* deliver, which is why the chairman of GM can make pointed jokes about MS software reliability. Lack of performance, whether speed or reliable behaviour, is also a significant nuisance to an ordinary user and if they don't wise up soon, this is what will really collapse the MS house of cards.
  • I think you may have simplified things too much. I've been told that in large college freshman computing classes that have PC (Windows) and Mac sections, the students in the PC sections come out understanding the inner mechanics of the OS, while the students in the Mac sections tend not to. (This data is about three years old.)

    I take this to mean that PC's still suffer from the "command"ism that you are accusing Unixes of, and that the two are more similar than what you are indicating.

    Since Windows labors more under the burden of "backward compatibility" than the numerous Unix windows systems, it seems reasonable to wonder if one of those latter systems will both pass up Windows in functionality for non-experts AND win the global popularity lottery.
  • by deity ( 8806 ) on Wednesday August 04, 1999 @05:34AM (#1766740)

    On the whole, I liked this article. I just have a few things to comment on.

    What if the Linux community continues to do things like change the C/C++ libraries and compilers, as they did recently, which triggered so many compatibility problems?

    Now, I don't consider myself an elite hacker or anything, (I do write and compile a lot of software though) but I haven't noticed any compatibility problems on either of my machines. But maybe that's because, like most of the world, I don't try to be bleeding edge. Neither will the mainstream. Companies like Red Hat are terribly concerned with compatibility issues, and because their money is on the line, they make sure that compatibility will be a matter of upgrading x number of packages. The cooperative and open nature of the open source community makes the turnaround time on fixing broken stuff much shorter, we all know that. But why? Because if someone with a large stake in Linux needs compatibility with new libraries or compilers, they can easily

    1. fix it themselves
    2. use their resources to aid the developers in charge of the project(s) in bringing it up to date
    History seems to show that Linux companies usually choose the second option, since that is the fastest and most efficient way to go.
    What if someone decides to do something you hate with your program, such as make changes that preserve compatibility with prior versions and break compatibility with your latest release, and then spend millions of dollars to promote their version?

    This is the new Coke dilemma. Sorry, I just had to say that. Seriously though, this does happen. It's the problem best exemplified by EGCS. EGCS was a split from the main development track of GCC, but eventually it proved to be more promising than its predecessor and it was brought back into the fold. Our community is better for it.

    Like our mythical political party, the community has to learn to compromise and be more understanding and accommodating of the mainstream.

    To a point. The Linux community, through projects like GNOME and KDE, has made the important changes that the mainstream will want. But as a partisan, I have to say this: Let's not forget our Free Software roots. It's terribly important that we focus on the ideals that got us this far, like unabashed source availability and respect for the owners of open source projects (go here [tuxedo.org] if any of this is new to you). These are the ideas that have protected and nurtured the movement this far, and they will continue to do so in the future.

    -k

  • by Anonymous Coward
    The number of developers and potential developers on this platform is increasing geometrically. Also, the preexisting code bases and documentation are going up at a similar rate. If you follow Freshmeat [freshmeat.net] you will see what effect this is having on the rate of change for the entire system. Perhaps some components have stabilized enough that no one is working on them any more, but there will always be plenty of new projects to work on.

    Having the source code available pretty much ensures that anyone can do anything they want to the system. Some people will want to slow everything down because they can't keep up with the rate of change to the system. However, they will be unable to remain competitive with the fast moving systems over time. Some others will try to add in non-libre components to the free systems in order to control them, but most of us know better and will route around that kind of damage.

    By far the worst threats are software patents and other forms of legal monopoly that artificially prevent free systems from competing. However, even those can be routed around given enough time and effort (and hacking of the legal system). bladeenc and the MP3 patents are a good example of how we might go around such measures.

    It's also important to keep in mind that we have the source code. For the same reasons that the linux apocalypse [lameter.com] could happen, we could one day find ourselves all switching to the HURD [gnu.org], or one of the BSD [bsd.org] derivatives, or some competitive new kernel that runs well on hardware that doesn't even exist yet. The only pieces of the free systems that won't survive the transition are the ones without source code and the hardware-dependent pieces.

    Hi Mom!

  • This is where someone like MS could make a killing in the Linux market. Sell their GUI and their Apps and whatever with a Linux kernel. Even though they would not have to release source for their software, the fact would remain that they are not the ones controlling the kernel so essentially the Microsoft monopoly is rendered useless. Anyone could download or buy a copy of Linux from anyone they want to. For the first time in many years MS products would have to stand on their own merits, as even MS Linux would run other people's apps.

    A lot of people think the success of Linux means the doom of Microsoft. Not likely. It just means they'll have to play fair again.


    -Rich
  • Interesting article. I'm sure none of Linux's current 'kindly uncles' would carry out the kind of 'embrace and extend' operation being anticipated, since they are mostly firms that are either obliged to act in their customers' interests because they are very small (Corel), or firms that have committed relationships with a customer base they really don't want to annoy (IBM). Most radical changes made within the community are in users' better interests, and that forms a basis on which companies and the developers of a project could handle them. What is more, most companies involved are trying to leverage Linux's momentum to restore or add to their own. That means they have no interest in encouraging or supporting fragmentation.

    The thing the article raises that is interesting is the question of what would happen if an actively hostile company, one whose operating systems are competing with Linux, were to produce an incompatible version and try to get people to use that. That could be quite destructive, but I think it could be dealt with. The GPL forces anyone distributing software derived from a version of Linux to publish their source code. That published source code would form a basis for the Linux community to work to restore compatibility.
  • I fundamentally agree with you.

    Linux is an OS for people who want an open source platform that is fully customizable. Microsoft, on the other hand, supports a crowd that cares more about ease of use than the internals of the OS. I want complete control over what I install and how my system is organized. I've paid $2K for my system; I deserve that much.

    I have no problem with novices using Linux, I think it is a great learning experience. As long as those novices become experts sooner or later.

    Linux IMHO is an expert system and should be treated as such.

    Inherently there are more novices than there are experts, but the Linux community should not have to bend over backwards for novices. That is Microsoft's job.
    ---------------------------
    ^_^ smile death approaches.
  • ... I'm afraid.

    As more and more organizations come to depend upon the platform they will be less and less willing to accept churn such as the libc cutover.

    Someone will have to concentrate upon bugfixing, consolidation, standards compliance, documentation, automated regression testing, etc, etc. This isn't glamorous but there's a market need and it must happen.

    So the fragmentation will be in versioning: people who rely upon the platform to support mission critical services and commercial software will be running two or three year old distributions, while the hackers will be running bleeding-edge stuff on their desktops.

    This inertia will upset the kernel developers and kernel development will become less glamorous. The focus of innovation will move even further toward end-user applications.

    This is all good.
  • And HOW, pray tell, are you going to do this, when you distribute the source code? You can't prevent people from modifying your GPL'd source code. Haven't you ever read the GPL?
  • I'm sure I'll get flamed to hell on this. But this is what I think.

    I don't think that Linux CAN meet the needs of mainstream users. Most end users, when they get their PC, don't want to have to learn commands or figure out shortcuts. They want to take it out of the box, plug it in, and have it run. This is why Windows has become the high-focus OS that it is today: you don't have to learn commands. You just point, click, and the application runs; you don't have to compile applications. I use UNIX in my daily life, on my home computers and at my place of work. But my facility with UNIX didn't come in one day; with Windows you can use it from day one, no extra commands to learn, nothing to do really. UNIX is clearly the more stable OS, and for a server nothing is better. But you actually have to work to learn UNIX. It's really not hard, but you have to learn multiple commands and figure out HOW to use things at first if you don't know. Sure, there are man pages and HOWTOs, but end users with a new PC don't want to read all that; they want it to work right then, with no waiting to figure out how to use it.

    I'm sure you script kiddies out there will be all over me for this, but I hope the users of Slashdot with a brain can figure out where I'm coming from. I will probably get stuff like "WINDOWS SUCKS!! USE LINUX!! YOUR OS SUCKS!!". But come on guys, read what I had to say, and think about it for a minute.

    That's just my $.02.
  • To change the Linux system would require loss of the tight control and configurability we expect from Linux. Just as Windows uses arrays of dialog boxes to configure everything, Linux would need the same. This is not to say that it's a bad thing, though!

    Linux still has better task handling, a much sweeter file system, and most of all is open, so that the lost level of control is actually still there. It's just a bit harder to get to.

  • I think that putting a user-friendly shell on Linux (such as Win95 over DOS) is a good idea. Here's the difference between how to do it in Linux, and how to do it in DOS:

    In Linux, we have a user-friendly front end, but we don't require it.

    Linux needs two front ends. One is the simplistic interface for someone who just wants to check their email and write memos. The other one is the current X-based "experts only" mode.

    So long as the OS under the hood is rock-solid, the end user doesn't care. Having that second interface gives you more control (thus more power) at the cost of needing more mental effort. Drop that complex interface, and you lose the hackers. But here, you can have your cake and eat it, too.

  • Games. In order to have games on a platform, you need to have a lot of people using that platform. Look at the iMac. Only the most popular games get ported there because there aren't enough users for the risk to be worthwhile otherwise. Until Command & Conquer 2 is released for Linux, I will have to continue to dual-boot. Half a drive for the real OS, and half a drive for the toy OS.


    Using Microsoft software is like having unprotected sex.

  • You are absolutely correct. Being a Unix clone, it will never capture the majority of mainstream users. It is Unix, designed and optimized for programmers, not users.
  • Just a thought that has been rolling around in my mind . . .

    Let's say Linux has kicked the stuffings out of Microsoft, & it's now sold preinstalled everywhere -- over the Internet, by phone, & at Sears & CompUSA -- & to anyone.

    And you are a phone support tech who has to support people who start calls with, ``I just bought my first computer, & I can't get it to work."

    If that thought does not strike fear into you, then you have never done phone support. We are talking about the truly clueless here: the folks genetically unable to tell their forward slash from their backslash, who can't be bothered to turn off their caps lock key when they type in their passwords, people who have to be led by the hand to find icons on their desktop -- in short, the folks who shouldn't have root on their very own systems. People too dumb to figure out how to use a Macintosh.

    ``But computers ought to be easy to use!" they always whine. ``Why do I need to read a book to use my computer? All I want to do is [ email XXX | write my term paper | play solitaire | look at the pictures on the Internet ]."

    Unfortunately we are going to have to support these morons if we want Linux to make it into the mainstream. Unless we can convince them that if you have to be trained & licensed to drive a car, you should at least read a book or two before you boot up a computer & proceed to trash your system.

    Yes, I did my time in Tech Support. Almost two years. Why do you ask?


    Geoff
  • Have you tried any recent distributions? Have you tried KDE or GNOME? KDE is at the point where it is as easy to use as Windows - and GNOME is getting there. With KDE, everything is point, click, and drool - couldn't be simpler.

    When KDE 2.0 with KOffice comes out, that may very well be a killer app - KOffice already looks pretty good, and it's still in the alpha stage.

    It's not as tough to use as you think - my brother, who doesn't want to learn ANYTHING, is getting along just fine with WindowMaker, GIMP, and AbiWord.

    So unless you've actually had a newbie to linux TRY KDE or GNOME, don't knock 'em.
  • The point is that we, as a community, don't need more users; we need more developers. Linux users should aspire to become developers, and existing developers should aspire to use their skills on Linux. If we slowly give that up and let corporations handle it for us, then we're also giving up control of Linux, GPL notwithstanding.

    Linux developers won't give up. Developers are also users who would probably want to use the free/Open Source apps that fellow developers use. And as long as Free/Open Source developers develop an excellent product at no or little cost, then corporations contributing to that market would have to make a knockout app to compete, or GPL it.

  • A better file system than NT? Are you sure you know what you're talking about?
  • If NetBSD, OpenBSD, and FreeBSD all used the same kernel, this statement would make sense. But they don't. Each of the BSD's has their own goal, and most development is done towards the furtherance of those goals. NetBSD has portability, OpenBSD has security, and FreeBSD has stability/performance (at the expense of portability).


  • I think upgrade inertia will help mitigate some of the compatibility problems. I'm working for a fairly large gov't department (5000+ employees) and we just moved up to Win95 on the desktop last year.

    Companies won't want to upgrade every few months, so they will always lag behind and that will give the bleeding-edge stuff time to mature and stabilize. My guess is that things will fall out so that the bigger companies SGI, IBM,... will end up stuck with the 'standards compliance, documentation, etc' grunt work and the hackers will keep getting to do the cool stuff.

    And we're already used to Win-style backward compatibility, so Linux probably couldn't do any worse.

    Maybe we'll see Linux Certified Systems Engineers appearing :)

  • You left out WinCE :)

    I've written a lot of code for Windows, though, and I think you make a good point with regard to fragmentation. Generally, the APIs are the same and 9 times out of 10, you can just call the Win32 function without worrying about it.

    The bite comes on that 10th time. For example, the Windows Remote Access Service API (RAS) has significant differences between:

    Windows 95
    Windows 98 (and Windows 95 OEM Service Release 2)
    Windows NT

    It's a little tense when you have to explain to an executive why your nifty RAS program does stuff on your NT workstation that it can't do on your customers' Win95 machines. "It's all Windows, isn't it?" No, it's not.

    FWIW, I also recall one case where I used some Microsoft example code that obtained a handle to a device context. The example declared the value as an int and everything ran fine on the Win95 machines. It crashed horribly on WinNT, though. A little debugging revealed that the value on WinNT was greater than an int could hold. I declared it long and everything worked ok. What peeved me was that this little quirk was not mentioned anywhere in the documentation. As far as the documentation was concerned, the function calls were the same between Win95 and WinNT.

    Other people who program Windows will justifiably call me a wimp for whining about having to get the OS version sometimes. That's fair. I also shouldn't be too hard on MS for errors in documentation that is, most of the time, pretty helpful. Finally, a couple of bad experiences from one developer is not grounds for condemning an entire operating system.

    But I think it's also important to point out that the Win32 API is not the same everywhere. If having to check my OS version before a function call is not "fragmentation", then what is?

    (Sorry for the length of this post)
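    The handle-size quirk described in the post above can be reproduced in miniature. Python's struct module is used here only as a stand-in for the C types involved: a value that fits in a wider machine word simply cannot be stored in a 32-bit int.

```python
import struct

# A 32-bit signed int holds at most 2**31 - 1 (0x7FFFFFFF). A handle value
# returned by the OS may exceed that, as in the Win95-vs-NT story above.
def fits_in_int32(handle_value: int) -> bool:
    try:
        struct.pack("<i", handle_value)  # try to store it as a signed 32-bit int
        return True
    except struct.error:
        return False

print(fits_in_int32(0x7FFFFFFF))  # largest value a signed 32-bit int can hold
print(fits_in_int32(0x80001234))  # too big: needs a wider type, like the long did
```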
  • It doesn't even HAVE to change - it just needs to be added to. For instance: take KDE with KFM. Power users will probably still use mv, cp, rm, etc., but GUI users can just point 'n click to get to their files.

    It should be the same way with configuration. Have GUI programs to configure the most common options, but if someone needs to do something fancier, they can dig into the actual text configuration files. That way you get the best of both worlds.
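    The layered approach above can be sketched in a few lines; the file format and function name here are hypothetical, purely for illustration. The point is that a GUI front end can rewrite one setting while leaving the user's hand edits and comments untouched:

```python
# Sketch of a GUI back end for a simple key=value text config file
# (hypothetical format): rewrite only the requested key, preserve
# comments and unrecognized lines exactly as the user wrote them.

def set_option(lines, key, value):
    out, found = [], False
    for line in lines:
        if line.strip().startswith(key + "="):
            out.append(key + "=" + value)  # rewrite just this setting
            found = True
        else:
            out.append(line)               # keep comments/hand edits verbatim
    if not found:
        out.append(key + "=" + value)      # add the setting if it was absent
    return out

config = ["# hand-written comment", "resolution=640x480", "depth=8"]
print("\n".join(set_option(config, "resolution", "1024x768")))
```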
  • Right on!
    ---------------------------
    ^_^ smile death approaches.
  • ...will not come to pass until it is conventional wisdom that Linux will inevitably win. For MS to release their own version of Linux would be perceived as their conceding the ultimate victory of Linux over Windows. They will not participate in the destruction of Windows -- until there is nothing left to save.
  • >Just as Windows uses arrays of Dialog Boxes to configure everything, Linux would need the same. This is not to say that is a bad thing though!

    Maybe, maybe not. I had an ugly experience last night with the dreaded "Windows Dialog Boxes". My wife uses Windows (what can I say? I've tried) and I have upgraded her machine by building a new AMD K6-2 box and then moving everything over from her Pentium. I used drive-copy software to clone her hard drive so I could set up the new box with a new drive while she still had her machine running.

    So everything was going great until I was ready to move the NIC. ARGH. Plug and Play and install dialogs are broken horribly. God I just wanted to find the config file and massage it by hand to get the bugger installed. I had to go through all kinds of gyrations to convince windows to take the NIC.

    If Linux had dialog boxes and still had the underlying open configuration mechanisms, that would obviously be fine. It just makes me shudder to think of Linux having ANYTHING like Windows' dialog-box interfaces!

    Bleah!
  • I think you are incorrect. The Linux community is not a homogeneous mass of script kiddies; the young and excitable just tend to be more vocal. The issues of point-and-click installation and UI are being addressed (as we dicker ;) by some people with a very good handle on the problem. KDE and GNOME are both making great strides on the UI front, and X is infinitely more flexible than the WINwhatevers. The initial install (hardware detection, installation, and drivers) is the real canker that gnaws, and this too seems to be a major focus of most major distributions. I believe that Linux can be a very viable home desktop, especially if we address life as an embedded OS in non-traditional devices (set-tops and integrated net appliances) over high-speed access media (ADSL, cable, et al.) now and not later. Still, the best market for Linux right now is as a server OS, where its advantages are legion and its shortcomings should be short-lived.
  • USB - being addressed, will be available in the 2.4 kernel. PnP - also being addressed

    PnP has been "being addressed" for four years now.

    Parallel port scanners - I believe some are supported; if yours isn't, write a driver.

    Oooh I was waiting for this! I work full time and have a family-- I don't have the time to write and maintain a driver. My wife gets upset when I spend too much time using the computer as it is.

    Anyway, is that the answer we want to give the newbie who barely knows how to install Linux and has probably never coded? Write a driver?

  • I deal with mainstream computer users every day as a network administrator. Many of them have a fear of their computer; anything outside their limited experience is, to them, impossible. Getting them to learn a new program is extremely difficult. Some "experts" say that trying to get users like them to use Linux will be next to impossible: Linux is too complicated for the average user. Is Windows 95/98/NT any less complicated, I ask? Even Windows, which is pretty dumbed down for the average user, still mystifies a novice. It randomly stops working, a program stops responding; this happens far too often for the novice user. More advanced users know enough to reboot the machine when it locks, but a novice doesn't understand why it stopped functioning, and this just increases the user's fear of computer technology. To them, it shouldn't break! A 1969 Jaguar E-type is more reliable than Windows! It amazes me that so many people use Windows after all the frustration of the inconvenient malfunctions every user experiences. This is for three main reasons: they aren't aware of the alternatives, they *think* the alternatives are too complicated, or everyone else they know uses Windows so they *think* they have to too.

    Our entry into the mainstream market will begin with someone (like Corel) providing a distribution that shields the user from the command line yet still provides the bulk of Linux's power with GUI tools, while allowing advanced users access to all of what makes Linux great. This is much sooner than you think, hopefully before Windows 2000 hits the shelf. Next in the battle is mainstream apps, not just games like Quake. We need an app like Quicken (I would like this too, so I don't have to reboot to winblows), educational software, etc. Also, a unified installer would be nice. A lot of Windows apps use InstallShield; a graphical version of a Makefile would be great. Anyway, just thought I'd throw out some ideas.

    Spyky
  • This will keep people from modifying their system.

    As a joke I occasionally threaten to make a system called "Gnulix", which in my mind should be implemented with a Linux kernel, the Tcl libraries, and a Tcl interpreter.

    All of the system utilities will be written in pure Tcl, and the shell will be tclsh.

    If I have such a perverted system I don't want some application installer bitching about it.

  • One distribution? Well, yes, the same way Debian is the only Debian distribution around? :-)

    --
    Matthew
  • I can see Micro$oft releasing a set of libraries and drivers such as to accommodate a "Windows on Linux" arrangement which they would sell, thus making Linux a "friendly" and sanctioned environment (but slower, of course) for their apps. Remember the genesis of Windows? -- a GUI on top of DOS. I'm quite sure they can detach both GUI and browser from the actual OS given sufficient fear and/or greed, despite the hype to the contrary.
  • There seems to be an assumption in the computer business that there has to be one winner. It just isn't so. Although some people (typically the PHB types) think it is desirable not to have any choice (because they fear making decisions), it is not. Choice is good. Markets can't be viable in the long run without competition. Any time one vendor controls more than about 40% of a market, the whole market suffers.

    For me, Linux has already won. It works, and it works well. I can work happily without going to something else. I don't begrudge *BSD, MacOS, or BeOS -- or, for that matter, anything other than Microsoft -- some market share. I'd be happy if the *BSDs saw increased popularity, MacOS was resurgent, and BeOS carved out a viable niche for itself. I don't even know if it is desirable for Microsoft to be completely wiped out (although occasionally that desire pops into my head). I'd be happy if they were beaten down under 50% of the market so there were mainstream, viable choices.

  • Sure there's nothing keeping a company like Microsoft from doing a linux dist with proprietary libraries,

    Correct me if I'm wrong, but I thought Microsoft had signed an agreement with SCO or someone that they would no longer produce an x86 Unix. Of course, Linux is not Unix (technically), so maybe this wouldn't apply. Hmmm, of course we could always sponsor Linux for Unix compliance testing; I guess we would basically have to front the money. And when Linux passed, it would be MS-proof. Of course, this assumes MS won't find some way to weasel out of the old agreement. Just my $0.02.
  • A weak Windows is our friend, do you think Linux would have gotten as far as it has if Windows worked as well as advertised?
  • A lot of people think the success of Linux means the doom of Microsoft. Not likely. It just means they'll have to play fair again.

    I believe that the success of Linux means the doom of Microsoft, or at least of Microsoft as we know it. The success of Linux would force Microsoft to play fair, but that is not their core competency. Either they will make it a core competency right quick, or they will die.

    Either way, the customer wins.

    Apple had it wrong. They used to think that, for Apple to win, Microsoft had to lose; they were correct. They were correct because, for Apple (or Linux) to win, the customer pretty much has to win. And for Microsoft to win, the customer has to lose.

  • Most of that is simply because Linux isn't mainstream... yet. Not even the Open Source community can bother to write drivers for every piece of hardware out there (well, I suppose we could, but who'd want to?). Anyone who's going to take the time to write a stable driver is going to do it for a good device that they really love. That's why we have support for Voodoo 1, 2, & 3 and TNT 1 & 2: because it's good hardware. If your S3 ViRGE chipset isn't supported, it's because nobody liked it enough to put out a driver for it. Drivers for subpar hardware will always have to be written by the manufacturer, and they'll only do it when the OS is mainstream.

    *Except in rare circumstances that I'm sure someone will bring up.*
  • Money will keep Microsoft from coming out with MS Linux. Operating systems have been Microsoft's cash cow for nearly two decades. Microsoft is used to having that incredibly rich stream of income. They won't do anything to hurt that cash flow.

    Linux has already hurt Microsoft in the pocketbook -- to the tune of millions, perhaps billions, of dollars (remember the decision the Mexican educational system made to go with Linux instead of MS products?).

    If Microsoft introduced MS-Linux they would be supporting the competition. Every ten free downloads of MSLinux would represent a loss of X sales of Windows 98/2K/whatever. MSLinux would have to be very cheap to sell in any significant numbers. A purchase of a $50 copy of MSLinux might lose Microsoft a sale of a $10,000 copy (with CALs, etc.) of Win2000.

    MSLinux would also help "legit" Linux in a big way. Big businesses that fear buying non-Microsoft-blessed software would rush to MSLinux (and away from Windows) in droves. Linux would finally have that penetration into the Fortune 500 market! B-)

    Now wouldn't that be ironic?

    To add insult to injury, MSLinux might mean Microsoft would probably have to contribute source code, which would (arguably...) make Linux more competitive with Windows.

    There's also the expert factor. Expert Linux users are likely to be ABMers. When an expert Linux user makes a recommendation to a business customer, would s/he recommend MSLinux unless it were dramatically superior to all the alternatives?

    Microsoft *BSD is marginally more likely. *BSD hasn't taken off like Linux has and will therefore be much easier to co-opt. However, there is still the money issue: every sale of Microsoft BSD would probably represent a lost sale of a much more expensive Windows product.


    I'm an ABMer and I'm proud.
  • Like I said, my mom installed Win95. And the printer. The network connection someone helped her with (it's her work computer). No, she couldn't do everything from day 1. She still can't. But she knows to hit F1 for help. My mom can't remember the "del" command from DOS. She does remember to hit the Del key in the file manager to delete files.
    As for the computer crashing -- it happens extremely rarely. Once a year, maybe. When it happens, it's usually something I can say "Well, yeah, hitting Ctrl-Alt-Del does reboot the computer" to. Occasionally there is a serious problem. But she knows to save often, so she's never lost more than 15 minutes of work.
    As for reinstalling everything -- ummmmm. I think that happened when she got the new computer, about 3 years ago. Since then... no reinstalls. And she would've had to install Linux and everything with the new computer too.
    -cpd
  • That illustrates what tends to be the problem with the Open Source community, they'll produce the stuff that they personally want, but other things tend to be ignored.

    I disagree that this behavior is a fault. This behavior provides solid tools designed with the user in mind. This is opposed to the phrase "we develop with the customer in mind" meaning "marketing came up with some neat ideas they think will enable us to sell more product."

    Too often the commercial development world is pushed by marketing rather than sound technical design. Granted, this implies that a company is listening to its customers, but often it actually leads to shoddy development. Even Microsoft insiders have complained that bug tracking and design have been sacrificed to last-minute implementations requested by the project's marketing department.

    Furthermore, it all depends on who has the "itch". If it is someone wanting a "cooler" MP3 player (or front end to mpg123), then that's what gets coded. But if it is a corporate interest with a specific goal, THAT is what will get developed. WordPerfect being ported to Linux is obviously an attempt to regain market share that has been savaged by Microsoft Word. Likewise, Red Hat is not funding development of GNOME because the dart happened to hit "GNOME" at random; Red Hat needs a solid desktop environment to expand its market.

    But this is all software. Many of the issues you listed were hardware related. And there we run into a completely different animal.

    Microsoft does not develop drivers for specific hardware. If I'm wrong on that, please feel free to educate me. However, my understanding is that if XYZ Hardware wants its newest product offering to be profitable, it develops the driver for Microsoft Windows and offers it to Microsoft to be included in the OS distribution. Updates to the driver are published on media and offered for download.

    How those devices work is often a highly guarded secret. They are proprietary, to say the least. And so if anyone is going to write drivers for an OS, it's more than likely going to be the manufacturer and nobody else. Kudos to those who manage to black-box closed devices and get them to work under Linux.

    Developers willing to code for hardware devices will not solve this. It's the manufacturers' attitudes that will have to change. They're going to have to see an economic incentive to support additional OSes (like Linux). For my part, I only buy devices that will work with Linux (and consequently tend to advise less technical friends to buy those devices I know and like). I've even gone so far as to email a company saying "I like your product, but since I couldn't find it on any Linux compatibility listing, I will be buying your competitor's offering, which is listed, instead."

    The whole issue depends on economies of scale. As more parties become interested in Linux, the OS will do more. That's why I want to see it become popular. If Linux fails to achieve that popularity, I doubt it will be at the hands of its developers.

  • Mindshare is also a reason that Microsoft would be more likely to do a *BSD-based OS than a Linux one. They wouldn't really want to support or market another OS because they wanted it to succeed to the point of challenging their proprietary OSes. What they would want to do is splinter the open-source OS market. By putting their eggs in the *BSD basket they would divert a lot of mindshare away from Linux in a 'divide and conquer' sort of way. If they did a Linux distribution, they would only add to the Linux mindshare.

    The *BSD license is also much more friendly to proprietary OS vendors (like Apple). Traditionally, a lot of the commercial *nixes were either *BSD-based (SunOS 4.x and older, Ultrix, etc.) or contained a number of "Berkeley enhancements" (just about everyone else).

    Personally I think it is unlikely that either will happen anytime in the near future.

  • if i hear the words "linux fragmenting" once more i'm gonna scream! people miss the _very_ important point that unix fragmentation was as much, if not more, HARDware fragmentation as SOFTware. the software fragmentation followed because the unix players wanted to differentiate their products to help sell more hardware and protect their markets.

    sure, linux dists differ in flavour somewhat, but they are all binary compatible within a given arch which is an important distinction. heck, most other x86 unices can run linux binaries as well. with widespread binary compatibility like that, i just don't see how linux could fragment that deeply.

    and problems with things breaking between releases is hardly specific to linux. pretty much every OS i've ever used has caused headaches when upgrading to new releases. or, in the case of NT, even between service packs.

    i think it'll help having something like the LSB which will state in writing what "linux" is and what it includes, but even without it there's enough of a base and momentum that code forking of the level of the old unices or even the various *BSD species on x86 is very unlikely. and if M$ ever came out with a linux dist, that would be a clear sign that they're admitting defeat to linux's world domination.

    tim
  • I don't think the situation WRT Linux and mainstream users is as hopeless as you make it out to be. Here's why:

    Most end users, when they get their PC, don't want to have to learn commands or figure out shortcuts. They want to take it out of the box, plug it in, and have it run.

    True, to a point. However, I think you've bought excessively into the myth of Positive Windows(tm) OOBE (Out-Of-Box-Experience). Linux installation is not that much harder than Windows installation -- it's just that 99.44% of computer users get Windows pre-installed. A pre-installed (or friend-installed) Linux would avoid that same pain. And have you actually watched somebody with no computing experience (or even no WIMP-GUI experience) sit down in front of a Windows computer for the first time? People have to actually (gasp!) read the "Getting Started" documentation, or else have personal handholding, or it just doesn't make sense the first time staring at the Start button.

    Also, in my experience, after initial familiarity is gained, many people start to graduate into the "power user" category. At which point they end up delving into topics such as the Windows Registry, DLL incompatibilities, and the various "power user" tips and tricks of Windows. All of which constitutes a formidable body of arcana, which is not made magically easier by the fact that it comes from Redmond, WA. Yet normal end-users tackle it anyway.

    You just point, click, and the application runs; you don't have to download applications and then compile them.

    Funny, I just point, click, and the application runs, once it's been installed, under Linux as well. Just like in Windows. Except without the BSOD. :^) Surely you're familiar with the existence of GNOME, KDE, AfterStep, Window Maker, CDE, etc.?

    As for compiling, end users don't have to do that with Linux either. That's the whole point of Caldera, Red Hat, the Debian project, etc. Of course, if you want to, you can, but it's hardly required, unless you want to live at the bleeding edge of progress.


    "One smaller motivation which, in part, stems from altruism is Microsoft-bashing." -- Vinod Valloppillil (Microsoft), Halloween Document I [opensource.org]

  • Sorry, you are incorrect. Macmillan's Mandrake 6 DOES NOT require you to use PartitionMagic to install it; the CD is bootable and very easy to install w/o PartitionMagic.
  • I disagree. Microsoft can and quite possibly will create their own Linux distribution, if Linux gets to a point where it is clearly defeating Windows. (It's an open question whether it will reach that point, but it could.)

    What does Linux have to offer that BSD doesn't? Mindshare. Linux already has more mindshare than BSD, and it's growing. (This may be unfair to BSD folks, but that's the way it goes sometimes.) If the public perception shifts to Linux over Windows, I would fully expect Microsoft to "embrace and extend" by offering their own version of Linux (to tap into the mindshare), and adding proprietary features via binary-only kernel modules so that other Linux vendors couldn't compete. Quite possibly, Microsoft applications for Linux would then require those proprietary features to operate.

    I think if and when we ever see "Microsoft Linux" or "Microsoft Office for Linux", that will be an implicit admission from Microsoft that Linux has won the war with Windows. No doubt Microsoft would try to spin it in such a way that many people would believe that they were always strong supporters of Linux. They might get away with it, too.

    Don't underestimate Microsoft; they're very good at what they do best -- marketing, and crushing their competition. It's too bad they aren't as good at actually making stable products...
  • >[...] but Linux DOES claim to run on Intel PCs

    And supports them rather well. As to some of the peripheral hardware that exists, that can be another ball game.

    I have a sound card that won't work under Linux. I'm not complaining... I really didn't expect it to. After all, no Linux drivers were available and the thing never did work right under Windows... Maybe I should have been happy that I could play the first half second of any wave file I wanted and nothing more (no, I'm not kidding; that's exactly what it would do).

    As Linux becomes more popular, a number of these problems will vanish. Why? Realizing the lucrative market, hardware vendors will write Linux drivers and/or open their "proprietary standard" or, best case, dump the proprietary bits (assuming building a capable product is possible without their proprietary bits). Let's be real: most hardware makers don't make money off their software driver efforts. Generally, the drivers are free but only function with the device in question.
  • Does the end user need to learn commands, etc., in order to use Linux (or another *nix)? At one time, in the days when end users had terminals rather than computers on their desks, they were presented at login either with the application they used or with a menu to select one of a small number of applications. Similarly, when end users first got "personal" (not necessarily IBM PC) computers, these were configured by the system administrators, and (where I worked) the end users were again given a simple menu to select which application (e.g. WordStar, SuperCalc, or dBase) to run. Shouldn't the same thing apply to end users on a *nix system now: the system is configured for them (i.e., they don't need to know the shell commands) and all they do is run the applications?
  • Over the years, Linux developers have tried to make Linux more and more mainstream and friendlier to the average user. I'm wondering why.

    Average users already have an OS they are happy with; they don't care which OS they are using, and they don't care much about freedom and "philosophy".

    On the other hand, until Linux, programmers, hackers, and geeks didn't have a decent operating system to use (Unices were expensive and non-PC).

    So we went out and built ourselves a great OS, only to watch it become yet another mainstream OS.

    Why do we need that? Why can't there be two OSes, so everyone can choose the one that fits their needs?

    I believe Linux should remain geek-oriented and free. It has worked great so far; why not continue?

    I'm not saying we shouldn't make it friendly. Most geeks appreciate a friendly system as well; it makes our lives easier and lets us concentrate on our important goals. But let's keep in mind what makes Linux unique.


  • As Linux becomes more popular, a number of these problems will vanish. Why? Realizing the lucrative market, hardware vendors will write Linux drivers and/or open their "proprietary standard"

    This will certainly happen. Creative has released SoundBlaster Live! drivers, for instance.

    But wait till you see the community scream when these vendors don't release source code. In truth, many won't be able to even if they wanted to, because of patent problems and whatnot.

  • I think we're missing something here in how the computer market will develop. Up until fairly recently, we were in the "pyramid stage": the newbie user (bottom of the pyramid) learns an OS over time until they become an expert (top of the pyramid).
    However, I think we're entering the beginning of an inverted-pyramid model. Mainstream users want instant-on machines that are blissfully simple to use and run their limited set of apps. There are already initiatives to make this happen on Wintel machines. As high-bandwidth connections become prevalent, the mainstreamers will prefer more of a network-computer-type box. It'll be plenty powerful to run their apps, with high-speed Net access letting them run other apps on remote servers or, if they're daring, download software. So if newbies start in this model, they'll either tend toward the NC-type machine, because it's easy, fast enough, and versatile enough for their needs, or toward a traditional computer, on which they can upgrade hardware, keep a couple of OSes installed, and so on.
  • Then one day, a friend would offer to install some wicked free app that didn't come from MS, and it wouldn't work. And the MS Person(tm) would blame the developer of the program, not Microsoft.

    Not one application -- thousands of them will stop working. People have to download applications from developers' sites from time to time, even for software that is in distributions, because distributions lag behind development. If everything stops working, the distribution will lose its credibility instantly.

  • The difference is that if Mom wants me to figure out how to stop the spam from her mailbox, I have a prayer in Linux. I don't have one in Windows.

    I think Linux has some work to do in pacing the market, not simply following it. To a certain extent we have to teach them to want different things than they do now.

    In other words, show mom how to read her email, and promise that you'll be around to help her maintain it (remotely, even). And she won't have to search the MS knowledgebase or pay $95 for an incident report. If you're smart, though, you might want to charge a slice of apple pie per support call.

    In other words, Linux can already meet the needs of the mainstream, but in a different way than MS does.

  • About the driver issue:

    Oooh I was waiting for this! I work full time and have a family-- I don't have the time to write and maintain a driver. My wife gets upset when I spend too much time using the computer as it is.

    Anyway is that the answer we want to give the newbie who barely knows how to install Linux, and has probably never coded? Write a driver?

    No, that's not the answer. But let's look at it this way: you're talking about the symptom, not the problem. The problem is that idiot hardware manufacturers don't provide specs for their hardware, and they don't provide Linux drivers. I don't really care about them not providing their own Linux drivers, but I really do mind them not providing specs. That is the problem. We need to apply pressure there.

    -k

  • Sun Tzu wrote:

    Windows only appears to be easy to use and administration-free because of several factors, including this naivete and, of course, pre-installation.

    Thank you. I don't know how many times I've said this to everyone who says "But you can't expect the average user to use Linux -- they couldn't even install it."

    Most "average" users couldn't even install windows. Really. I've seen them try.

    -k

  • UNIX is a brand name; someone owns the trademark. Linux is not UNIX. Evidently there are some sort of hoops we have to go through to actually be declared 'UNIX'. POSIX is more like an API; it's just a standardization. Just because you're POSIX-compliant doesn't mean you're UNIX. NT is (supposedly) at least partially POSIX-compliant. Bwahahahaha. Sorry. FWIW.
  • I purchased the Mandrake 6.0 distribution and yes it did come with BootMagic and PartitionMagic, but there is nothing that forces you to actually use them. In fact, I've set up a dual boot NT/Linux machine and I haven't touched either BootMagic or PartitionMagic. You can set it up to use LILO off of the HD or do like I did, and set LILO to boot off a floppy. If the floppy's in, then Linux boots. Pull the floppy out, NT boots.
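    The floppy-boot trick described above boils down to pointing LILO's boot= directive at the floppy device instead of the hard disk's MBR. A minimal sketch of /etc/lilo.conf (device names and paths are examples, not the poster's actual setup):

    ```
    # /etc/lilo.conf -- minimal sketch; run `lilo` afterward to write the
    # boot sector. With boot=/dev/fd0 the MBR is left untouched, so NT
    # boots normally unless this floppy is in the drive.
    boot=/dev/fd0
    prompt
    timeout=50

    image=/boot/vmlinuz
        label=linux
        root=/dev/hda2      # example Linux root partition
        read-only
    ```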
  • I think another point of my last post could be the issue of updating the kernel. Assuming that Linux does make it mainstream and some company makes it their standard: okay, great, but let's say the Linux distro they include on their systems ships with kernel 2.2.5. A lot of users new to Linux don't know how to upgrade their kernel. Yes, reading material is available on how to do this, but most people I know don't read such materials. They figure "I can do this on my own with no one's help, or I can't do it at all," until someone else comes along and shows them how. So I think we will see (if Linux goes mainstream) users with older kernels who don't know how to upgrade them. With the way kernels are coming out these days, some users might even find themselves with development kernels, which for a new user could mean data loss. Also, look at how kernels are coming out these days -- 2.0, 2.1, 2.2, 2.3, and now there's talk of 2.4. I really think kernels are coming out too quickly now. So one thing that might hold Linux back in the end is kernels (and also lib compatibility).

    Now I'm sure comments will be made about what I just said about users with older kernels. The reason I see this as a problem is that, as with Microsoft, there is always someone out there finding problems and exploiting them. So if a kernel, say 2.2.4 (just making something up), has a bug in it, someone will exploit that user sooner or later. That user might not know how to update to 2.2.10 (assuming that has the bug fix), so he becomes mad at Linux. He is then forced to either A) read and update, B) buy a new version and reinstall, or C) just get fed up and go get another OS completely -- I know some people who would.

    Another bit of my .02
  • But NTFS is faster and has much better recoverability.
    Generally, the Linux filesystem is one of the worst parts of the whole OS.

  • I think that you may not quite understand, or that you're deliberately distorting things to force an issue.

    Those people with unsupported hardware either need to write their own drivers or ask others to help out. In this community people frequently try to fix their own personal problems, expecting that their work can be passed on to a silent majority of people with the same problem but no means of correcting it.

    I'm sorry that you don't have time to work on the drivers you need, but it looks like you've got better things to do, and that's okay.

    What's not okay is for you to bitch at other people, telling them that they need to scratch your itch before their own.

    If the community spent all of its time satisfying the needs of newbies, nothing significant would ever get done.

    The answer we ought to give to newbies is that they can rely on the kindness of strangers -- but that that path is a bumpy one, since strangers don't have to be kind. It's better, then, for a newbie to actually learn something, shed their newbie status, and solve their own problems. Who really likes to be helpless? Is it not better to become more self-reliant?

    There are a lot of newbies nowadays, and I think we'd be better off making it easier for them to start down the road toward becoming experienced hands, rather than catering to their needs. After all, most newbies are helpless; they'll always have the same needs, and we'll all be bogged down in them. To constantly let newbies remain newbies is analogous to not teaching your kids to read just because they don't want to.
  • That's the major reason that the KDE and GNOME projects are working on cross-compatibility issues right now. Over time, the experiences of these efforts will lead to a standard way of interfacing between desktop environments.

  • It is all marketing. It wouldn't be "Microsoft Linux". It would be "Windows for Linux".

    "Support both Windows and Linux from your PC!"
  • What's not okay is for you to bitch at other people, telling them that they need to scratch your itch
    before their own.


    To clarify my post: all of my own hardware works under Linux, but that's because I built my own PC and only chose components that are known to work with Linux. I am not whining for drivers for my own needs.

    Most people would rather buy a PC off the shelf of a Superstore, and are often stuck with whatever hardware their PC came with. If they want to run Linux they may be out of luck.

    I know that the Open Source community scratches its own itches first. This is understandable; it's human nature. However, there are people out there, notably ESR, who make Open Source out to be something that will solve any problem.

    If the community spent all of its time satisfying the needs of newbies, nothing significant would ever get done.

    Agreed, but then we need to stop pretending that Free Software/Open Source has all the answers. The reality is, if the Linux/OS community doesn't care enough to provide something, companies may see the need and provide proprietary solutions, which will make the end user happy but infuriate the Linux/OS community.

  • Ooooo, perfect opportunity for my story! My roommate decided to install Linux. This made Windows unhappy. So he moved everything over to that partition and tried to reinstall Windows. He therefore had the opportunity to install both within 24 hours of each other. The Windows install is easy because it makes too many assumptions. For instance, it assumes you want one partition (duh, we all knew that). Then it assumes you have a standard VGA video card, no sound card, and don't want to dual-boot. Practically, it makes sense to say that Linux should not assume all these things, since it's usually not installed on its own, but idealistically it should be equal with Windows. My friend was astounded that it didn't even bother to ask about his video card, so he went to the control panel, clicked on display, and then it had a kernel error and crashed! With nothing installed at all. It crashed. What kind of an installation is that?
  • If I have such a perverted system I don't want some application installer bitching about it.

    The installer should bitch about _particular_ modifications that are considered harmful (say, having major system utilities depend on COM support).

  • If the Windows hegemony can be attributed solely to the ease of use of Windows, then how do you explain the DOS monopoly that preceded it?
    For example, why was the functionally equivalent DR-DOS crushed? Were its commands any more difficult to use than those of MS-DOS?

    Yer clueless, and you are recycling the same old FUD about how UNIX systems have no graphical interface. Your $0.02 is worth about $0.0002 where I'm sitting.
  • Only a Windows head would complain about too much choice. It's like complaining that there are too many brands of cars, and that someone used to a Ford would be completely in shock if they had to drive a Toyota. Sure, some controls will be in slightly different places, and look a little different, but in a few minutes anyone with half a brain will have adjusted. I move back and forth between several different GUIs on completely different platforms every day, and it's not a big deal. The only one I find really limiting is Windows. As for mixing KDE and Gnome apps, there is work in progress to make themability interchangeable between them, so your arguments may not hold any water at all in the future.

  • Yes, it will take a while to write the drivers. But USB looks like it will be a big part of the future of PCs and Macs, so if Linux is serious about these platforms, it cannot be ignored.

    USB has been around for a couple of years, yet Linux support is just starting to materialize. NetBSD has had USB support for a year or more.
  • I have started using NetBSD, and have hopes of migrating all my Unix needs over to it, away from Linux. The channels for communication about use of the system are just too clogged up in the Linux community.

    I've noticed more and more clueless posts appearing in the FreeBSD Usenet discussions, and am glad that I elected for NetBSD instead. It seems to be a more diverse platform (running on tons of hardware types), with a community that concentrates on the core value of a Unix system, which is NOT (IMHO) making it a general-purpose platform for the masses.

    I am not saying this to pit communities against one another, these are just my observations.

  • "What if someone decides to do something you hate with your program, such as make changes that preserve compatibility with prior versions and break compatibility with your latest release, and then spend millions of dollars to promote their version? As long as they're in compliance with your program's license, you couldn't stop them. If a "corporate Linux" can guarantee better release-to-release compatibility, easier installations, and most of all, better usability, then the mainstream users will choose it over the "real Linux" by an overwhelming ratio."


    I said something very similar [slashdot.org] on Slashdot, a few weeks ago.

    If redhat^H^H^H^H^H^Hany distribution were to become a "standard", they could exist comfortably, due to the nature of the GPL. Sure, release your code, freely distribute your software -- but strike deals with companies to only guarantee compliance/compatibility with your software. You're still adhering to the GPL, but this still comes at the cost of the linux community as a whole.

    Consider this. M$ could easily package a linux distro, distribute it (even packaged) for free. They could port over all the MS-Office apps, etc, and a windows look-and-feel GUI. Before you know it, you would have many companies flocking to M$ to port their apps to MSL, the contract of which would include some sort of exclusivity, while still staying open. At this point MS would again have become some sort of standard, while adhering to the parameters of the GPL. The end question the linux community must ask itself then, is "what has been gained"?

    This isn't anti-MS talk, it's the realisation that any "standard" must exist within the linux community itself, and not with a corporation which has the power to strike exclusivity deals. This said (again), the whole Redhat situation will continue to concern me for some time...
  • Why is it the press must focus on overthrowing Microsoft in the mainstream market? Why does everybody insist that Microsoft is the evil empire and that Linux is their tool to defeat it?

    I use Linux because if I don't like it, I can change it. That ability will never go away. The whole concept of copyleft has changed all of the rules.

    Microsoft is NOT the enemy and it never has been. Linux will only die if the developers lose their love for it and leave. It might lose market share or gain it. It really doesn't matter to anybody except the marketroids. Linux will always be there, and most of us don't give a damn what percentage of computers run it.


  • And HOW, pray tell, are you going to do this, when you distribute the source code? You can't prevent people from modifying your GPL'd source code. Haven't you ever read the GPL?

    I will just keep code with warnings on my site -- if everyone does that, upgrades from developers' sites will make even the dumbest PHB think that there is something very wrong with that version if everyone calls it sabotaged.

    The alternative for the offending company is to build their own closed world, "port" a large percentage of software, and keep users from using developers' sites for upgrades, but at the scale of Open Source development that's impossible, and people can get very suspicious.

    Large-scale boycott however is very easy.

  • No, it is easy. Most people who get that error message will assume it was something they did.

    You or I might find the code, but no one said GPL code had to be "good" code. It just has to warn that "bad versions of the kernel are in use" and offer to recompile with "good versions". I.e., kill the MSFT "extensions" and replace them with good code.

    If they want they can comment out the code and rerun the make. But 90+% won't.

  • "Where are all the applications? In nearly all cases, same place you find Linux applications. FreeBSD has excellent Linux emulation."
    You see that as acceptable? You're holding up Linux emulation as an excuse for there being few native applications. If such is the case, why make things difficult? Why not run a real Linux?

    "As for source-based applications, as long as the developer hasn't been sucked into writing Linux-specific trash..."
    Here is a big problem. Both camps like to accuse each other and bicker over licenses. Your above statement is the perfect thing not to say. Why not make informed, non-inflammatory remarks?
    -Clump
  • Begging the Question is a fallacy in which the premises include the claim that the conclusion is true or (directly or indirectly) assume that the conclusion is true.
  • What we need is a GUI that is scalable in complexity. To some extent that already exists due to the ability to use different window managers, but since they are mostly all designed for the same audience they aren't different enough.

    Picture something like ICQ's Simple and Advanced modes, perhaps with a slider-bar somewhere on the desktop or even an agent that watches the user and tries to guess how literate they've become. When you're in the simpler modes, a lot of the features of the GUI are turned off (eg: you can't resize windows, there are no virtual desktops, etc.); the applications can read the level of complexity and use it to customise their own interfaces, particularly online help (eg: some of us can get what we need from man pages, some need wizards).

    In particular this would assist in the installation of new programs (which inexperienced users don't do that much, anyway), eg: the package manager doesn't ask people in simple modes where to install things, it just puts them in the default. Even when installing the OS there should be different levels of hand-holding.

    The way to prevent fragmentation for the sake of user-friendliness is to let companies like Corel and Caldera work on the Simple side and organisations like Debian work on the Advanced side. To this end a single distribution could have modes for different kinds of users, eg: Apple could provide short-cut keys, look & feel, etc. that would be friendly to a Mac refugee.
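    A rough sketch of what such a complexity setting might look like. The config key `ui_complexity`, the level names, and the feature names are all invented for illustration; the idea is just that applications read one shared level and scale their own interfaces accordingly:

```python
# Hypothetical desktop-wide "UI complexity" setting that applications
# could read to scale their interfaces up or down. The config key name
# and the level names are made up for this sketch.

LEVELS = ["simple", "intermediate", "advanced"]

def read_complexity(config_text):
    """Parse a one-line setting like 'ui_complexity = intermediate'."""
    for line in config_text.splitlines():
        if line.strip().startswith("ui_complexity") and "=" in line:
            value = line.split("=", 1)[1].strip()
            if value in LEVELS:
                return value
    return "simple"  # default to the most hand-holding mode

def features_for(level):
    """Each application maps the shared level to its own feature set."""
    enabled = {"wizard_help": True, "resize_windows": False,
               "virtual_desktops": False, "man_page_help": False}
    if LEVELS.index(level) >= 1:   # intermediate and up
        enabled["resize_windows"] = True
    if LEVELS.index(level) >= 2:   # advanced only
        enabled.update(virtual_desktops=True, man_page_help=True,
                       wizard_help=False)
    return enabled
```

    A package manager could consult the same level: in "simple" mode it skips the install-location question entirely and uses the default.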

  • I actually don't think Linux needs more developers. I think Linux needs more good developers.

    Some time ago, while discussing open-source development, someone told me it's better not to have a program at all if the alternative is having something that is not well implemented. And I have to say that he has a point there.

    It's not uncommon to see people on the Be Developers' List who are pretty down on open source. This is something that always made me wonder, but I've come to realize that most people think (and in some ways are right) that most open-source software is not well done. But, because it's free, it'll take customers away from them. Is this fair?

    I think that before people start coding they should think a lot about what they want to do. They should consider a lot of variables like UI, quality of code, documentation, ease of use, etc. Not just code, code, code. These are often overlooked in open-source software. I use Linux, but I'm finding myself more and more drawn to the BeOS because of its elegant API, its speed (the actual speed and the perceived speed), and its good looks (even though Enlightenment has the potential to make things look better).

    Now, I'm not against power and configurability, but why must things often be done at the expense of usability? There are a lot of things that could make our lives easier. Who wants to be tweaking their OS to death? Well, I guess a lot of people, but I think a whole lot more prefer just to use it. And if we, developers, can, in the meantime, build tools that will make our lives easier, too, great!

    I think a lot needs to be re-thought about the ways open-source people work. We have the power to make things excellent! Why do we so often choose to make them average?

    So, I would dare say that we need less code. Given a choice, I would rather have an OS with its apps being slowly, but carefully, developed, than having half-implemented solutions that end up not serving anyone. I would rather have a good, say, text editor, than 5 average ones.

    Am I wrong in here?

  • If redhat^H^H^H^H^H^Hany distribution were to become a "standard", they could exist comfortably, due to the nature of the GPL. Sure, release your code, freely distribute your software -- but strike deals with companies to only guarantee compliance/compatibility with your software. You're still adhering to the GPL, but this still comes at the cost of the linux community as a whole.

    How so?

    I guess I'm shaky on the whole "distro-specific" thing. I know Codewarrior is supposedly released for RedHat Linux only, but what does that mean? Why couldn't I load it into a Caldera machine?

    Anything Microsoft releases under the GPL will, if it's good enough, be adopted by the other distros. If it's still proprietary code, then the people who want to use Linux aren't necessarily going to use it -- Apache will still beat a comparable MS product if I am required to pay for MS's product, even if the performance and stability are equal.

    Jay (=
  • Just as with those that would use Windows, the average user will always be a bit behind the cutting edge... I really don't think that Linux development is going to suffer because of it... those that would use the bleeding edge already are, those that wouldn't aren't... no big deal...

    I think if Linux can make absolutely everything point and click and really get up to speed with the latest hardware, so the average user can really plug and play with the penguin, then the future will be bright....if not, well then I guess it will continue to be just those of us that have been ENLIGHTENED...

  • Agreed on the debate tactic comment there. Just, when making examples about technical issues, remember that UNIX-style OS's often have 1,001 ways to do the same thing. Heck, customizability and flexibility are the things I love about this side of the OS fence.

    The main reason that FreeBSD is so source-centric is that the source is constantly being updated. This isn't to say that it's necessary to actually track -STABLE or -CURRENT regularly. I know many people with boxes still running off of 2.1. Not because they couldn't update them, but it's stable on their setup, and why fix what isn't broken? And for those of us who like to play with the new toys and features, the CVSup servers make it a breeze to keep our /usr/src up to date. And when there's a bugfix, you'll get it as soon as you cvsup.

    As far as non-OS stuff like gimp, it would be simple enough to write a script to check ftp.freebsd.org's package repository, check the version number of the specified program against that which is listed in the installed package database, and if it's newer download and pkg_add it. In fact, it's probably been done already. All binary, no time for compiling.
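    The update-checking logic described above might be sketched like this (in Python rather than shell, with the actual FTP fetch and pkg_add invocation left out; the package names and index format are invented for illustration):

```python
# Sketch of the core of an update checker: compare each installed
# package's version against the one listed in a remote repository
# index, and report which packages have a newer version available.
# Fetching the index from ftp.freebsd.org and running pkg_add are
# stubbed out; only the comparison logic is shown.

def parse_version(v):
    """Split '1.0.4' into (1, 0, 4) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def needs_upgrade(installed, repository_index):
    """Return names of installed packages with newer remote versions."""
    stale = []
    for name, version in installed.items():
        remote = repository_index.get(name)
        if remote and parse_version(remote) > parse_version(version):
            stale.append(name)
    return stale

installed = {"gimp": "1.0.4", "xv": "3.10"}   # from the package database
repo = {"gimp": "1.1.7", "xv": "3.10"}        # from the remote index
print(needs_upgrade(installed, repo))         # ['gimp']
```

    Comparing tuples of integers rather than raw strings avoids the classic trap where "1.10" sorts before "1.9" lexically.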


  • Most people would rather buy a PC off the shelf of a Superstore, and are often stuck with whatever hardware their PC came with. If they want to run Linux they may be out of luck.

    Agreed, but then we need to stop pretending that Free Software/Open Source has all the answers. The reality is, if the Linux/OS community doesn't care enough to provide something, companies may see the need and provide proprietary solutions, which will make the end user happy but infuriate the Linux/OS community.

    Well, like I had said, I think that the best thing to do is to encourage people to want to learn about things, including the details on their hardware, and how to support it if it isn't already. I don't think that OS is necessarily going to provide all of the solutions, although certainly the more people that really use it, the better it'll work.

    As for commercial software providing solutions, I don't really think that this is as likely as you do. Most companies are impatient. Rather than go for steady, modest profits, they frequently go after big markets and big profits. Think of the difference between a small software house and Microsoft. Any niche big enough to support a generic, impatient company is likely to have a group of OS developers already. The tricky bit, we agree, is getting in between the cracks, and not very many companies are going to want to do that.

    Too many people want to Make Money Fast, and since a lot of them are in management, they can frequently give orders to programmers to ignore the vital cracks and go for the high profile stuff.

  • The article raises some good points. Some Linux developers get all up in arms with the mention of things like "binary-only kernel module". Yet the Linux community has so far not delivered on many of the things it needs.

    Some examples:

    USB? If you're lucky your mouse will work.
    PNP? Linux makes this more difficult than non-PNP
    Parallel Port Scanners? Forget about it..
    3D? 3D works great... IF you have the right card

    But if you need an MP3 player, you're in luck! Freshmeat lists 77 entries under Mp3.

    That illustrates what tends to be the problem with the Open Source community, they'll produce the stuff that they personally want, but other things tend to be ignored.

    Don't get me wrong, I love Linux, and use it for almost everything. I do get frustrated at times because I need to reboot into that other OS to accomplish something.
  • One that supports links, isn't plagued by the archaic and irritating 'drive letter' scheme and that doesn't do funky things with mixed case filenames.

    NTFS is saddled with major annoyances in order to make it be able to coexist with regular Windows (and even MS-DOS) applications.

  • You have made a good point, but you have overlooked a couple of things. Windows users will gradually grow more sophisticated over time. The strategy of pretending that administrative issues don't exist so that you need only be a "user" of your machine will grow weaker over time as Windows users gain experience. Windows only appears to be easy to use and administration-free because of several factors, including this naivete and, of course, pre-installation. As users become more knowledgeable, and as other systems are offered preinstalled that boot straight to GUIs, Windows will lose this advantage.
  • by twit ( 60210 ) on Wednesday August 04, 1999 @05:25AM (#1766897) Homepage
    This begs the question: what is it to win, and should linux strive to win at it?

    The implicit assumption is that linux will become a mainstream, Joe Lunchbox operating system, to the detriment of Microsoft (Apple, Be, etc).

    I really don't see the advantage in that. It'll certainly up the demand for commercial applications, and corporations will move to fill that demand. Is that what we really want?

    So many linux users (both way back when and now) couldn't give a rat's ass for free software; they want applications. I'm afraid that if we court the commercial market too strongly, we'll lose, not gain, developers - that great mass of developers who create free software.

    What linux should strive to gain is more developers and especially more free software developers. Commercial ventures seeking to distribute non-free software for linux should be given a run for their money; they should be pressed to advance the state of the art as fast as they can. The point is that we, as a community, don't need more users; we need more developers. Linux users should aspire to become developers, and existing developers should aspire to use their skills on Linux. If we slowly give that up and let corporations handle it for us, then we're also giving up control of Linux, GPL notwithstanding.

    --
  • But who set it all up? I have seen my mother install win95. And M$Office. Yeah she just hit the typical install thing. But it was simple.

    I installed RH5.2 about 6 mos ago. I had no problem. I installed KDE and Gnome. Fairly easy. Thing is I know basic unix stuff. My mother could not have done it.

    With linux you still need to know commands. With Windows you don't. Although you should. I'm sure my mother would be uncomfortable in linux, and therefore would not use it. Yes KDE and Gnome are easy to use. But you still have to know stuff. Most people already know Windows.
    Why would they learn something else?
    Cuz it's more stable? - nah. Mom's computer rarely goes down, and then it's an excuse for coffee.
    Freedom(speech) of Code? - nah. She couldn't care less.
    Free($$$) of code? - That she might bite at. But she'll probably pay for something familiar. How much will you pay for home cooking like mom's?
    Cuz she can run a server? - Mom's response - What's that?
    I still argue that linux's main place is in the server arena, in the hands of computer people, and used by people willing to learn. The masses tend not to want to learn.
    -cpd
  • by Jburkholder ( 28127 ) on Wednesday August 04, 1999 @05:29AM (#1766923)
    I agree with your points, but not your conclusion. (just realized I was replying to the reply, this actually goes to the original post).

    Never say never.

    The points you bring up are certainly going to make it much harder for Linux, or any *nix-like OS, to become a 0-skill, entry-level, first day OS. Look at DOS and Windows. Same old DOS still sits under W95 and is still just as ugly as ever. The new user is shielded from that.

    To do the same to Linux is probably not impossible, though not very desirable. My opinion is that Linux is moving in the right direction to make it more accessible for the novice enthusiast, but will/should probably stop short of the "mainstream" user who has no computer skills and does not want/need them to do their job.

    I'm a big fan of choice. I'm not eager to see Mickeysoft wiped out completely, just put in their place. I can see a world where Linux is a powerful, well entrenched, well supported platform for whoever wants to use it for whatever use provided they have the knowledge to put it to work. And then alternate, less powerful, less entrenched equally well supported platforms are available for those whose skill sets are outside of computing who want to have an easy to use tool to make their work easier.

    My $.0199967352
  • Fragmentation IS a concern, and it may already be happening. In the article here yesterday or Monday about SGI's new Linux servers, I pointed out that they mentioned two specific changes to the kernel -- a replacement TCP/IP stack and more efficient NFS code.

    I asked if anyone had seen a mention of these on the kernel mailing list. There were no replies suggesting that SGI has contributed those technologies to the stock kernel.

    So there's a bit of fragmentation there. What happens if software starts needing a particular NFS implementation, or a particular TCP/IP implementation? Already you see packages coming out for RedHat that are a bitch to install on non-RedHat systems. Hell, IBM's ViaVoice stuff they released a while back bombed out on non-RedHat 6.0 systems.

    As time progresses this is only going to get worse. The work being done to define a standard base of libraries doesn't force application vendors to build against only those libraries -- particularly since (AFAIK) compilers usually link against the most recent ones on the system, and most if not all distributions will have more than just those base libraries on them.

    Under Windows you occasionally see that problem too, with differing versions of DLLs getting mixed up, overwritten, or lost, and the general instability that results. What happens when more applications start coming out requiring versions of the system libraries that aren't necessarily compatible with each other? Are we going to end up with 100 megs of differing libc/glibc libraries on a Linux system? Even with a system like RPM, that's a real pain to keep track of, particularly when you're building new software.

    Has this been addressed by anyone? It'd seem it's an issue for things like Gnome and KDE. If it's March 2000 and I've recently upgraded my Gnome and now have GTK 2.0 on my system, will the binary of Communicator 5.0 I'm running still work? Switching to GTK 1.2 broke most of the older apps I had.

    Using the typical is-Linux-ready-for-the-masses example, would my grandmother be able to straighten out that mess? Hell no!
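    The library-versioning bind described in this comment can be made concrete with a toy resolver. The package names and whole-number "major versions" below are made up; the point is only that a system insisting on a single copy of each library cannot satisfy two applications that pin incompatible versions:

```python
# Toy illustration of "DLL hell": each application declares the major
# version of each library it was built against. A resolver that allows
# only one installed copy per library reports every conflict it hits.

def resolve(apps):
    """Pick one major version per library; return conflicts found."""
    chosen = {}      # library -> major version already committed to
    conflicts = []   # (library, committed version, requested version)
    for app, requirements in apps.items():
        for lib, major in requirements.items():
            if lib in chosen and chosen[lib] != major:
                conflicts.append((lib, chosen[lib], major))
            else:
                chosen[lib] = major
    return conflicts

apps = {
    "communicator": {"gtk": 1},   # binary built against GTK 1.x
    "new-gnome-app": {"gtk": 2},  # wants GTK 2.x
}
print(resolve(apps))  # [('gtk', 1, 2)]
```

    The usual escape hatch is to allow side-by-side copies of each major version (hence the "100 megs of differing libc" the poster worries about): that makes every conflict above disappear at the cost of disk space and bookkeeping.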
