The Short Life And Hard Times Of A Linux Virus 191

Sun Tzu writes, "There are several reasons for the non-issue of the Linux virus. Most of those reasons a Linux user would already be familiar with, but there is one, all-important, reason that a student of evolution or zoology would also appreciate ... The article is at sitereview.org."
  • by Anonymous Coward
    root is always expected to be UID 0. None of the code I've looked at ever does a check for this (with getpwnam() or whatever) but just assumes this is a fact. So there's quite a bit of hardcoded stuff as far as that's concerned (grep for 'setuid (0);' and see how often that comes up...).

    Anyway, a virus doesn't need root access to propagate, it just needs to propagate *somewhere*, whether that's your files, or some other user's files, or even over the network.

    A lot of people here seem to believe that Unix file permissions will greatly hamper a potential virus, but this is dubious in the case of the typical Linux installation (i.e., a user's desktop machine). Quite a few Linux users actually administer their own machine, and thus tend to switch around between UIDs (with 'su'). If a virus is written with the capability of watching for and acquiring passwords (as the user types them), then file permissions are no longer a barrier.
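    The grep the poster suggests is easy to try. A minimal sketch, using a made-up scratch tree (`/tmp/audit` and `daemon.c` are illustrative, not real project files):

```shell
# Create a toy source tree containing a hardcoded root assumption
mkdir -p /tmp/audit/src
printf 'int main(void) { setuid (0); return 0; }\n' > /tmp/audit/src/daemon.c

# Match both "setuid(0)" and "setuid (0)" spellings; prints file:line:match
grep -rnE 'setuid *\(0\)' /tmp/audit/src
```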
  • This describes the typical Unix situation, which is not the typical Linux situation. There, more people have installed their own system and have root privileges.
    This is repeated frequently, but does anyone have proof? The most "typical" installation, especially for newbies, is RedHat, I believe. RedHat's installation procedure makes it difficult to get started without creating a normal user account, and makes it uncomfortable to use a root account for normal work. I don't have any proof that Linux installations are set up correctly as often as, or more often than, most Unices, but I don't have much reason to believe the contrary.

    I am still afraid that I come into a Makefile someday that holds the line:
    install: rm -rf /
    Is this not a virus?
    Your example is in no way self-propagating. It is at most a Trojan horse, not a virus. From the Jargon file:
    virus n. A cracker program that searches out other programs and `infects' them by embedding a copy of itself in them


    --Chouser
  • Point taken -- I should be less confident. In fact, I was already, but being in a sort of Slashdot swagger mode, I felt the need to emote extremism. *sigh* Ah, well.

    But this wasn't my main point. I will concede that 1) mutt has more holes than I'm aware of, and 2) my system is susceptible to much more than just strict "Melissa-style" viruses. My main point, however, was how tempting it is to build holes into higher-level apps. I suspect that security is a significant concern in most open-source network and operating system projects. But I'm afraid that high-level application projects, such as Office clones, tend not to worry about security much. It's for this reason that I bothered posting at all -- we need to make sure that newbie-targeted "productivity" apps don't come with huge holes built in.

    --Chouser

  • Heh. I wasn't slamming RedHat. Obviously I didn't make myself clear enough.

    Being root most of the time while using a unix system is a Bad Thing. Having a unix system with only a root user (and no 'normal' users) is a Very Bad Thing.

    I have installed both RedHat 6.1 and RedHat 6.2 beta a couple of times each in the last few months, so I'm quite aware of how the installation procedure goes. As I said in my last post, RedHat makes it hard for a newbie user to do the Bad Things named in the above paragraph.

    This means that I am happy about what RedHat has done with their installation procedure -- I think it is a Good Thing.

    --Chouser

  • >RedHat's installation procedure makes it difficult to get started
    >without creating a normal user account, and makes it uncomfortable to
    >use a root account for normal work. I don't have any proof that Linux
    >installations are as often or more often set up correctly than most
    >Unices, but I don't have much reason to believe the contrary.

    This changed with RedHat 6.1. You create a user account along with the root account during the install process. If you're going to slam RedHat for something, at least bother finding out something about it first.
  • No, it is not just as simple. You need to not only download in advance the rpms you want, but also all of the dependencies. If you skip the dependencies and rely on the "--force --nodeps", don't expect the installed packages to work.

    Or are you suggesting that one do this on the RPMS directory on the CD? What if one does not want to install all of the packages on the CD? What if one does not even have a CD?

    Furthermore, why would you use "--force --nodeps" to begin with? The only legitimate use I can think of for that (in the absence of broken packages) is if the dependencies were met in a way that rpm does not know about, such as installing RPMs on a Debian system, or software that was compiled from source.

    How are either of these methods simpler than just typing "apt-get install <list-of-packages-you-want>"?

    --

  • I'm just saying that sometimes I get tired of having to install forty-eleven new packages just to get a damned ICQ client to run.

    Try Debian and apt-get.

    --

  • No, on my distro (Debian 2.1) the su command alters PATH even when not run with the '-'. (I checked it out before I wrote the post above.)
    --
  • because a virus run as non-root can probably only write somewhere within the home directory tree of the user (~ or ~/bin, etc.). The su (and similar) commands alter the PATH environment variable to point to only known-good (secure) system executable directories, and to specifically exclude '.' from the path. (That's why you have to say ./cmd to run cmd in your current dir while you are su'd.) It guarantees that the version of 'ls' you invoke is /bin/ls and not the possibly viral /home/luser/ls.
    --
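    The PATH behaviour described above is easy to demonstrate. A minimal sketch, assuming a POSIX shell; the `/tmp/pathdemo` directory and the fake binary are made up for illustration:

```shell
# Plant a fake 'ls' in the current directory
mkdir -p /tmp/pathdemo
cd /tmp/pathdemo
printf '#!/bin/sh\necho "I could have been a virus"\n' > ls
chmod +x ls

# With '.' excluded from PATH, 'ls' still resolves to the system binary
command -v ls      # e.g. /bin/ls

# Only an explicit ./ invocation runs the planted copy
./ls               # prints: I could have been a virus
```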
  • The article rests on the assumption that a Linux virus must (as on Windows) infect binaries. This isn't true; it could just as well copy itself into some corner of the user's home directory (for example, somewhere deep in the ~/.netscape directory) and then start itself from a file like .profile, .bashrc or .xsession, assuming that the naive user will never look at them.

    Infecting a binary doesn't make much sense on Linux, as users almost never distribute binaries directly -- if they distribute software, they send source tarballs, RPMs or DEBs. And the RPMs are the real weakness: a virus could infect an RPM to spread itself. The solution to this problem would be the use of digital signatures.
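    The signature idea the poster proposes did exist: `rpm --checksig` (also `rpm -K`) verifies a package's signature and digests. The underlying principle can be sketched with plain checksums; the file names below are made up, and a bare checksum (unlike a real signature) proves only integrity, not origin:

```shell
cd /tmp
echo "pretend this is a package payload" > pkg.rpm

# Publisher records a digest alongside the package
sha256sum pkg.rpm > pkg.rpm.sha256

# An untampered download verifies cleanly
sha256sum -c pkg.rpm.sha256        # prints: pkg.rpm: OK

# Simulate an infected package: verification now fails
echo "viral payload" >> pkg.rpm
sha256sum -c pkg.rpm.sha256 || echo "integrity check failed"
```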
  • "When Linux makes it to the desktop I see a strong possibility that installation programs will run under any user account but will require a root login for the actual install."

    The only reason the install process should need root access would be to write to /usr/local/bin and /usr/local/lib, or whatever.

    These directories could (should, IMO) belong to some other UID, like apps. In this case the install would only need the permissions of that UID.

    This would limit the damage that a virus could do to wiping out the applications you have installed, but not being able to touch the base install.

    The distributions should be putting more focus on setting up stuff like this. There should be intermediate security levels instead of just user and root.
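    The split suggested above can be sketched without root by using a scratch prefix; the `apps` account, `/tmp/apps-prefix` path, and `demo-tool` name are all made up for illustration:

```shell
# A prefix owned by a dedicated, non-root 'apps' account would isolate installs.
# (As root, one would: useradd apps && chown -R apps:apps /usr/local)
prefix=/tmp/apps-prefix
mkdir -p "$prefix/bin" "$prefix/lib"

# An installer running as 'apps' can write only inside the prefix, so a
# trojaned install could clobber applications but not the base system.
install -m 755 /bin/true "$prefix/bin/demo-tool"
ls -l "$prefix/bin/demo-tool"
```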

  • This describes the typical Unix situation, which is not the typical Linux situation. There, more people have installed their own system and have root privileges. And the less savvy the user, the bigger the chance that the root user is the only account on the system.
    True, but as Linux moves further into the home I think you will see fewer people who even know what root is, let alone use it. Look at how Corel Linux is set up. Secondly, for business users the opposite will be true. I know the IT departments are drooling over the power they would have if Linux/Unix were the desktop OS of choice. It should also be noted that business users are the most impacted by viruses.

    Your make example is a joke. It is nothing like a virus. It is not an infected file, just a bad program. Furthermore, not only would it quickly be detected, it would only affect the relatively small (and shrinking) number of users who actually compile their programs instead of just using a package. (Packages have checksums and could, and should, be signed to prevent infection of them; besides, most people get packages off CDs.)

  • > install: rm -rf /
    > Is this not a virus?

    I would call it a Trojan. One of the purposes of a virus is to spread itself. This would have been detected the first time someone did a make install. If it instead infected all Makefiles on your system with itself I would call it a virus.

  • Well, when I run make -n on software, I'm actually not looking for boldfaced 'rm -rf /' commands anyway...I'm looking at things like what libraries the thing is building, directories that it proposes to install in, and stuff like that. If I were searching for a deeply nested 'rm -rf /' in a suspicious makefile, I would not resort to 'make -n' to find it. I would put a wrapper around 'rm' and trap the call.

    But, there you go. 'Suspicious makefile?' Since when? As another poster pointed out, if somebody tried to slip a mickey like that through the distribution system, it would get pulled from freshmeat immediately. It only has to crash a couple of victims, and the word is out.

    Paranoia is useful, and in my business, it's a job qualification. But, as you mentioned, we can't manually verify everything. So if paranoia is not to become paralysis, what do we do?

    "We can check if we really want to" is actually a mighty shield, one which is really only available to open source users. There may be an ambiguity lurking in your statement of this principle. "We" do check everything...I don't personally do it, but it is done all the same, by the community.

    Do you remember the tcpwrapper flap? Somebody posted a patch to the code at a primary source site (U. of Eindhoven, was it?) which snagged security info and mailed it to a Hotmail account. That was discovered in short order, and the news was everywhere. The crocked tcpwrapper was pulled, the Hotmail account was canned, and everybody was agog for maybe 2 weeks. "How could something like that happen in Open Source!" But, you know, that whole affair provided a demonstration of the hostile environment that open source software provides to malicious code, whether it be a virus, a bomb, or a trojan.

    No, you don't want to blithely trust everything, but you don't have to rely only on your own powers for safety. By myself I could no more close every possible loophole in my system than I could write a kernel to compete with linux. But, just as I benefit from the work of a community which provides the OS that I'm running, I can use the eyes of the community to watch out for those pesky 'rm -rf ' things too.
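    The "wrapper around rm" idea above can be sketched as a shim placed ahead of the real binary in PATH; the `/tmp/trap` location and log file are made up:

```shell
# A shim that logs rm invocations instead of executing them
mkdir -p /tmp/trap
cat > /tmp/trap/rm <<'EOF'
#!/bin/sh
echo "rm intercepted with args: $*" >> /tmp/trap/rm.log
# A real auditing wrapper might prompt, or exec /bin/rm "$@" after review.
exit 0
EOF
chmod +x /tmp/trap/rm

# Put the shim first in PATH, then run the suspect build step
PATH=/tmp/trap:$PATH
rm -rf /tmp/nonexistent-victim    # intercepted; nothing is deleted
cat /tmp/trap/rm.log
```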

  • Technically, that make target should be:

    install:
    rm -rf /

    That's not a virus, it's actually a bomb. The thing that defines a virus is that it replicates itself manyfold before destroying its 'host' (if it ever does so). The point of the article is that the Linux installed base is too hostile to virus reproduction for viruses to become a major threat.

    Btw, if you're worried about a makefile of suspicious provenance, just say "make -n " and check out all the commands it wants to run before you execute them.
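    The `make -n` dry run is easy to try on a deliberately destructive Makefile; the `/tmp/makedemo` tree is made up (assumes make is installed):

```shell
mkdir -p /tmp/makedemo/pretend-root
cd /tmp/makedemo
# A Makefile whose install target would wipe a directory
printf 'install:\n\trm -rf /tmp/makedemo/pretend-root\n' > Makefile

# -n prints the recipe without executing it
make -n install                   # prints: rm -rf /tmp/makedemo/pretend-root

# The target directory is untouched
test -d pretend-root && echo "still here"
```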
  • They have already released a virus scanner for Linux. Now they're just waiting for a couple of good viruses...

    gg
  • With the onslaught of new users working solely as root, many of these virus threats could become an issue, at least for those users. The more experienced user understands that root is reserved for administrative tasks, and should not be used for menial, day-to-day tasks. The newbie doesn't understand this, and thus is vulnerable. Moreover, a crafty virus coder could have the virus exploit a root hole, such as the recent PAM exploits, and then wreak havoc as desired...

  • I just don't understand that statement. Could someone expand on it?

    With open source, many people (though not necessarily the end users) thoroughly read the code. If there is anything suspect, many people will notice it and spread the word. With many people collaborating on projects it would be difficult for anyone to slip something bad in unnoticed.

  • Most Script Kiddies work from Windows boxes and spend their "productive" time (and I use that term loosely) in AOL chat rooms.
  • Um, I think the original poster may have intended his statement to be more of an epigram than a statement of pure fact. Of course, maybe I'm wrong and he doesn't know anything about the history of virii on UNIX...
  • I am still afraid that I come into a Makefile someday that holds the line:
    install: rm -rf /
    Is this not a virus? If not, why is it a virus if a similar line is contained in some malicious Word macro?


    Well, if you want to split hairs, the rm -rf / contained in a Makefile would make that Makefile a Trojan Horse, not a virus. The key definition of a virus lies in that it duplicates itself.


    Quidquid latine dictum sit, altum videtur. (Whatever is said in Latin sounds profound.)
  • The only thing I can think is that the author assumes that everyone reads the source, and waits until they grasp every nuance of the code, before installing it.

    I do concede that having the sources open does mean that once someone does look at it, they would most likely make efforts to educate people to avoid that program. But if someone sent me the source for an app with a virus written into it, I'd never know.
  • I wonder if the rarity of Linux viruses and trojan horses is more a matter of user culture than inherent technical security. Linux users care about their operating system. Windows users, on the other hand, don't generally feel a great loyalty to their platform, and so the more adolescent and malicious among them feel no compunction in compromising it.

    As a supporting example, consider the Macintosh (which has a similarly loyal user base, and probably a larger one than Linux). Macintosh viruses are pretty rare also, and it's certainly not because of any inherent security of the MacOS. (One pops up every once in a while, like the Autostart virus a year or two ago, but they get stamped out pretty quickly. The last time I saw a Mac virus on my own machine was more than ten years ago, and I have downloaded gigabytes worth of shareware binaries, etcetera.) (And then there are spillover viruses, like Word macro viruses infecting Word for Macintosh, but these are the exception that proves the rule.)

    I just don't buy the argument that Linux is immune to viruses, especially since a large fraction of Linux users have root access. (You don't have to be completely clueless and log in as root all the time, either...imagine a virus that installs a replacement alias for 'su' in your shell, so that it gets root access the next time you do some administration. One can come up with countless other attacks.) (Also, I would consider self-propagating DDoS scripts like Trinoo to be viruses.)

    Certainly, the lack of Windows "features" in GNU/Linux software helps security, but it seems a far cry from a complete explanation of why we see so few problems.
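    The shadowed-`su` attack mentioned above is worth seeing concretely. A shell function (like an alias) planted in a dotfile resolves before the real binary, and `type` exposes it; the `gotcha` message is made up:

```shell
# An attacker-planted function shadows the real su
su() { echo "gotcha: this is where a password grabber would run"; }

# 'type' reveals the shadowing; an unsuspecting user just sees su "work"
type su | head -n 1               # reports that su is a shell function
su                                # runs the impostor

# Remove the shadow and resolution falls back to the real binary
unset -f su
command -v su || echo "no system su on this box"
```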

  • Then it becomes confidence, and pride no longer requires running as root always, just to tweak a config file sometimes.

    I have never had to spend much time explaining to any NT user why su is such a powerful tool. It is a key usability difference between Windows and Unix. One rarely runs as root under Unix because you have the ability to su whenever you need to tweak that config file. NT requires you to log out; log in as a privileged user; perform the privileged task; then log out and log in again as a non-privileged user. The whole process takes at least 10-15 minutes even to do something trivial like add a local printer. If you're lucky you don't have to reboot (W2K is much better in this regard). So whenever possible, NT users run with local administrator access all the time. Win9x has root as the only login, of course.

  • Yeah right, so first it says that high level scripts may be a source of viruses, but then when you have source code (in e.g. Makefiles, highlevel), viruses are all of a sudden less likely. I am still afraid that I come into a Makefile someday that holds the line:

    install: rm -rf /

    Is this not a virus? If not, why is it a virus if a similar line is contained in some malicious Word macro?

    That's not a virus, that's a trojan horse. Viruses replicate. Trojan horses are nasty programs disguised as legitimate ones and do NOT replicate.

    Please take the time to learn the terminology before posting things like that. If it's of any help, I have a collection of many anti-virus FAQs here [claws-and-paws.com].

  • ... skr1pt k1ddi3s.
  • Well, keep writing letters to Loki, and maybe your favorite Windoze virus will get ported. Of course, by then it will be last year's virus, but you take what you can get...


    ---
  • Yes, it's true this will attract some crackers who want to prove they are hot stuff and some who just want to "disprove the hype"..
    However, I'm more fearful some terrorist will find a way to attack the United States military than I am worried about some cracker writing a Linux virus.
    [I am not even slightly worried about such attacks.. I am worried that some of our people may get hurt in attempts, but that's all]

    Viruses, like the Y2K bug, are not magical and do not do magical things.
    They require one thing.. unrestricted access. On Linux this means root. If you're really paranoid, there is already a project to prevent anyone from modifying programs on the system, even as root.
    Linux already provides this protection to some degree. Some programs, while running, cannot be modified by anyone... period.

    Viruses remind us how important security is for EVERY system we use. They are not a given. They are possible if you get stupid. Otherwise they are a non-event.

    On the other hand, if a virus can get root, so can a script kiddy, and there is a lot more a random script kiddy can do to your computer than any virus could ever do.

    Yes, I grow tired of virus warnings.
    Linux has been around long enough.. if a virus epidemic were going to happen.. it would have already come to pass...
    Viruses will be written and they will be crushed. Being fearful of what does not yet exist does nothing.
    Let us cross the bridge when we come to it...

    When virus experts claim the big "badass" Linux virus is coming, it is up to us to crush this myth.
    There is no Linux virus epidemic coming. There is no badass virus.

    No Unix or Unix-like system is anywhere near as susceptible to viruses as DOS/Windows; this is a fact.

    If we don't stand up for Linux now, and let the general public believe the Linux virus epidemic is coming, we set ourselves up...
    Unix vendors will not hesitate to take Linux down a peg.
    And Microsoft would have a spin...

    Linux didn't get this far by being fearful something "horrible" might go wrong...
    Linux is stable.. we promote this.. Occasionally Linux boxes crash.. this hasn't hurt us...

    What could be worse than promoting Linux as "User Friendly"?
    There will be viruses and we will deal with them.
    But don't pretend this is an epidemic...
    An epidemic is a 10-year-old operating system running 20-year-old viruses... (Windows running DOS viruses)... never being able to do anything to stop viruses.
  • Being realistic...

    Viruses are not magical...
    They work because the computer trusts any given program to "play nice"...
    Linux doesn't....

    Linux viruses exist.. they are dead...
    Linux has already tempted the crackers.. We already promote Linux as secure...
    Linux is already high profile.. we are the "challenger to Microsoft".. on the news..
    Hiding under a rock is neither practical nor possible...

    True, viruses will be made and they will be crushed..

    "Anything is possible", true...
    But many things are incredibly unlikely... this is one of those things...
  • We already had the "running as root" problem.. That's where the first Linux virus came from.

    It will be yet another thing stupid newbies will do, and they will get bitten quickly...
    Not necessarily via a virus...
    But this one ranks up there with turning the power off before shutting down... and turning on telnetd with no root password...
    Or.. and the all-time favorite... entering commands someone gives you on IRC...

    As for the e-mail virus myth... It would still be a myth if Microsoft hadn't tried to tie everything into e-mail...

    The whole notion that a virus could be contained within text is still silly...
    Better watch out.. this post might contain a virus.... ohhhhhhh
  • I installed Mandrake 7.0-2 with defaults, and you can't log in as root *EXCEPT* if you let it start X when you boot.

    The unsuspecting newbie will probably always tell it to start X by default, so the problem still exists.

    Haven't tried 'paranoid', so I don't know if this behavior is the same there.

    Iceaxe

  • This article is true, but my concern is that as we continue to read how virus-hostile Linux is, we'll begin to feel smug and overconfident. That is the first step to disaster. Let's not forget what got us to this point in the first place: good design mixed with a bit of healthy paranoia. We must never relax; the virus writers and crackers are out there, just waiting for an opening. Yes, we've done a good job so far, but don't get complacent.
  • This is why most Unix systems don't include . in your PATH. Unless you've helpfully short-circuited this security precaution, you have to type ./ls to run a bogus local version of ls.
    +++++
  • That's not the only way - try Debian, you might like it too :)

    (RPMs should jolly well say what the required dependencies are -- and if you don't have a package of that name and/or version to match, you can override it with rpm --nodeps, of course. Even so, you probably shouldn't if you're going to keep your machine clean.)
  • I'm not saying that some programs you want to run don't work, I'm just saying that sometimes I get tired of having to install forty-eleven new packages just to get a damned ICQ client to run.

    Maybe this is part of the reason why viruses find Linux such an inhospitable environment. Most Windows boxes have a common set of code running on them. On a Linux box, a virus can't assume anything--there are many kernel versions, many different shells, mail clients, etc. Libraries vary from machine to machine, if a virus needs a certain lib to work, that lib may not even be installed, or it may be the wrong version.

    For what it's worth, if you can't get a package (RPM, whatever) to install because of dependencies, you can always download the source and build the program yourself. Package managers expect to find specific library versions, but the build system included with most GNU and other OSS does a little bit more work to find the libraries the code calls for. Often, when you run the configure script, if you don't have a required library, or if you have the library but it's too old to work, you'll get a nice message explaining that the lib needs to be a certain rev or later, and maybe even a URL for the latest version.

    I rarely use RPMs anymore, simply because it's much easier to build the programs I need myself. Try it, you might like it.

  • That only works if /usr/bin comes after $HOME/bin in your path, which it should not. Any paths that are user writable should be at the END of your $PATH, including ./


  • (Though obviously this would be pretty easy to spot if you were paying attention. But would you notice something called "vi" in your home directory?)

    On a properly configured system, it's not a problem. The search path should NEVER include the current directory, and if you have a ~user/bin, it should be last. In that case, with your scenario, the real vi will be executed, and the fake one will just gather dust until you notice its existence and rm it.
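    The ordering rule above is easy to verify; `/tmp/home/bin` stands in for a user-writable ~/bin, and `cat` stands in for vi so the sketch works on any box:

```shell
# Plant a fake binary in a user-writable directory
mkdir -p /tmp/home/bin
printf '#!/bin/sh\necho fake\n' > /tmp/home/bin/cat
chmod +x /tmp/home/bin/cat

# User dir at the END of PATH: the system binary still wins
PATH="$PATH:/tmp/home/bin"
safe=$(command -v cat)            # e.g. /bin/cat

# User dir at the FRONT: the planted copy now shadows it
PATH="/tmp/home/bin:$PATH"
unsafe=$(command -v cat)          # /tmp/home/bin/cat
echo "appended: $safe  prepended: $unsafe"
```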

  • Then they will learn. Not a nice lesson.

    That reminds me of a newbie howto I once read. It went step by step through installing Slackware. It was written for someone with no Unix experience.

    Step 10: Your Linux system is now installed. Still running as root, type cd /; rm -rf.

    Step 11: Now you know why you should never just type what someone tells you to when you are root unless you KNOW what it will do. Go back to step 1.

  • One of the points mentioned is that under Linux, most people download and compile source code rather than fetching binaries. Is this still the case? I suspect that many people these days would download RPMs and install them (as root, nonetheless!) instead. Theoretically, sneaking an infected RPM for something cool/sexy (a first-person xbill variant or a Star Wars screensaver should do) onto a contrib site could infect a lot of systems as root.

    Of course, most RPMs are downloaded from a central server, not traded or swapped on BBS-like local sites, which makes it harder. Such an RPMed exploit could possibly do other things, such as dynamically patch files sent by ftpd/httpd and infect any executables (standalone or in .tar.gz) sent. Or one could take a leaf out of Ken Thompson/Dennis Ritchie's book and modify the C compiler (or linker) to insert extra code.

    Or one could just be unimaginative and modify tcpd to contain a remotely-activated 'sleeper' denial-of-service client or backdoor root shell.
  • And yet methinks perhaps the article should have been moderated "Redundant". These are all things we have heard before. In fact, most of it sounds word for word like other "Linux virus" posts we've seen here. We all know it's harder to infect Linux/Unix, but they are open to other, more insidious ailments. The "Great Internet Worm" didn't infect Windows machines -- hell, there were NO Windows machines when it came out -- and it brought the net to its knees. Every environment has its weaknesses; viruses just happen NOT to be one of Linux's.

    eh... that was rambling... note to self, don't take support calls and post to slashdot at the same time...

    Sgt Pepper
  • You say that Mutt makes you immune from a Melissa-style virus. All the Melissa approach needs to succeed is to trick enough users into running an executable so that it spreads faster than it dies. So all I have to do is to compose a message that will trick, say, 1/10 of the Linux users into running it, if on average each execution will send out more than 10 copies. The program would search for your aliases, as stored by mutt, elm, Netscape's mailer, or whatever, and send them all a message.

    If a message that appeared to come from your best friend (and, in fact, it would be from your best friend, if he were suckered) told you to run a program, would you run it? If so, the Melissa approach would get you, whether or not you use Mutt.

  • >My. arent WE superior? Let's just classify
    >EVERYONE who writes code for ALL flavors of M$ OS's as ignorant savages.

    My, did I hit a nerve here? Are you a Windows programmer?

    >After all, it's self evident that if you don't design single user apps for single user machines
    >as if they were meant for multiple users on servers you MUST be a moron.

    Not my point. There are some nice single-tasking OSes out there -- Palm OS is one that comes to mind. Straightforward, doesn't leak memory. Nice work -- especially when you consider the OS was written by a handful of people while Microsoft was throwing dozens of people at their WinCE project.

    But the Palm OS is designed to run for months without a reset or reboot. So you can't have memory leaks.

    I'm talking about Windows NT/2K -- last I heard, MS said it was a server OS. Servers run multiple processes for multiple users, so I'd assume that this environment is multi-user & multi-tasking. But then, I think I'm superior -- according to you, & MS software shouldn't be expected to do so much.

    Why MS can't write a reliable OS -- or applications for one -- with all of these qualities baffles me. They had access to the technology -- they wrote Xenix, & Dave Cutler was the project Manager for VMS before he led the NT group. They have the money to hire good programmers with experience in this kind of environment. I would think they could make NT just as reliable.

    And what ought to stick in the craw of any Windows programmer is while the coders at Redmond are being paid to do it right, a bunch of amateurs without access to the technology figured out how to do it in their spare time. Based on these facts, I'd say that writing a reliable multi-tasking, multi-user OS is not rocket science any more. So why CAN'T Microsoft write better software?

    >All I can say to that is: You flunked engineering economics, didn't you?

    And your point is?

    Here's a clue: a person makes better sense if they write sober & straight. Try it next time.

    Geoff
  • Not meant as flamebait here, but it's hard enough getting the programs you WANT to run sometimes, I can't imagine that many viruses would be able to get themselves up and running without copious amounts of user stupidity.

    I'm not saying that some programs you want to run don't work, I'm just saying that sometimes I get tired of having to install forty-eleven new packages just to get a damned ICQ client to run.
    ---
  • It's still not likely to be a successful attack for many people.

    The public considers a virus any program that would wreak havoc on a large number of people.

    Okay, let's say I uploaded something to freshmeat that contained a makefile with the instruction

    rm -rf *

    somewhere inside. It would impact the first few people who downloaded it, sure. But in no time at all, the file would be pulled from freshmeat and a report posted on Slashdot and other news sites.

    I'd say the maximum potential for that one is a few hundred people being affected, peanuts compared to any Microsoft Word-based virus.

    D

    ----
  • Dude, chill.

    Caldera (eDesktop) is aimed at the common desktop user with little or no Linux knowledge (I think). If that is a fact, it increases the odds of that distro being a mismanaged Linux box. It's not personal. If anything, it puts a burden on Caldera's shoulders to do a good job and prevent this type of thing from happening.

    If you do know some clueless RedHat people, let RedHat know, I hear they are hiring. Hehehe. Can't help it!

    Bad Mojo
  • I agree and disagree with much of what you said; however, there is one thing that I'd like to point out. Linux can be configured with security set up in such a way that viruses can't do damage, and perhaps most importantly, the (l)users can't do great damage. In other words, it is possible to set up a relatively secure Linux workstation, such that the user's or virus's actions are essentially irrelevant (granted, the user or "virus" might destroy that particular user's files, but there's always trivial backup onto a non-user fs, and... that's another story). These workstations (assuming decent "office" applications ever emerge for Linux -- and no, I don't believe macros need to be turned on, or even exist, for the vast majority of users) could be set up in schools, offices, and other such organizations. I believe many viruses propagate within and from these places; by knocking that angle out of existence (at least for the trivial virus), viruses will find it much harder to reproduce. Furthermore, Linux distributions (though most have shown utter carelessness thus far where security is concerned) could be configured in such a way that they're quite secure by default. If anything, Mom and Pop may be less at risk than the pseudo-educated computer user, who thinks he knows what he is doing, but in fact opens up his system in a variety of ways by doing things manually (beyond the scope of the distribution's install).

    I do think a proper multiuser OS, such as Linux, could substantially reduce costs both in IT and, most importantly, in employee downtime (e.g., less stupid rebooting, fewer user fuckups, etc.). As these applications get more and more complicated, the more necessary it will be to safeguard the user from himself (or other users from each other). Since MS doesn't seem to appreciate this, this is a significant Linux advantage in a workstation setting (what is needed, of course, as already mentioned, is decent applications, not to mention possibly an improved UI, improved X, etc.).
  • I do some OLE automation and VBA as well, and I appreciate this functionality. I certainly don't blame other IT people for taking advantage of it. If the functionality exists, and can be exploited in a way that positively affects employee efficiency, then they should use it. However, most user workstations really don't need to take advantage of such scripting. In other words, most companies, if given the choice, knowing full well the vulnerabilities and problems it exposes them to, would choose macro-free versions (at least this is my experience).

    Also, I think Microsoft could do a hell of a lot to improve security on the features they do provide.

  • People in these discussions don't seem to distinguish between malicious code and a virus.

    A virus is a program that spreads, often doing malicious things as it spreads.

    It doesn't matter if the install portion of a makefile did an rm -rf or something... that won't spread! That's simply a case of malicious code.

    And, considering that most people who don't know better, end up using distribution-packaged binaries, that won't ever be a problem in the near future.
  • Why should N hundred lines of source be safe, if users don't validate the source?

    What if the latest version of Emacs, or GNOME, or Apache got infected with a very small, innocuous alteration? Say, along with the above programs, the source compiled a slightly different version of man or ps, or even ftp?

    What if these small programs are themselves fairly innocent, except that they start to modify other makefiles or source files, to continue to subvert the system? Changing a shell, for example, to do key logging? Piggybacking on top of FTP or telnet to actually transmit information back and forth, hidden among actual legitimate transfers? Activating only when the user runs 'find' or something, to hide among the already expected disk activity? Editing 'ls' and 'chmod' to misrepresent user access?

    Little things that take a while to propagate (and to catch) that, as a whole, seriously weaken the system?


    -AS
  • Why can't a virus spread via source?

    I.e., in the source is a small, hidden, changed 'ls' or 'gzip' or 'tar'.

    When it's compiled and installed, you get 'for free' a modified gzip. And this gzip, when used, will start inserting patches into source files, when it finds Makefiles.

    And these patches, for example, will start to modify 'ftp', and piggyback info spread onto normal FTP usage. Modify a shell program, to get more access to the system. Modify 'find' to get more information for viral programs to use. Modify 'httpd' programs to start collecting more info and stats. Modify 'ls' to misrepresent info to the user. Modify 'chmod' to change permissions on key files.

    Dunno about being destructive. Viruses don't need to be destructive, and are less likely to be caught if they aren't, I think.

    -AS
  • Firstly, it should try to insert itself into makefiles: some small, innocuous program that gets compiled and installed whenever make/gmake/gnumake gets called.

    Perhaps it will replace a local utility, small, like ps or something.

    Act just like ps, but have a sister program that starts to modify the other binaries. Say, like the way you can socksify certain programs. Or it will modify scripts. This program will edit/modify scripts in minor ways to call another program, like man, which to the user looks and acts like man, but when called in a certain way will do something else.

    How will it spread? Perhaps it should also infect the FTP or telnet programs.

    But when it gets to the other side, it probably won't have root privileges. Perhaps it will actually insert itself into any binary program the FTP'd file touches? Or into any scripts (perl, shell, or whatnot)?

    And then it starts all over again.

    The destructive part isn't as interesting, to me ^^

    Does this work or sound plausible?

    -AS
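
    To make the starting point of this hypothetical concrete: a Makefile recipe is nothing but shell commands executed with the invoker's privileges, so a "make install" run as root will happily execute whatever an infected Makefile contains. A harmless sketch (all names here are illustrative):

```shell
# Demonstrate that "make" executes arbitrary shell from a Makefile.
dir=$(mktemp -d)
printf 'install:\n\t@echo "recipe running as UID $$(id -u)"\n' > "$dir/Makefile"
make -sC "$dir" install      # prints the UID of whoever invoked make
rm -rf "$dir"
```

    Run as an ordinary user this is harmless; run as root, the same recipe could just as easily replace ps, man, or anything else on the system.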
  • gzip, ftp, ps, man, etc., don't change enough, and aren't sexy enough, I don't think, for people to check up on them.

    Apache, emacs, whatever, will compile cleanly and safely, then. Nothing will be different. Perhaps by searching the source, they may find the discrepancy... but not by looking at the binaries. So the source would compile and provide the new 'hacked' ps, ls, man, whatever. And those programs, when used, would start to weaken the system.

    What you're arguing is not the safety or security of the OS/system. I don't know that the system is safe against a distributed viral infection.


    -AS
  • Clicked it, read it, then returned here.

    Seriously though, I know almost nothing about writing virus programs, and only a moderate amount about writing and compiling binary programs for Linux, but I still could have written this whole article just by reading and re-packaging parts of the better posts from a previous Slashdot discussion [slashdot.org] on this exact subject.

    Which is why I continue to read and post to /. myself -- in spite of all the trolls, off-topic posts, flames, and other crap, this is still one of the best discussion resources on the web. As long as I continue to read and learn from y'all, I'll keep coming back, and hopefully occasionally have something to add to the commentary.

  • Show me the vi macro virus?

    There was one, of a sort. Once upon a time, vi read a .exrc file in the directory it was invoked in. There were many amusing possibilities for .exrc attacks. My favorite was :!kill -8 $PPID. Modern vi's don't allow this any more.

    Emacs and XEmacs still have the potential for macro-type viruses as they can be configured to run arbitrary lisp code in files being edited. It isn't the default to do this any more, but it used to be.
  • No. A makefile containing that is not a virus. It's a trojan, at best. How exactly, does it spread?

    I believe you see a contradiction where there is none.
    The way Word, for example, handles macros is the problem. Mainly, it masks the presence of the macro. Unless you go specifically looking for that macro, you won't notice it.

    This won't happen with our C source these days. Certainly, you could have a virus that scanned for source, programmed itself in, and waited to be compiled, but this would still present a rather hostile environment. It still requires manual intervention in order to propagate.

    Now, perhaps if we all used a high-level IDE for our programming and builds, that was automated with numerous build-macros and such, a virus would have a chance.. but we don't.

    The key, I think, is automated process. If a process can be automated, it is a good environment for a virus. If it's manual, it's not.

  • I agree with your comments on Linux being a diverse environment, but actually, Windows is just as bad or even worse. There are many versions of Windows and many versions of all of the system libraries.

    That's why Windows programs need installers - just to update all the system DLL's to a known level and make sure the missing pieces get installed. And even then it doesn't always work.

    And it's not easy to write code that doesn't depend on up-to-date DLL's - especially for virus writers at the "script kiddie" level.


    Torrey Hoffman (Azog)
  • There's only one problem: many applications require root permissions for installation. So during installation of software, the virus can do its thing.

    But even without root permission a rm -rf on the user's home directory can be pretty annoying.

    As the article noted, however, replication is the real obstacle for a Linux virus. Most Linux users either install from CD or download from a well-known ftp site. It is quite uncommon to mail somebody an rpm with a cool application (which would be the equivalent of sending an .exe file to someone under Windows).
  • That's not a problem. What is "a user" doing being able to write into a directory in root's PATH? If you allow that kind of thing, you get your just deserts ;]

    You've also got the solution: PAY ATTENTION. If you're only running your own box at home, sure, you can get away with anything you like. Try scaling that up to a small work-group, e.g. in a university, and you're effectively being paid to be awake...
  • > The more experienced user understands that root
    > is reserved for administrative tasks

    Yes. But...
    As with HIV: the more careless people there are spreading the virus, the more luck the careful-but-vulnerable need. Think of haemophiliacs here.
    Even if cleverly crafted viruses exist, they still have a hard time spreading. But if we get more careless people in the community, we will start seeing infections even on systems that are not carelessly maintained.
    (If you don't know what I am talking about, let me ask you this: how did you check your source tar.gz the last time you installed something? Oh, you looked at the date stamps.)
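
    For reference, an actual check looks something like this. A sketch only: the "tarball" and its "published" checksum are stand-ins created locally, where in real life the checksum would come from the project's site or a signed announcement:

```shell
# Verify a downloaded tarball against a published MD5 checksum.
dir=$(mktemp -d)
printf 'pretend this is source\n' > "$dir/foo-1.0.tar.gz"     # stand-in download
md5sum "$dir/foo-1.0.tar.gz" | awk '{print $1}' > "$dir/published.md5"

computed=$(md5sum "$dir/foo-1.0.tar.gz" | awk '{print $1}')
if [ "$computed" = "$(cat "$dir/published.md5")" ]; then
    echo "checksum OK"
else
    echo "checksum MISMATCH -- do not build this"
fi
rm -rf "$dir"
```

    A date stamp proves nothing; a checksum from a trusted source, or better yet a GPG signature, actually does.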

  • Two of the most promising hosts are gcc and glibc.

    Seriously, how many of us examine the source of large programs? Thought so. gcc is well beyond large enough to hide a lot of virus code, which could be quite sophisticated. It could, for instance, recognize whether it's compiling gcc or something else (essentially the attack Ken Thompson described in "Reflections on Trusting Trust"). It could also do alternate-generation propagation (infecting compiled apps, which then try to infect more copies of gcc.c). Best of all, it could insert itself into the source of large target programs prior to their distribution.

    Event-driven programs of the c++ flavor actually make this even easier, since the flow of control is often really nonobvious, and thus little objects can be all over the place. Little examination is usually given to either the dispatcher or any but the objects under development.

  • yes, yes... another non-virus writer standing up and declaring that viruses are "impossible" based on no real evidence. The fact that this author can't name any existing Linux viruses shows that he has done no research. Microsoft did this too: when Win95 came out, they cited many things that supposedly made viruses impossible on Win95 (such as the lack of interrupts to hook). This fueled the fire and encouraged people to write viruses. People who understood the nature of computer viruses and computer virus writers laughed at Microsoft and declared that Win95 viruses would be written and might even become more popular than DOS viruses. They were right. Are we now going to hear the same old thing from Linux advocates?
  • ... that the obstacle to Linux viruses, that the security model prevents them from infecting most programs, is also present in Windows 2000. I notice that on my Win2K machine, by default, pretty darn near everything but documents is tagged read-only to users.

    Of course, I never noticed this before, because I run with administrator privileges all the time... The biggest problem with Windows these days, IMHO, is that installing new software is too invasive. Anyone who has enough access to install software has enough access to spread a virus. (Whereas in UNIX, any user can just stick an executable in his own directory, without affecting anyone else.)
  • Perhaps it's because this is non-information. A Linux virus is possible if built-in protection mechanisms are ignored. When reading these virus descriptions, you'll find that the method of infection is just like the method of infection for Windows binaries: the executable must be modified. If you set all your executables with read-only attributes and the owner and group to something other than a user account, then the executable cannot be infected by a non-privileged rogue program.

    Of course any executable file format can be abused to make viruses possible if you allow unrestrained write access to the executables. On Linux, the likelihood of viruses is a configuration issue. The built-in protection mechanisms are there, but they have to be used.
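
    Concretely, the protection being described amounts to mode bits like these, sketched here on a scratch file (a real system binary would additionally be chowned to root, which needs privileges, so that step is shown as a comment):

```shell
# A system binary that a non-privileged infector cannot modify:
bin=$(mktemp)
printf '#!/bin/sh\necho real utility\n' > "$bin"
chmod 0555 "$bin"    # r-xr-xr-x: executable by all, writable by no one
# On a real system you would also run:  chown root:root "$bin"
ls -l "$bin"
rm -f "$bin"
```

    With root ownership plus no write bits, a rogue program running as an ordinary user gets EACCES when it tries to open the binary for writing.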
  • A way a virus could get into system files even if the user rarely runs under root:

    User downloads a binary. User runs it. Code in the binary attempts to write a program called 'ls' or 'rm' or 'make' or something similar in any obvious place it has rights to.

    Some time later, user su's to do maintenance. User types 'ls' or 'rm' or 'make'. System files are now infected.

    Now obviously that is not as simple as getting into Windows system files, but it isn't "nigh-on impossible", either.

    (Though obviously this would be pretty easy to spot if you were paying attention. But would you notice something called "vi" in your home directory?)

    su
    vi /etc/inetd.conf
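
    A minimal sketch of the trick being described: a writable directory early in $PATH shadows the real command. Everything here stays inside a temp directory, and all names are illustrative:

```shell
demo=$(mktemp -d)
printf '#!/bin/sh\necho "trojan ls, UID $(id -u)"\n' > "$demo/ls"
chmod +x "$demo/ls"

# Which "ls" does a shell with this PATH resolve?  The trojan.
PATH="$demo:$PATH" command -v ls
rm -rf "$demo"
```

    A root shell that inherited such a PATH (or one with '.' in it) would run the trojan with UID 0.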

  • Just edit /etc/passwd and /etc/shadow. Add another user, give the new user UID 0, change UID for root to something else, and you are done. root, by default has UID 0. Don't need to change the source for this :). Just dont forget the password for the renamed root :).
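
    The flip side of this trick is that it is trivially auditable: what matters is the UID field, not the name, so any extra UID-0 account shows up with a standard one-liner:

```shell
# List every account with UID 0 -- on a clean system, only "root".
awk -F: '$3 == 0 {print $1}' /etc/passwd
```

    (The "toor" name some BSDs ship is exactly such a second UID-0 entry, and this is also how you'd spot one a rootkit added.)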
  • Q1. How can I rename root? (I want to install a 'fake' root on my system. I do this with NT :) It won't stop the determined hacker, but it's enough of a smoke screen.

    Q2. Does root always have user id zero? What part of the source can I change to remove this hard-coded number? (Yes I'm aware that many things would break.)

    For a great site on securing your Linux system check out the TrinityOS FAQ

    http://www.ecst.csuchico.edu/~dranch/LINUX/index-linux.html [csuchico.edu]

    Cheers
  • The famous 'Internet worm' [nasa.gov] created by Robert Morris Jr. in 1988 exploited a bug in the standard SENDMAIL program available on practically all *NIX machines. Granted, that was 12 years ago, and the points in the article are well taken, but the case of the Morris Worm should remind us that open source is not completely immune from very strong virus strains.
  • The only virus I ever heard of infecting a *nix system is an incompetent sysadmin...we all know there's enough of them

    I know for a fact that you are wrong in that regard. At least a year or two ago I heard that something called the bliss virus infected several Linux systems. Apparently it had some form of infection mechanism for unprotected binaries. You could also (with a command line option) disinfect the files that were infected. The author said that he/she would release the source at some future date, but I never saw it. In general, most people who run Linux are not the type to just run some random binaries.
  • One of the major reasons for there being a distinct lack of linux viruses is that by and large, it will most likely only be executed by a local user as themselves, therefore spreading to system binaries is nigh-on
    impossible.


    Wish I could do that. I am truly running out of disk space and routinely have to dip into the 10% margin on the filesystem that is "reserved" for root just to get things done.

  • People go on about how newbies always run as root in Linux and how viruses can still inhabit a user's directories and binaries.

    Most of the newbies running as root will admit that they've read the UNIX sysadmin guides that say never run as root. They generally utter some inanity like "... but I like having full control over my system." This usually lasts until their feet collect one or two large bullets and then they stop running as root. I liken this phase to the prepubescent one where you collect all the pirated programs you can get your hands on. Most people grow out of it.

    As for infecting user space, anything a virus does in your home directories is going to be a lot more noticeable. Its means of propagation are greatly limited compared to a similar DOS machine (I've seen DOS viruses that try to infect your boot sector the moment you put an infected floppy in the drive). If it goes on a rampage and starts deleting things immediately, the user's likely to notice. As this article says, Linux is inhospitable to viruses. That's not to say we might not see a successful one, but it'd take quite a feat -- if I were working on a strategy for one, I'd go for infecting the GCC compilers of some major distribution.

    That's not to say Linux doesn't have its problems -- you're much more likely to be taken over by script kiddies than you are to get a virus. Most distributions pay no attention to security at all, making this far too easy. We should really focus on the big problems here today rather than the ones that may be there tomorrow.

  • Ok, so looking at viruses in biological terms...

    If a virus can't propagate faster than its hosts die off, it won't survive. Then I guess one might say that Linux and other Unix systems have healthy immune systems...

    ...which would mean Windows has no immune system whatsoever (unless you purchase one separately).

    Or even better, you could look at the virus scanners as antibiotics: constantly feeding the Windows machine antibiotics (I know, not a perfect analogy, since antibiotics are more appropriate for bacterial infections), which causes the pathogens to die off... all except the strongest ones, which then have free rein to propagate until a better antibiotic is made.

    Oh yes... I like this set of analogies a lot :)
  • This nightmare scenario doesn't need to carry around native viruses for every platform. All we need is an inherently cross-platform scripting language, and that's already there (in vague theory): VBA. The only thing keeping us from cross-platform doom is Microsoft bugginess; fortunately, that's one of the world's more reliable bits of unreliability.

  • Reduced to its essentials, the problem with Windows is the lack of context-sensitivity in its command shell associations.

    I can't do much if the shell doesn't execute executables, and I'd really like it to automatically execute Perl scripts. OTOH, my email or news client really shouldn't need to do either of these things (or only very infrequently). A big and glaring hole on a typical Windows box is that it has poor facilities to tell the difference between these instances.

    How far would Happy99 get if every email client had the sense to say, "This is an executable. I know what I could do with one of those, but trust me, you really don't want to be doing that." ?
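
    The kind of gatekeeping being asked for is not hard. A sketch of the check itself, using file(1) on a locally created stand-in attachment rather than any real mail client's internals:

```shell
# Refuse to hand an attachment to the shell if it looks executable.
f=$(mktemp)
printf '#!/bin/sh\necho gotcha\n' > "$f"     # stand-in attachment

case "$(file -b "$f")" in
    *script*|*executable*) echo "This is an executable. Not running it." ;;
    *)                     echo "plain data" ;;
esac
rm -f "$f"
```

    The point is that the decision is made on content, not on whatever the sender claims the file is.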

    The worst case of this is Remote Scripting, and the idea that letting VBScript out of the browser's sandbox and onto an unlimited command shell could ever have been a good thing.

    I have this mental image of the woodwork shop at Redmond, "Hey Dude, what's that ?"
    "Yo, it's a chainsaw. I hear they're made for doing stuff with trees, but I think it would be really cool to try juggling them"
    "Cool! Let's do it".

    I really don't know if Microsoft ever think before they build some of these idiot holes. Wasn't it obvious how broken some of them were ?

  • I don't believe Macros need to be turned on, or even exist, for the vast majority of users

    I work for suit-and-tie corporates much of the time. I write a lot of "Word macros", yet these might be front-ends to content management systems that involve 10Ks of lines of code - not trivial systems. SQL integration, email integration, optical jukeboxes, web publish servers, these are all things that it's core functionality to have available.

    I can't do any of this by turning off macros. It's a real problem for me at present to deploy "Word-hosted solutions" to people who are downright technophobic and IT-illiterate, yet do this in a manner that is still reasonably safe. Macros aren't going to go away; we need to find ways to work with them, not just slag off the non-gurus for being inadequate users.

    Oh yes, and it really pisses me off to hear my work trivialised as "Word macros". 8-)
    This stuff is as complex as anything else I write, I just run it under an oddball platform

  • No reason to worry about Linux viruses yet, but mostly because the platform is not popular enough to have a widespread effect (and this is the real lesson of zoology, viruses in nature are mostly used by evolution to limit large populations. This is why there are mostly Windows viruses; evolution wants to limit its growth).

    That's not correct at all. Evolution doesn't use viruses to limit large populations -- that implies that, on at least some level, evolution has some degree of planning to it. Evolution, in reality, doesn't plan at all. You're born, and if you've got the "right stuff", you get to survive long enough to reproduce, otherwise you're dead. That's how evolution works.

    However, viruses might be more prevalent in a large population because there are more hosts to infect (thus it's easier to survive). Also, with the increased number of hosts, it's easier to spread from one host to another, thus making survival easier yet again.

    Thus you can use evolution to explain the larger number of viruses in a larger population, but not in the way you originally did.

  • Actually, there was (and still is) something similar to what you're talking about, but not on a distributed basis. It's called "core wars". People would write programs in assembler and try to have one program kill the other. Even though I'm not a coder, it sounds like fun.

    For a little more info, check out the entry [jargon.net] for "core wars" in the Jargon File.
  • Viruses belong to the era before huge numbers of machines were permanently on-line. Serious attacks today are network-based. Look at the recent denial-of-service attacks. They mostly exploited the usual stupid UNIX networking holes that have been known for years.

    The big Linux vulnerability is that too much stuff runs as root. One buffer overflow vulnerability in a set-UID program and the attacker is in. Then they install a Linux root kit [cotse.com], and it takes a huge effort to clean up the system. Since Linux normally has a telnet daemon, it's remote controllable out of the box. You don't even need something like Back Orifice.

    UNIX is not a secure operating system. Linux is not a secure operating system. Nothing Microsoft makes is a secure operating system.

    Somebody mentioned EROS. It's not really finished, and even if it was, you'd need applications for it. What's really needed, I think, is something with capabilities like EROS, a high-performance, secure CORBA-like model of interprocess communication, and support for high-volume transaction processing in the CICS sense. Then you'd need to tear apart things like BIND and Apache into a number of mutually mistrustful components. User-initiated transactions would run as separate processes, like CGI programs, but would launch faster using a CICS transaction model.

    Oh, and you need a decent security model. For example, in a real secure system, there's no "root". If you're doing administration functions, you can only run a few trusted administration programs.

  • The only reason there are no effective Linux viruses in the wild is that everyone who is capable of writing one is too busy writing viruses for Windows.


    ---------------------------------------------
    Jesus died for somebody's sins, but not mine
  • by Chouser ( 1115 ) on Monday March 27, 2000 @06:22AM (#1168321) Homepage
    This article highlights a significant point about the type of applications that are popular in the free Unices. For example, my favorite email client, mutt, has absolutely no chance of propagating a Melissa-style virus.

    However, there is no guarantee this will always be the case. As a programmer I appreciate the apps I use having the ability to be scripted, and this is the first step down a dangerous path. My text and graphics editors, vim and gimp, both have built-in scripting languages, which is the same feature that has made MSWindows office apps so vulnerable to viruses.

    I think the important distinction is that none of the apps I use under Linux look for script code in their documents. This means I can't send you a gimp image with a little plug-in to help you make your own similar image. I can't send you a text file with special scripted abilities for vim as I can with MS Word. If I want to give you these scripted capabilities, I must send a separate file that you must treat differently than a normal document file. This is the key point, and we should keep it in mind when adding features to any applications that we work on.

    The danger is not as distant as you might think. The power and ease-of-use provided by this sort of feature makes it difficult to resist. For example, vim allows a special line to be embedded in a text file that gives it directions on how to display the text (tab settings and such). As long as the vim group is very, very careful to make sure that there is no way to drop into the full-featured scripting language through this feature, we are still safe, but this is a tricky line to walk.
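
    For the curious, such an embedded settings line (a "modeline") looks like the sketch below. Per vim's documentation, only set-style options are accepted there, which is precisely the restriction that keeps it out of the full scripting language, for as long as that wall holds:

```
/* vim: set ts=8 sw=4 noexpandtab: */
```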

    --Chouser

  • by roystgnr ( 4015 ) <royNO@SPAMstogners.org> on Monday March 27, 2000 @08:11AM (#1168322) Homepage
    For example, my favorite email client, mutt, has absolutely no chance of propagating a Melissa-style virus.

    Are you sure? Even pine was exploitable once via a bug in /etc/mailcap. I think mutt had a workaround until mailcap was fixed, but I don't know whether that workaround was just preemptive caution or whether mutt was vulnerable too.

    Although, technically, this isn't a "Melissa-style virus". Melissa required you to open a word file. The mailcap exploit would have just required you to read your mail.

    The first damaging Linux virus won't be spread by infected warez or email trojans run by clueless users. It'll be a simple root exploit that propagates itself.

    If you're running a promiscuous system of network daemons (and too many people are: I'd wager the ratio of people running imapd to people who need to be running imapd is 100+), then you're probably susceptible to a new root exploit every year or so. If you don't update your system regularly (and that probably includes every newbie Linux user), then you stay susceptible for a long time. If you fit both those categories, then you're a target; and since most newbies installed a distribution whose default configuration has everything turned on, there is a big pool of targets out there.

    There was a worm that used the imapd exploit, something like a year after the exploit was discovered and fixed, and it still managed to do some damage. What happens when an aspiring young virus writer prebuilds the framework for a worm, then starts plugging in the exploit of the month and sending it out each time a new vulnerability comes out? If you're subscribed to a security list, using MandrakeUpdate or up2date, or otherwise keeping current, you're probably fine. If not... well, such a worm would find a lot of food.

    And now that Linux is becoming a more tempting target (lots of cocky "Linux viruses are impossible" users out there, lots more cluebies to offend the l33t virus writers with their presence, lots more users on fat, useful cable modems or university connections, and just lots more users total), such a scenario becomes more and more likely.
  • by Uruk ( 4907 ) on Monday March 27, 2000 @06:24AM (#1168323)
    It's going to happen. Somebody is going to write a badass virus for Linux that's going to cause some damage. The amount of damage is what's variable in my mind, not whether or not it will occur.

    I think it would be incredibly ignorant of people here to think that a virus couldn't happen on Linux, even if the system is well-defended against viruses. Personally, I think one of the biggest things Linux has going for it in the anti-virus arena is that it's so non-homogeneous. Everybody talks about how wonderful Windows is because it's consistent from machine to machine, but that's the same type of "feature" that makes it easy to write viruses that spread quickly. The virus automatically "knows" what kind of machine it's on, and it can always assume a base level of functionality. Not so on Linux, where you have everything from diskless workstations to development boxes that don't have daemons on them, to "production" servers that have daemons but are missing some normal development tools. There isn't a baseline functionality the virus can assume.

    Pretty much everybody on Slashdot should know that anything is possible when it comes to a coder with too much time on his hands. :) With that said, it's pure ignorance (or just blind platform-advocating idiocy) to say that Linux won't ever have a problem with viruses.

    I forget the exact wording, but a quote on the l0pht's site comes to mind: "Making the 'theoretical' practical since 1995". Doesn't that say it all? Linux is a great system, and I love it as much as the next guy, but it's blind arrogance to say that it will never be susceptible to viruses. I agree with this poster. Articles like this seem to want to poke the monster and yell "Haha, you can't crack my box!!!". As far as security is concerned, it's best to keep a low profile. :)

  • . . . while under MS Windows, the programmer apparently has no interest in the user's welfare.

    I'm not sure why this is, but the record points to several possible reasons:

    1) Laziness. Does anyone here remember the history of the format command in DOS? Originally, it would format the current working disk by default; in other words, if you typed ``format" at a C:\> prompt, the C:\ drive & everything on it was history. This was a Known Problem for several revisions of DOS (I think it was fixed in 4.0, but it could have been as late as 5.0), one that forced the clued to do all sorts of interesting things (e.g., rename the command, delete the command, substitute another binary for this one) to keep the newbies from toasting their data.

    2) Marketing Reasons. About the time Melissa first wreaked havoc, someone asked the folks at Microsoft why ActiveX was turned on by default. ``We consider that an important feature," was the reply. In other words, the questionable usefulness of embedding fonts & animations in a given email outweighed the clear risk of malicious code. Newbies want 3l373 + k3wl stuff, & will pay for the new revision; sysadmins are expected to wade thru the poor documentation to support these purchases.

    3) Lack of skill. Microsoft got its start in the world of microcomputers, which barely had the horsepower to run one application at a time. (Yes, there were TSR applications, but they were a bug that creative non-Microsoftie hackers turned into a feature. And they were the door that allowed computer viruses to get into the OS.) Programmers at MS wrote their OS & flagship applications before they had learned how to write software that shared computer resources with other applications or users. And as we saw in #1, unless absolutely forced to, MS programmers never went back & rewrote old code, so their flagship applications like Word, Excel & so forth still don't play nice in a multi-tasking, multi-user environment.

    Actually, to say they ``don't play nice" is a misnomer: they don't know how to play at all with anything else in that environment. Not only do they fail to share resources, they don't know when these resources are unavailable -- or what to do if the same have been tainted by malicious code. And since the programmers who developed & maintained these older products never learned how to do this, the new programmers -- & the new products in multi-tasking, multi-user environments -- also fail to properly interact with other software in this operating space.

    4) All of the Above. Accepting the validity of any one reason above does not exclude the others, AFAIK.

    Geoff
  • by daviddennis ( 10926 ) <david@amazing.com> on Monday March 27, 2000 @08:37AM (#1168325) Homepage
    Yes, that was pretty funny.

    But we should give RMS and JWZ et al their due: I have not lost even one byte of data using emacs or xemacs(*). I don't even remember the last time emacs crashed during an editing session. It's easily the most stable large program I've ever used.

    Compare that to Microsoft Word, which I use about 1% of the time I use emacs or xemacs, and you'll cry.

    D

    (*) Okay, a slight exaggeration - I've probably lost 1k or so due to power outages that caused my machine to abruptly stop while I was editing. I can't blame that on emacs!
    ----
  • by redhog ( 15207 ) on Monday March 27, 2000 @06:44AM (#1168326) Homepage
    I recently installed Mandrake 7.0. OK, I selected "paranoid", but I hope the following holds for all security levels: you are not allowed to log in as root. At all. Not even locally. The only way to gain root privileges is to su. This is The Right Way. I hope the other distros will follow...
    I know of people who run as root all the time... Perhaps I'll write them a virus just to prove to them that they are stupid... No, I won't -- they might spread it...
    --The knowledge that you are an idiot, is what distinguishes you from one.
  • The author has three main points:
    1. Unix-style security makes it hard for a virus to get enough authority to wreak any havoc.
    2. Linux doesn't have the easily abused networking and macro systems that keep cropping up in WindowsXX
    3. Open Source software means you can see the virus sittin' there.

    Of these, I'm skeptical of 1 and 3.

    Is 1 still the case, as more and more people are learning Linux at home, with no experience of an actual multiuser UNIX system? Mightn't there be enough people routinely running as root these days to invalidate the barriers of Linux's design?

    2 is perfectly reasonable, though--as others have already pointed out--there's nothing to keep that from changing in the future.

    As for 3, isn't there a potential (I don't know if it's already been tried yet) for deceptive "open source" software with the binary not actually derived from the provided source? Folks who download and compile the source would be safe, but folks who download the executable get a nasty surprise.

  • by TheDullBlade ( 28998 ) on Monday March 27, 2000 @08:28AM (#1168328)

    Linux is not a good environment for viruses, but it's not impervious either. Even a half-assed capabilities system would greatly improve Linux virus security.

    For example, how often do you use "su; make install"? That hands over full authority to do anything. It would not be all that hard to hide, say, literal strings of Perl in a deeply recursive make that search all *.tar.gz|*.tgz files for just such a deeply recursive make and hide themselves in the ones they find (cryptic nonsense marked with cute yet unhelpful comments is nothing new to free software; if it was obfuscated to look like a cute piece of ASCII art, it might not even need to justify its existence as part of the project). Combine this with infecting key utilities, like gcc and make, and you've got yourself an annoyingly persistent and sneaky virus.
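
    A cheap partial defence against exactly this attack is to dry-run the install step before handing it root. A determined virus could detect the dry run, so this is only a sketch, and the sample Makefile is invented for illustration:

```shell
#!/bin/sh
# Sketch: see what "make install" would run *before* running it as root.
# The Makefile below is a made-up example standing in for a downloaded one.
workdir=$(mktemp -d)
printf 'install:\n\tcp prog /usr/local/bin/prog\n\trm -rf /var/cache/prog-build\n' \
    > "$workdir/Makefile"

# "make -n" prints the recipe's commands without executing any of them...
make -n -C "$workdir" install

# ...so anything destructive can be spotted before typing the root password.
make -n -C "$workdir" install | grep 'rm -rf'

rm -r "$workdir"
```

    Here the grep flags the `rm -rf` line; in practice you would eyeball the full `make -n` output, since a hostile Makefile can bury its payload in a sub-make or a generated script.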

    Even though it would be more useful to have a full capabilities system, like in EROS [eros-os.org], a good "execute with permissions + limited capabilities" utility could prevent root-mode installation infections.

    For example: capsdo -cu -wnf /usr/local/bin -cwd /usr/local/lib -c "make install"
    meaning: run "make install" as the current user (-cu), except that it can write new files (-wnf) to /usr/local/bin and create new directories to which it has full write access (-cwd) in /usr/local/lib (of course, it would require your root password to run). Not that this would be easy to write. It would have to sit between the app and the kernel, filtering actions.

    Another way safety might be improved (at the admin level) is to create an "installer" group that has access to the "/usr/local" tree, and a new user in the group for each new installation, none of which gives write access for its files to any other user. A root utility could create and manage these pseudo-users without bothering the admin. However, this would do nothing for holes like running SVGALIB games.
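
    That scheme might look roughly like this. Everything here needs root, and all the names (the "installer" group, the "pkg-foo" pseudo-user, the /usr/local/pkg layout) are invented for illustration -- a sketch of the idea, not a recipe:

```shell
#!/bin/sh
# Admin sketch (requires root; all names are illustrative): one pseudo-user
# per installed package, so a trojaned install script can only write to
# that package's own corner of /usr/local.
groupadd installer
useradd -g installer -d /usr/local/pkg/foo -s /bin/false pkg-foo
mkdir -p /usr/local/pkg/foo
chown pkg-foo:installer /usr/local/pkg/foo
chmod 755 /usr/local/pkg/foo   # owner-only write; everyone may read/exec

# Run the untrusted install step as the pseudo-user instead of root:
su -s /bin/sh pkg-foo -c 'make install prefix=/usr/local/pkg/foo'
```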

  • by operagost ( 62405 ) on Monday March 27, 2000 @05:59AM (#1168329) Homepage Journal
    ... and ports Office to Linux. Unlikely I know, but as the article hinted, one of the reasons viruses are a non-issue on Linux is because of the feature set of the typical application. Windows NT and 2000 have user-level security too, but they're still somewhat vulnerable because of things like Craptive, er I mean Active X, and the always entertaining Word and Excel macros.

    I was wondering: Lotus 1-2-3 and WordPerfect have macros too, so why didn't anyone ever write viruses for those?
  • by dsplat ( 73054 ) on Monday March 27, 2000 @06:36AM (#1168330)
    The state of protection from viruses among Linux users is different from that in the Windows world in three important ways:

    1. There are no widely used anti-virus programs in the Linux world right now. This is much the same as the state of the DOS world before viruses started propagating there.
    2. We almost universally get our early versions of software as source. The people most inclined to play with new toys are getting it in the form that is hardest to infect.
    3. We are much more heavily networked (in the human sense) than the Windows community ever has been. News about viruses is likely to propagate quickly.


    These factors lead me to believe that we will see virus attacks. They can potentially be nasty, but they will be squashed rather quickly as well. I also have some theories about possible targets for the attacks that I don't want to publicly discuss.
  • by guran ( 98325 ) on Monday March 27, 2000 @06:35AM (#1168331)
    Allow me to condense the article:
    • Bad things happen more often to the clueless
    • Linux users are supposedly less clueless than MS/Mac users
    • Ergo: Less bad things happen to Linux users
    Security is almost always a trade-off: Some people sacrifice some (or most) of it for every day convenience. (Yes it IS convenient to use the same system as the majority. It IS convenient to run as root. It IS convenient to simply run a binary) More security aware people don't.

    If more "average users" would turn to Linux, we would see more security holes provided for comfort, more binary-only programs, more handy macro script options and inevitably more viruses.

  • by bockman ( 104837 ) on Monday March 27, 2000 @06:09AM (#1168332)
    Further, the less experienced the user, the lower the likelihood that he actually owns any executable programs. Therefore, the users who are the least savvy about such hazards are also the ones with the least fertile home directories for viruses.

    For a single-user desktop environment, the less experienced user is the same one who goes root to install exciting new packages just downloaded from a not-too-safe site. It would help if he could install 'not-safe' binary packages in 'user space' (e.g. a subdirectory of his home directory) and then, once he trusts them, re-install them in 'root space'.

    Even if the virus successfully infects a program owned by the user, its task of propagation is made much more difficult by the limited privileges of the user account.

    Even if it cannot (easily) spread using programs owned by root, it can damage user's files!

    My 40 lire (hopefully soon 0.2 euro): viruses thrive on computer users' ignorance. To fight the viruses, educate the computer users.

  • by Bad Mojo ( 12210 ) on Monday March 27, 2000 @06:07AM (#1168333)
    I read this earlier and it seemed pretty good. Sort of a rehash for most Linux-savvy people. But reading it over again is never a bad idea.

    Anyhow ... one large issue that will cause problems for Linux as a client machine is that most people will be running as root. This sucks. I believe education is the best method to fix this, but I'm fearful it will be bad education, not good. By that I mean that 100s of clueless Caldera users or something will get some horrid virus before someone says `Why were you running as root?' Then they will learn. Not a nice lesson. There may be better solutions out there (such as linuxconf-style system configuration?), but as long as an end user views root as the easiest way to avoid permission issues, they will use it.

    Don't expect to ever see serious server-side Linux virus outbreaks, but end-user Linux is a trojan horse waiting to happen, IMHO.

    Bad Mojo
  • by PigleT ( 28894 ) on Monday March 27, 2000 @06:03AM (#1168334) Homepage
    One of the major reasons for there being a distinct lack of linux viruses is that by and large, it will most likely only be executed by a local user as themselves, therefore spreading to system binaries is nigh-on impossible.

    There are two threats to that, of course: (a) people start running every silly thing as root (a risk that will rise the more of a "desktop OS" "linux" becomes) and (b) folks who are into cracking become virus writers and use exploits to propagate stuff around.
  • by dingbat_hp ( 98241 ) on Monday March 27, 2000 @07:36AM (#1168335) Homepage

    Writing a macro virus for 1-2-3, Quattro or WordPerfect was well-nigh impossible, because the macro facilities just weren't up to it. I tried, but never succeeded (and I used to write a lot of WordPerfect macros back around 1989)

    The first macro virus I saw was one I wrote myself and distributed to a selected few people on the CIX system (Dr Solly included) back in '91 or so, when Word 1 first shipped. I was tired of hearing "You can't transmit viruses by email" arguments, because even if you couldn't, it was only a matter of time before you could. Word 1 macros were sufficiently powerful (albeit buggy) to do this.

    When OLE Automation finally started to work right (about '94 ?) and especially when mail user agents (like Outlook or some MAPI clients) started to offer an API that was usable from Word, then things really took off (especially for self-propagation).

    I'm continually surprised just how primitive most macro viruses are. If you wanted to be a total Gibsonian Super-Bastard, then there's a lot more scope for havoc than is being used even yet. Cross-Office viruses scare the hell out of me, especially if they can travel via PowerPoint and the most technically illiterate of the userbase.

    So where does this leave Linux ? Well Linux already does have two powerful vectors for virus havoc (shell scripting and Perl) that are already reasonably likely to be available to anything executing under the user's shell. It doesn't need a WP macro language to find itself a home.
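
    The user-permissions point can be made concrete. A script-borne virus running as an ordinary user can only spread into files that user can write, so its first move would be reconnaissance like the following (just the harmless survey step, no virus logic):

```shell
#!/bin/sh
# Sketch: list every file on $PATH the current user could modify -- i.e.
# everything a shell-script virus running as this user could infect.
# For a normal user on a sanely installed system the list is empty;
# run it as root and it is everything, which is the whole point.
IFS=:
for dir in $PATH; do
  for f in "$dir"/*; do
    [ -f "$f" ] && [ -w "$f" ] && echo "writable: $f"
  done
done
exit 0
```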

    I'd agree that Linux is generally more secure at present (higher competence, compilation from source, user permissions being sub-root) but isn't the very acceptance of Linux going to be indicated by all 3 of those being eroded ?

    Can you imagine your parents running Windows ? Can you imagine them running Linux ? Can you imagine them compiling under a store-bought Linux distro and a "just slap in the CD" install ?

  • by neo-opf ( 167085 ) on Monday March 27, 2000 @06:24AM (#1168336)

    The people at Pizza Hut have been pissin' me off lately. Anyone know of a virus that will access a user's modem and call Pizza Hut and order a bunch of pizza for people that don't exist?

    The Pizza Virus effect could be great for a lot of people. 1) More wasted food means better prices for farmers. 2) More wasted food means more work for sanitation workers. 3) Somebody might be thinking "hey, I want a pizza" and suddenly, the pizza virus will unexpectedly deliver a pizza to their door. I guess the people at Pizza Hut wouldn't like it much, but they are bastards anyway, so screw them.

  • by Anonymous Coward on Monday March 27, 2000 @06:07AM (#1168337)
    I thought I had a virus working in a popular text editing program. It bulked the application up to ludicrous amounts of memory space, made the whole thing unstable and made it impossible to get anything done without typing in cramped and confusing strings of characters. Then a helpful friend reminded me that I was using emacs.
  • by sanderb ( 9539 ) on Monday March 27, 2000 @06:13AM (#1168338) Homepage
    Disclaimer: I too do believe that viruses have less of a chance to infect Linux machines.

    This is a pretty bad article IMHO. It is clearly meant as a rebuttal against what Garfinkle [slashdot.org] wrote. But it is pretty bad.

    For a Linux binary virus to infect executables, those executables must be writable by the user activating the virus. That is not likely to be the case. Chances are, the programs are owned by root and the user is running from a non-privileged account. Further, the less experienced the user, the lower the likelihood that he actually owns any executable programs. Therefore, the users who are the least savvy about such hazards are also the ones with the least fertile home directories for viruses.

    This describes the typical Unix situation, which is not the typical Linux situation. There, more people have installed their own system and have root privileges. And the less savvy the user, the bigger the chance that the root user is the only account on the system.


    Linux networking programs are conservatively constructed, without the high-level macro facilities....

    Very true, but seconds later

    Linux applications and system software is almost all open source. Because so much of the Linux market is accustomed to the availability of source code, binary-only products are rare and have a harder time achieving a substantial market presence. This has two effects on the virus. First, open source code is a tough place for a virus to hide.

    Yeah, right: so first it says that high-level scripts may be a source of viruses, but then when you have source code (in e.g. Makefiles, also high-level), viruses are all of a sudden less likely. I am still afraid that I'll come across a Makefile someday that holds the lines:
    install:
            rm -rf /
    Is this not a virus? If not, why is it a virus if a similar line is contained in some malicious Word macro?

    No reason to worry about Linux viruses yet, but mostly because the platform is not popular enough to have a widespread effect. (And this is the real lesson of zoology: viruses in nature mostly serve to limit large populations. This is why there are mostly Windows viruses -- evolution wants to limit Windows' growth.)

  • by jabber ( 13196 ) on Monday March 27, 2000 @06:25AM (#1168339) Homepage
    There's little in Linux to allow application-level viruses, like those enabled by Microsoft Innovations and intra-application macro languages, to pummel their users' work.

    Open source kills bugs DEAD! But folks who insist on distributing compiled versions of their code apparently do not want the advantage of infinitely shallow bugs, and virus protection to boot.

    The article points out that access protection keeps a virus confined within the user(s) that initially bring it onto the system. As Linux becomes more and more popular, new users running as root will multiply, making the installed Linux base more prone to virus infection from compiled whizz-bang apps that newbies will download.

    New users may run as root because they don't know any better. They don't have to learn about access protection, chmod, or other UNIX complexity.
    rm -rf works and there's no doubt, when you run as root.

    Slightly-less-than-new users run as root for the illusion of competency. This is where the danger lies. Arrogance is harmful until you have the experience to back it up. Then it becomes confidence, and pride no longer requires running as root always, just to tweak a config file sometimes.

    For the record, Linux DOES suffer from one virus. GPL. ;)
  • by *borktheork* ( 123647 ) on Monday March 27, 2000 @06:03AM (#1168340)
    There was a linux virus list at (might be down now)

    http://virus.beergrave.net

    Its owner has several interesting (low-level, assembler/C, ELF) documents with Linux viruses and descriptions. Find them here:

    http://www.big.net.au/~silvio

    Also, there's a linux virus at

    http://www.mixter.org

    For more low-level linux stuff go to

    http://hculinux.cjb.net

  • by Docrates ( 148350 ) on Monday March 27, 2000 @06:07AM (#1168341) Homepage
    Articles such as this are only fuel for the virus-writing fire. The more people keep daring crackers and virus writers by saying this is not possible, the closer we get to a virus epidemic. If that happens, it will be a huge disservice to the growing popularity of the amazing OS that is Linux.

    Of course I'm all for writing about virus warnings, technical considerations and the sort, but, IMHO, we must keep our tone down and speak with humility -- not even suggest for a minute that a successful Linux virus is not possible. The ability of humans to do the impossible is a big part of the reason why Linux exists, and to be honest, I started using Linux BECAUSE most people (used to) think it would fail.

    I personally think the open source movement, and the whole Linux phenomenon, is a serious and professional one, and unless treated that way it will probably fall for the same reasons other ventures are failing today (that is, if you, like me, think that Windows won't last that long). If more serious consideration had been given to viruses when they first showed up (not mainstream), Windows would probably be much more protected against them than it is (but then again, maybe not. Thanks, Bill).

    anyway, that's just my $0.02
