The Short Life And Hard Times Of A Linux Virus 191
Sun Tzu writes, "There are several reasons for the non-issue of the Linux virus. Most of those reasons a Linux user would already be familiar with, but there is one, all important, reason that a student of evolution or zoology would also appreciate ... The article is at sitereview.org. "
Re:root and other security prevention (Score:1)
Anyway, a virus doesn't need root access to propagate, it just needs to propagate *somewhere*, whether that's your files, or some other user's files, or even over the network.
A lot of people here seem to believe that Unix file permissions will greatly hamper a potential virus, but this is dubious in the case of the typical Linux installation (i.e., a user's desktop machine). Quite a few Linux users actually administer their own machines, and thus tend to switch around between UIDs (with 'su'). If a virus is written with the capability of watching for and acquiring passwords (as the user types them), then file permissions are no longer a barrier.
Re: Oh good, we can all relax now (Score:1)
--Chouser
Point taken. (Score:1)
But this wasn't my main point. I will concede that 1) mutt has more holes than I'm aware of, and 2) my system is susceptible to much more than just strict "Melissa-style" viruses. My main point, however, was how tempting it is to build holes into higher-level apps. I suspect that security is a significant concern in most open-source network and operating system projects. But I'm afraid that high-level application projects, such as Office clones, tend not to worry about security much. It's for this reason that I bothered posting at all -- we need to make sure that newbie-targeted "productivity" apps don't come with huge holes built in.
--Chouser
Re: Oh good, we can all relax now (Score:1)
Being root most of the time while using a unix system is a Bad Thing. Having a unix system with only a root user (and no 'normal' users) is a Very Bad Thing.
I have installed both RedHat 6.1 and RedHat 6.2 beta a couple of times each in the last few months, so I'm quite aware of how the installation procedure goes. As I said in my last post, RedHat makes it hard for a newbie user to do the Bad Things named in the above paragraph.
This means that I am happy about what RedHat has done with their installation procedure -- I think it is a Good Thing.
--Chouser
Re: Oh good, we can all relax now (Score:1)
>without creating a normal user account, and makes it uncomfortable to
>use a root account for normal work. I don't have any proof that Linux
>installations are as often or more often set up correctly than most
>Unices, but I don't have much reason to believe the contrary.
This changed with RedHat 6.1. You create a user account along with the root account during the install process. If you're going to slam Redhat for something at least bother finding out something about it first.
Re:It forgot ACLs (Score:1)
Or are you suggesting that one do this on the RPMS directory on the CD? What if one does not want to install all of the packages on the CD? What if one does not even have a CD?
Furthermore, why would you use "--force --nodeps" to begin with? The only legitimate use I can think of for that (in the absence of broken packages) is if the dependencies were met in a way that rpm does not know about, such as installing RPMs on a Debian system, or software that was compiled from source.
How are either of these methods simpler than just typing "apt-get install <list-of-packages-you-want>"?
--
Re:It forgot ACLs (Score:1)
Try Debian and apt-get.
--
Re:Actually, no - (Score:1)
--
Actually, no - (Score:1)
--
RPMs and DEBs are the weak point (Score:1)
Infecting a binary doesn't make much sense on Linux, as users almost never distribute binaries directly -- if they distribute software, they send source tarballs, RPMs, or DEBs. And the RPMs are the real weakness: a virus could infect an RPM to spread itself. The solution to this problem would be the use of digital signatures.
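Signatures (and even plain checksums) would catch this kind of tampering. Here is a minimal stand-alone sketch of the underlying idea, using an md5sum manifest in place of a real signature; the filenames are made up for the demo, and note that a real signature goes further by signing the manifest itself (modern rpm checks GPG signatures with `rpm --checksig package.rpm`).

```shell
# Sketch: detecting a tampered package via a checksum manifest.
set -e
dir=$(mktemp -d)
cd "$dir"

echo "pretend this is a package payload" > app.rpm
md5sum app.rpm > MANIFEST        # publisher ships this next to the package

md5sum -c MANIFEST               # a clean copy verifies fine

echo "viral payload" >> app.rpm  # simulate infection in transit
if md5sum -c MANIFEST 2>/dev/null; then
    echo "tampering NOT detected"
else
    echo "tampering detected"
fi
```

An unsigned manifest only helps against accidental corruption; a virus that can rewrite app.rpm can rewrite MANIFEST too, which is exactly why the signature matters.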
Re:It forgot ACLs (Score:1)
The only reason the install process should need root access would be to write to /usr/local/bin and /usr/local/lib, or whatever.
These directories could (should, IMO) belong to some other UID, like apps. In this case the install would only need the permissions of that UID.
This would limit the damage that a virus could do to wiping out the applications you have installed, but not being able to touch the base install.
The distributions should be putting more focus on setting up stuff like this. There should be intermediate security levels instead of just user and root.
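One hedged sketch of such an intermediate level, using a scratch directory as a stand-in for /usr/local: the real setup would need root to chown the tree to a dedicated owner (the 'apps' name above is this poster's hypothetical), so only the permission layout is shown here.

```shell
# Idea: installs run as a dedicated non-root owner of the app tree, so a
# compromised installer can trash installed apps but not the base system.
set -e
prefix=$(mktemp -d)      # stand-in for /usr/local

# setgid, group-writable bin/ and lib/ -- members of the app-owning group
# could install here without ever touching root.
install -d -m 2775 "$prefix/bin" "$prefix/lib"

stat -c '%a %n' "$prefix/bin" "$prefix/lib"
```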
Re:Oh good, we can all relax now (Score:1)
Your make example is a joke. It is nothing like a virus; it is not an infected file, just a bad program. Furthermore, not only would it quickly be detected, it would only affect the relatively small (and shrinking) number of users who actually compile their programs instead of just using a package. (Packages have checksums and could, and should, be signed to prevent infection; besides, most people get their packages off CDs.)
Re:Oh good, we can all relax now (Score:1)
> Is this not a virus?
I would call it a Trojan. One of the purposes of a virus is to spread itself. This would have been detected the first time someone did a make install. If it instead infected all Makefiles on your system with itself I would call it a virus.
Re:Oh good, we can all relax now (Score:1)
But, there you go. 'Suspicious makefile?' Since when? As another poster pointed out, if somebody tried to slip a mickey like that through the distribution system, it would get pulled from freshmeat immediately. It only has to crash a couple of victims, and the word is out.
Paranoia is useful, and in my business, it's a job qualification. But, as you mentioned, we can't manually verify everything. So if paranoia is not to become paralysis, what do we do?
"We can check if we really want to" is actually a mighty shield, one which is really only available to open source users. There may be an ambiguity lurking in your statement of this principle. "We" do check everything...I don't personally do it, but it is done all the same, by the community.
Do you remember the tcpwrapper flap? Somebody posted a patch to the code at a primary source site (U. of Eindhoven, was it?) which snagged security info and mailed it to a Hotmail account. That was discovered in short order, and the news was everywhere. The crocked tcpwrapper was pulled, the Hotmail account was canned, and everybody was agog for maybe 2 weeks. "How could something like that happen in Open Source!" But, you know, that whole affair provided a demonstration of the hostile environment that open source software provides to malicious code, whether it be a virus, a bomb, or a trojan.
No, you don't want to blithely trust everything, but you don't have to rely only on your own powers for safety. By myself I could no more close every possible loophole in my system than I could write a kernel to compete with linux. But, just as I benefit from the work of a community which provides the OS that I'm running, I can use the eyes of the community to watch out for those pesky 'rm -rf ' things too.
Re:Oh good, we can all relax now (Score:1)
That's not a virus, it's actually a bomb. The thing that defines a virus is that it replicates itself manyfold before destroying its 'host' (if it ever does so). The point of the article is that the linux installed base is too hostile to virus reproduction for them to become a major threat.
Btw, if you're worried about a makefile of suspicious provenance, just say "make -n " and check out all the commands it wants to run before you execute them.
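For example (the booby-trapped Makefile and path here are invented for the demo): `make -n` prints the recipe for a target without running any of it.

```shell
# Audit an untrusted Makefile with a dry run.
set -e
dir=$(mktemp -d)
cd "$dir"

# A booby-trapped install target (recipe lines must start with a tab).
printf 'install:\n\trm -rf $(HOME)/important-data\n' > Makefile

make -n install > preview.txt    # nothing is executed...
cat preview.txt                  # ...but the rm -rf is right there to see
```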
McAfee? Too late (Score:1)
gg
This could change... (Score:1)
Re:Open Source? (Score:1)
With open source, many people (though not necessarily the end users) thoroughly read the code. If there is anything suspect, many people will notice it and spread the word. With many people collaborating on projects it would be difficult for anyone to slip something bad in unnoticed.
Uhh... (Score:1)
Re:fp again (Score:1)
Pedant, but... (Score:1)
install: rm -rf
Is this not a virus? If not, why is it a virus if a similar line is contained in some malicious Word macro?
Well, if you want to split hairs, the rm -rf
Quidquid latine dictum sit, altum videtur.
Re:Open Source? (Score:1)
I do concede that having the sources open does mean that once someone looks at it, they would most likely make efforts to educate people to avoid that program. But if someone sent me the source for an app with a virus written into it, I'd never know.
More a cultural than a technical question? (Score:1)
I wonder if the rarity of Linux viruses and trojan horses is more a matter of user culture than inherent technical security. Linux users care about their operating system. Windows users, on the other hand, don't generally feel a great loyalty to their platform, and so the more adolescent and malicious among them feel no compunction in compromising it.
As a supporting example, consider the Macintosh (which has a similarly loyal user base, and probably a larger one than Linux). Macintosh viruses are pretty rare also, and it's certainly not because of any inherent security of the MacOS. (One pops up every once in a while, like the Autostart virus a year or two ago, but they get stamped out pretty quickly. The last time I saw a Mac virus on my own machine was more than ten years ago, and I have downloaded gigabytes worth of shareware binaries, etcetera.) (And then there are spillover viruses, like Word macro viruses infecting Word for Macintosh, but these are the exception that proves the rule.)
I just don't buy the argument that Linux is immune to viruses, especially since a large fraction of Linux users have root access. (You don't have to be completely clueless and log in as root all the time, either...imagine a virus that installs a replacement alias for 'su' in your shell, so that it gets root access the next time you do some administration. One can come up with countless other attacks.) (Also, I would consider self-propagating DDoS scripts like Trinoo to be viruses.)
Certainly, the lack of Windows "features" in GNU/Linux software helps security, but it seems a far cry from a complete explanation of why we see so few problems.
Re:App virii and hubris (Score:1)
I have never had to spend much time explaining to any NT user why su is such a powerful tool. It is a key usability difference between Windows and Unix. One rarely runs as root under Unix because you have the ability to su whenever you need to tweak that config file. NT requires you to log out; log in as a privileged user; perform the privileged task; then log out and log back in as a non-privileged user. The whole process takes at least 10-15 minutes even to do something trivial like add a local printer. If you're lucky you don't have to reboot (W2K is much better in this regard). So whenever possible NT users run with local administrator access all the time. Win9x has root as the only login, of course.
Trojan Horses (Score:1)
Please take the time to learn the terminology before posting things like that. If it's of any help, I have a collection of many anti-virus FAQs here [claws-and-paws.com].
linux has more dangerous viruses... (Score:1)
Re:The real reason... (Score:1)
Well, keep writing letters to Loki, and maybe your favorite Windoze virus will get ported. Of course by then it will be last year's virus, but you take what you can get...
---
I'm not worried (Score:1)
However, I'm more fearful that some terrorist will find a way to attack the United States military than I am worried about some cracker writing a Linux virus.
[I am not even slightly worried about such attacks... I am worried that some of our people may get hurt in the attempts, but that's all]
Viruses, like the Y2K bug, are not magical and do not do magical things.
They require one thing: unrestricted access. On Linux this means root. If you're really paranoid, there is already a project to prevent anyone from modifying programs on the system, even as root.
Linux already provides this protection to some degree. Some programs, while running, cannot be modified by anyone... period.
Viruses remind us how important security is for EVERY system we use. They are not a given. They are possible if you get stupid. Otherwise they are a non-event.
On the other hand, if a virus can get root, so can a script kiddie, and there is a lot more a random script kiddie can do to your computer than any virus could ever do.
Yes, I grow tired of virus warnings.
Linux has been around long enough... if a virus epidemic were going to happen, it would have already come to pass...
Viruses will be written and they will be crushed. Being fearful of what does not yet exist does nothing.
Let us cross the bridge when we come to it...
When virus experts want to claim the big "badass" Linux virus is coming, it is up to us to crush this myth.
There is no Linux virus epidemic coming. There is no badass virus.
No Unix or Unix-like system is anywhere near as susceptible to viruses as DOS/Windows; this is a fact.
If we don't stand up for Linux now and let the general public believe the Linux virus epidemic is coming, we set ourselves up...
Unix vendors will not hesitate to take Linux down a peg.
And Microsoft would put its own spin on it...
Linux didn't get this far by being fearful something "horrible" might go wrong...
Linux is stable... we promote this... Occasionally Linux boxes crash... this hasn't hurt us...
What could be worse than promoting Linux as "User Friendly"?
There will be viruses and we will deal with them.
But don't pretend this is an epidemic...
An epidemic is a 10-year-old operating system running 20-year-old viruses... (Windows running DOS viruses)... never being able to do anything to stop them.
Re:Let's be realistic here. (Score:1)
Viruses are not magical...
They work because the computer trusts any given program to "play nice"...
Linux doesn't...
Linux viruses exist... they are dead...
Linux has already tempted the crackers... We already promote Linux as secure...
Linux is already high profile... we are the "challenger to Microsoft"... on the news...
Hiding under a rock is neither practical nor possible...
True, viruses will be made, and they will be crushed...
"Anything is possible," true...
But many things are incredibly unlikely... this is one of those things...
Re:Until MS comes along ... (Score:1)
It will be yet another thing stupid newbies will do, and they will get bitten quickly...
Not necessarily via a virus...
But this one ranks up with turning the power off before shutting down... and running telnetd with no root password...
Or... the all-time favorite... entering commands someone gives you on IRC...
As for the e-mail virus myth... it would still be a myth if Microsoft hadn't tried to tie everything into e-mail...
The whole notion that a virus could be contained within text is still silly...
Better watch out.. this post might contain a virus.... ohhhhhhh
Re:What Viruses are out there? (Lookee here!) (Score:1)
Or at the higher level (shell script virus):
http://www.math.umn.edu/~riordan/security/unix_virus.html [umn.edu]
Re:Mandrake (Score:1)
The unsuspecting newbie will probably always tell it to start X by default, so the problem still exists.
Haven't tried 'paranoid', so I don't know if this behavior is the same there.
Iceaxe
Let's not let our guard down (Score:1)
Which is why . isn't in the default PATH (Score:1)
+++++
Re:It forgot ACLs (Score:1)
(RPMs should jolly well say what the required dependencies are -- and if you don't have a package of that name and/or version to match, you can override it with rpm --nodeps, of course. Even so, you probably shouldn't if you're going to keep your machine clean.)
Re:It forgot ACLs (Score:2)
I'm not saying that some programs you want to run don't work, I'm just saying that sometimes I get tired of having to install forty-eleven new packages just to get a damned ICQ client to run.
Maybe this is part of the reason why viruses find Linux such an inhospitable environment. Most Windows boxes have a common set of code running on them. On a Linux box, a virus can't assume anything--there are many kernel versions, many different shells, mail clients, etc. Libraries vary from machine to machine, if a virus needs a certain lib to work, that lib may not even be installed, or it may be the wrong version.
For what it's worth, if you can't get a package (RPM, whatever) to install because of dependencies, you can always download the source and build the program yourself. Package managers expect to find specific library versions, but the build system included with most GNU and other OSS does a little bit more work to find the libraries the code calls for. Often, when you run the configure script, if you don't have a required library, or if you have the library but it's too old to work, you'll get a nice message explaining that the lib needs to be a certain rev or later, and maybe even a URL for the latest version.
I rarely use RPMs anymore, simply because it's much easier to build the programs I need myself. Try it, you might like it.
Re:It forgot ACLs (Score:2)
Re:It forgot ACLs (Score:2)
(Though obviously this would be pretty easy to spot if you were paying attention. But would you notice something called "vi" in your home directory?)
On a properly configured system, it's not a problem. The search path should NEVER include the current directory, and if you have a ~user/bin, it should be last. In that case, with your scenario, the real vi will be executed, and the fake one will just gather dust until you notice its existence and rm it.
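A quick way to see this on a live system (the fake binary here is a harmless stand-in planted for the demo):

```shell
# Demonstrate that a trojan dropped in the current directory is ignored
# as long as '.' is not in PATH: resolution walks PATH in order and
# never looks at the working directory.
set -e
PATH=/usr/bin:/bin               # a sane PATH, without '.'
dir=$(mktemp -d)
cd "$dir"

printf '#!/bin/sh\necho gotcha\n' > ls   # attacker's fake 'ls'
chmod +x ls

command -v ls                    # still resolves to the real ls
if [ "$(command -v ls)" = "$dir/ls" ]; then
    echo "trojan would run"
else
    echo "trojan ignored"
fi
```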
Re:Interesting (Score:2)
Then they will learn. Not a nice lesson.
That reminds me of a newbie howto I once read. It went step by step through installing Slackware. It was written for someone with no Unix experience.
Step 10: Your Linux system is now installed. Still running as root, type cd /; rm -rf.
Step 11: Now you know why you should never just type what someone tells you to when you are root unless you KNOW what it will do. Go back to step 1.
Source/binaries and exploits (Score:2)
Of course, most RPMs are downloaded from a central server, not traded or swapped on BBS-like local sites, which makes it harder. Such a RPMed exploit could possibly do other things, such as dynamically patch files sent by ftpd/httpd and infect any executables (standalone or in
Or one could just be unimaginative and modify tcpd to contain a remotely-activated 'sleeper' denial-of-service client or backdoor root shell.
Interesting.... (Score:2)
eh... that was rambling... note to self, don't take support calls and post to slashdot at the same time...
Sgt Pepper
Mutt does not make you immune. (Score:2)
You say that Mutt makes you immune from a Melissa-style virus. All the Melissa approach needs to succeed is to trick enough users into running an executable so that it spreads faster than it dies. So all I have to do is to compose a message that will trick, say, 1/10 of the Linux users into running it, if on average each execution will send out more than 10 copies. The program would search for your aliases, as stored by mutt, elm, Netscape's mailer, or whatever, and send them all a message.
If a message that appeared to come from your best friend (and, in fact, it would be from your best friend, if he were suckered) told you to run a program, would you run it? If so, the Melissa approach would get you, whether or not you use Mutt.
Re:Under UNIX, the programmer tends to be your fri (Score:2)
>EVERYONE who writes code for ALL flavors of M$ OS's as ignorant savages.
My, did I hit a nerve here? Are you a Windows programmer?
>After all, it's self evident that if you don't design single user apps for single user machines
>as if they were meant for multiple users on servers you MUST be a moron.
Not my point. There are some nice single-tasking OSes out there -- Palm OS is one that comes to mind. Straightforward, doesn't leak memory. Nice work -- especially when you consider the OS was written by a handful of people while Microsoft was throwing dozens of people at their WinCE project.
But the Palm OS is designed to run for months without a reset or reboot. So you can't have memory leaks.
I'm talking about Windows NT/2K -- last I heard, MS said it was a server OS. Servers run multiple processes for multiple users, so I'd assume that this environment is multi-user & multi-tasking. But then, I think I'm superior -- according to you, & MS software shouldn't be expected to do so much.
Why MS can't write a reliable OS -- or applications for one -- with all of these qualities baffles me. They had access to the technology -- they wrote Xenix, & Dave Cutler was the project Manager for VMS before he led the NT group. They have the money to hire good programmers with experience in this kind of environment. I would think they could make NT just as reliable.
And what ought to stick in the craw of any Windows programmer is while the coders at Redmond are being paid to do it right, a bunch of amateurs without access to the technology figured out how to do it in their spare time. Based on these facts, I'd say that writing a reliable multi-tasking, multi-user OS is not rocket science any more. So why CAN'T Microsoft write better software?
>All I can say to that is: You flunked engineering economics, didn't you?
And your point is?
Here's a clue: a person makes better sense if they write sober & straight. Try it next time.
Geoff
Re:It forgot ACLs (Score:2)
I'm not saying that some programs you want to run don't work, I'm just saying that sometimes I get tired of having to install forty-eleven new packages just to get a damned ICQ client to run.
---
Make virus (Score:2)
The public considers a virus any program that would wreak havoc on a large number of people.
Okay, let's say I uploaded something to freshmeat that contained a makefile with the instruction
rm -rf *
somewhere inside. It would impact the first few people who downloaded it, sure. But in no time at all, the file would be pulled from freshmeat and a report posted on Slashdot and other news sites.
I'd say the maximum potential for that one is a few hundred people being affected, peanuts compared to any Microsoft Word-based virus.
D
----
Re:Interesting (Score:2)
Caldera (eDesktop) is aimed at the common desktop user with little or no Linux knowledge (I think). If that is a fact, it increases the odds of that distro being a mismanaged Linux box. It's not personal. If anything, it puts a burden on Caldera's shoulders to do a good job and prevent this type of thing from happening.
If you do know some clueless RedHat people, let RedHat know, I hear they are hiring. Hehehe. Can't help it!
Bad Mojo
Workstation angle (Score:2)
I do think a proper multiuser OS, such as Linux, could substantially reduce costs both in IT and, most importantly, in employee downtime (e.g., less stupid rebooting, fewer user fuckups, etc.). As these applications get more and more complicated, the more necessary it will be to safeguard the user from himself (or users from each other). Since MS doesn't seem to appreciate this, this is a significant Linux advantage in a workstation setting (what is needed, of course, as already mentioned, is decent applications. Not to mention possibly an improved UI, improved X, etc.).
Re:Workstation angle (Score:2)
Also, I think Microsoft could do a hell of a lot to improve security on the features they do provide.
The problem (Score:2)
A virus is a program that spreads, often doing malicious things as it spreads.
It doesn't matter if the install portion of a makefile did an rm -rf, or something.. that won't spread! That's simply a case of malicious code.
And, considering that most people who don't know better, end up using distribution-packaged binaries, that won't ever be a problem in the near future.
Why is Source safe? (Score:2)
What if the latest version of Emacs, or GNOME, or Apache got infected with a very small, innocuous alteration? Say, along with the above programs, the source compiled a slightly different version of man or ps, or even ftp?
What if these small programs are themselves fairly innocent, except that they start to modify other makefiles or source files, to continue to subvert the system? Changing a shell, for example, to do key logging? Piggybacking on top of FTP or telnet to actually transmit information back and forth, hidden among actual legitimate transfers? Activating only when the user runs 'find' or something, to hide among the already expected disk activity? Editing 'ls' and 'chmod' to misrepresent user access?
Little things that take a while to propagate (and to catch) but that, as a whole, seriously weaken the system?
-AS
Root access... (Score:2)
IE, in the source is a small hidden changed 'ls' or 'gzip' or 'tar'.
When it's compiled and installed, you get 'for free' a modified gzip. And this gzip, when used, will start inserting patches into source files, when it finds Makefiles.
And these patches, for example, will start to modify 'ftp', and piggyback info spread onto normal FTP usage. Modify a shell program, to get more access to the system. Modify 'find' to get more information for viral programs to use. Modify 'httpd' programs to start collecting more info and stats. Modify 'ls' to misrepresent info to the user. Modify 'chmod' to change permissions on key files.
Dunno about being destructive. Viruses don't need to be destructive, and are less likely to be caught if they aren't, I think.
-AS
Lets build a theoretical Linux Virus! (Score:2)
Perhaps it will replace a local utility, small, like ps or something.
Act just like PS, but have a sister program that starts to modify the other binaries. Say, like the way you can socksify certain programs. Or it will modify scripts. This program will edit/modify scripts in minor ways to call another program, like man, which to the user looks and acts like man, but when called in a certain way, will do something else.
How will it spread? Perhaps it should also infect the FTP or telnet programs.
But when it gets to the other side, it prolly won't have root privileges. Perhaps it will actually insert itself into any binary program the FTP file touches? Or into any scripts (perl, shell, or whatnot)?
And then it starts all over again.
The destructive part isn't as interesting, to me ^^
Does this work or sound plausible?
-AS
Source level virii (Score:2)
Apache, emacs, whatever, will compile cleanly and safely, then. Nothing will be different. Perhaps by searching the source, they may find the discrepancy... but not by looking at the binaries. So the source would compile and provide the new 'hacked' ps, ls, man, whatever. And those programs, when used, would start to weaken the system.
What you're arguing is not the safety or security of the OS/system. I don't know that the system is safe against a distributed viral infection.
-AS
Slashdot wins again... (Score:2)
Seriously though, I know almost nothing about writing virus programs, and only a moderate amount about writing and compiling binary programs for Linux, but I still could have written this whole article just by reading and re-packaging parts of the better posts from a previous Slashdot discussion [slashdot.org] on this exact subject.
Which is why I continue to read and post to /. myself -- in spite of all the trolls, off-topic posts, flames, and other crap, this is still one of the best discussion resources on the web. As long as I continue to read and learn from y'all, I'll keep coming back, and hopefully occasionally have something to add to the commentary.
Re:Until MS comes along ... (Score:2)
There was one, of a sort. Once upon a time, vi read a
Emacs and XEmacs still have the potential for macro-type viruses as they can be configured to run arbitrary lisp code in files being edited. It isn't the default to do this any more, but it used to be.
Re:Oh good, we can all relax now (Score:2)
I believe you see a contradiction where there is none.
The way Word, for example, handles macros is the problem. Mainly, it masks the presence of the macro. Unless you go specifically looking for that macro, you won't notice it.
This won't happen with our C source these days. Certainly, you could have a virus that scanned for source, programmed itself in, and waited to be compiled, but this would still present a rather hostile environment. It still requires manual intervention in order to propagate.
Now, perhaps if we all used a high-level IDE for our programming and builds, that was automated with numerous build-macros and such, a virus would have a chance.. but we don't.
The key, I think, is automated process. If a process can be automated, it is a good environment for a virus. If it's manual, it's not.
Re:It forgot ACLs (Score:2)
That's why Windows programs need installers - just to update all the system DLL's to a known level and make sure the missing pieces get installed. And even then it doesn't always work.
And it's not easy to write code that doesn't depend on up-to-date DLL's - especially for virus writers at the "script kiddie" level.
Torrey Hoffman (Azog)
Re:It forgot ACLs (Score:2)
But even without root permission a rm -rf on the user's home directory can be pretty annoying.
As the article noted however, replication is the real obstacle for a linux virus. Most linux users either install from CD or download from a well known ftp site. It is quite uncommon to mail somebody a rpm with a cool application (which would be the equivalent of sending an
Re:It forgot ACLs (Score:2)
You've also got the solution: PAY ATTENTION. If you're only running your own box at home, sure you can get away with anything you like. Try scaling that up to a small work-group in e.g. in a university, and you're effectively being paid to be awake...
Re:This could change... (Score:2)
> is reserved for administrative tasks
Yes. But...
As with the HIV virus: the more careless people there are getting HIV, the more luck the careful but vulnerable need. Think hemophiliacs here.
Even if cleverly crafted viruses exist, they still have a hard time spreading. But if we get more careless people in the community, we will start to see infected systems that are not carelessly maintained.
(If you don't know what I am talking about, let me ask you this: How did you check your source tar.gz last time you installed something? Oh, you saw the date stamps.)
Re:Lets build a theoretical Linux Virus! (Score:2)
Seriously, how many of us examine the source of large programs? Thought so. gcc is well beyond large enough to hide a lot of virus code, which could be quite sophisticated. It could, for instance, recognize whether it's compiling gcc or something else. It could also do alternate-generation propagation (infecting compiled apps, which then try to infect more copies of gcc.c). Best of all, it could insert itself into the source of large target programs prior to their distribution.
Event-driven programs of the c++ flavor actually make this even easier, since the flow of control is often really nonobvious, and thus little objects can be all over the place. Little examination is usually given to either the dispatcher or any but the objects under development.
Another "expert" (Score:2)
It's worth noting... (Score:2)
Of course, I never noticed this before, because I run with administrator privileges all the time... The biggest problem with Windows these days, IMHO, is that installing new software is too invasive. Anyone who has enough access to install software has enough access to spread a virus. (Whereas in UNIX, any user can just stick an executable in his own directory, without affecting anyone else.)
Re:Why is this rated as funny? (Score:2)
Of course any executable file format can be abused to make viruses possible if you allow unrestrained write access to the executables. On Linux, the likelihood of viruses is a configuration issue. The built-in protection mechanisms are there, but they have to be used.
Re:It forgot ACLs (Score:2)
User downloads a binary. User runs it. Code in the binary attempts to write a program called 'ls' or 'rm' or 'make' or something similar in any obvious place it has rights to.
Some time later, user su's to do maintenance. User types 'ls' or 'rm' or 'make'. System files are now infected.
Now obviously that is not as simple as getting into Windows system files, but it isn't "nigh-on impossible", either.
(Though obviously this would be pretty easy to spot if you were paying attention. But would you notice something called "vi" in your home directory?)
su
vi /etc/inetd.conf
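A sketch of why this trap works, assuming the victim's PATH searches a writable directory (such as '.' or a home directory) before the system ones; everything here stays inside a scratch directory:

```shell
# Plant a fake 'ls' in a scratch directory and put it first in PATH.
dir=$(mktemp -d)
cat > "$dir/ls" <<'EOF'
#!/bin/sh
echo "trojan: this could do anything with the caller's privileges"
EOF
chmod +x "$dir/ls"

# In a shell whose PATH starts with the scratch dir, 'ls' now resolves
# to the fake binary, not /bin/ls:
env PATH="$dir:$PATH" sh -c 'command -v ls && ls'
```

Run by an ordinary user this is merely annoying; typed by root just after an 'su', the fake binary runs with full privileges, which is the whole point of the attack.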
Re:root and other security prevention (Score:2)
root and other security prevention (Score:2)
Q2. Does root always have user id zero? What part of the source can I change to remove this hard-coded number? (Yes I'm aware that many things would break.)
For a great site on securing your Linux system check out the TrinityOS FAQ
http://www.ecst.csuchico.edu/~dranch/LINUX/index-linux.html [csuchico.edu]
Cheers
One of the greatest virii ever was a *NIX worm (Score:2)
Re:fp again (Score:2)
I know for a fact that you are wrong in that regard. At least a year or two ago I heard that something called the Bliss virus infected several Linux systems. Apparently it used some form of infection mechanism on unprotected binaries. You could also (with a command-line option) disinfect the files that were infected. The author said that he/she would release the source at some future date, but I never saw it. In general, most people who run Linux are not the type to just run some random binaries.
Re:It forgot ACLs (Score:2)
impossible.
Wish I could do that. I am truly running out of disk space and routinely have to use the 10% margin on the filesystem that is "reserved" for root just to get things done.
Two common fallacies (Score:2)
Most of the newbies running as root will admit that they've read the UNIX sysadmin guides that say never run as root. They generally utter some inanity like "... but I like having full control over my system." This usually lasts until their feet collect one or two large bullets and then they stop running as root. I liken this phase to the prepubescent one where you collect all the pirated programs you can get your hands on. Most people grow out of it.
As for infecting user space, anything a virus does in your home directories is going to be a lot more noticeable. Its means of propagation are greatly limited compared to a similar DOS machine (I've seen DOS viruses that try to infect your boot sector when you put an infected floppy in the drive). If it goes on a rampage and starts deleting things immediately, the user's likely to notice. As this article says, Linux is inhospitable to viruses. That's not to say we might not see a successful one, but it'd take quite a feat -- if I were working on a strategy for one, I'd go for infecting the GCC compilers of some major distribution.
That's not to say Linux doesn't have its problems -- you're much more likely to be taken over by script kiddies than you are to get a virus. Most distributions pay no attention to security at all, making this far too easy. We should really focus on the big problems here today rather than the ones that may be there tomorrow.
One way to look at it... (Score:2)
If it can't propagate faster than the death rate, it won't survive. Then I guess one might say that Linux and other Unix systems have healthy immune systems... which would mean Windows has no immune system whatsoever (unless you purchase one separately).
Or even better, you could look at the virus scanners as antibiotics: constantly feeding the Windows machine antibiotics (I know, not a perfect analogy, since antibiotics are more appropriate for bacterial infections), which cause the pathogens to die off... all except the strongest ones, which then have free rein to propagate until a better antibiotic is made.
Oh yes... I like this set of analogies a lot.
Re:Until MS comes along ... (Score:2)
This nightmare scenario doesn't need to carry around native viruses for every platform. All it needs is an inherently cross-platform scripting language, and that's already there (in vague theory) -- VBA. The only thing keeping us from cross-platform doom is Microsoft bugginess -- fortunately that's one of the world's more reliable bits of unreliability.
Re:Until MS comes along ... (Score:2)
Reduced to its essentials, the problem with Windows is the non context-sensitivity of the command shell associations.
I can't do much if the shell doesn't execute executables, and I'd really like it to automatically execute Perl scripts. OTOH, my email or news client really shouldn't need to do either of these things (or only very infrequently). A big and glaring hole on a typical Windows box is that it has poor facilities to tell the difference between these instances.
How far would Happy99 get if every email client had the sense to say, "This is an executable. I know what I could do with one of those, but trust me, you really don't want to be doing that." ?
The worst case of this is Remote Scripting, and the idea that letting VBScript out of the browser's sandbox and onto an unlimited command shell could ever have been a good thing.
I have this mental image of the woodwork shop at Redmond, "Hey Dude, what's that ?"
"Yo, it's a chainsaw. I hear they're made for doing stuff with trees, but I think it would be really cool to try juggling them"
"Cool! Let's do it".
I really don't know if Microsoft ever think before they build some of these idiot holes. Wasn't it obvious how broken some of them were ?
Re:Workstation angle (Score:2)
I don't believe Macros need to be turned on, or even exist, for the vast majority of users
I work for suit-and-tie corporates much of the time. I write a lot of "Word macros", yet these might be front-ends to content management systems that involve tens of thousands of lines of code -- not trivial systems. SQL integration, email integration, optical jukeboxes, web publish servers: these are all things that it's core functionality to have available.
I can't do any of this by turning off macros. It's a real problem for me at present to deploy "Word-hosted solutions" to people who are downright technophobic and IT-illiterate, yet do this in a manner that is still reasonably safe. Macros aren't going to go away -- we need to find ways to work with them, not just slag off the non-gurus for being inadequate users.
Oh yes, and it really pisses me off to hear my work trivialised as "Word macros". 8-)
This stuff is as complex as anything else I write; I just run it under an oddball platform.
Misunderstanding of Evolution (Score:2)
That's not correct at all. Evolution doesn't use viruses to limit large populations -- that implies that, on at least some level, evolution has some degree of planning to it. Evolution, in reality, doesn't plan at all. You're born, and if you've got the "right stuff", you get to survive long enough to reproduce, otherwise you're dead. That's how evolution works.
However, viruses might be more prevalent in a large population because there are more hosts to infect (thus it's easier to survive). Also, with the increased number of hosts, it's easier to spread from one host to another, thus making survival easier yet again.
Thus you can use evolution to explain the larger number of viruses in a larger population, but not in the way you originally did.
Re:Virus war (Score:2)
For a little more info, check out the entry [jargon.net] for "core wars" in the Jargon File.
Viruses are obsolete (Score:2)
The big Linux vulnerability is that too much stuff runs as root. One buffer overflow vulnerability in a set-UID program and the attacker is in. Then they install a Linux root kit [cotse.com], and it takes a huge effort to clean up the system. Since Linux normally has a telnet daemon, it's remotely controllable out of the box. You don't even need something like Back Orifice.
UNIX is not a secure operating system. Linux is not a secure operating system. Nothing Microsoft makes is a secure operating system.
Somebody mentioned EROS. It's not really finished, and even if it was, you'd need applications for it. What's really needed, I think, is something with capabilities like EROS, a high-performance, secure CORBA-like model of interprocess communication, and support for high-volume transaction processing in the CICS sense. Then you'd need to tear apart things like BIND and Apache into a number of mutually mistrustful components. User-initiated transactions would run as separate processes, like CGI programs, but would launch faster using a CICS transaction model.
Oh, and you need a decent security model. For example, in a real secure system, there's no "root". If you're doing administration functions, you can only run a few trusted administration programs.
The real reason... (Score:2)
---------------------------------------------
Jesus died for somebody's sins, but not mine
be wary of scripts in documents (Score:3)
However, there is no guarantee this will always be the case. As a programmer I appreciate the apps I use having the ability to be scripted, and this is the first step down a dangerous path. My text and graphics editors, vim and gimp, both have built-in scripting languages, which is the same feature that has made MS Windows office apps so vulnerable to viruses.
I think the important distinction is that none of the apps I use under Linux look for script code in their documents. This means I can't send you a gimp image with a little plug-in to help you make your own similar image. I can't send you a text file with special scripted abilities for vim as I can with MS Word. If I want to give you these scripted capabilities, I must send a separate file that you must treat differently than a normal document file. This is the key point, and we should keep this in mind when adding features to any applications that we work on.
The danger is not as distant as you might think. The power and ease-of-use provided by this sort of feature make it difficult to resist. For example, vim allows a special line to be embedded in a text file that gives it directions on how to display the text (tab settings and such). As long as the vim group is very, very careful to make sure that there is no way to drop into the full-featured scripting language through this feature, we are still safe, but this is a tricky line to walk.
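For the curious, here is roughly what that vim feature looks like. The settings below are harmless display options, but the point is that the directive travels inside an ordinary-looking document (this is a sketch; vim's own documentation has the exact modeline rules):

```shell
# Write a text file that carries a vim modeline near its end.
cat > note.txt <<'EOF'
Just an ordinary-looking text file.
vim: set ts=4 sw=4 noexpandtab:
EOF
grep 'vim:' note.txt    # the directive rides along inside the document

# A cautious user can refuse all modelines with one line in ~/.vimrc:
#   set nomodeline
```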
--Chouser
Be less confident (Score:3)
Are you sure? Even pine was exploitable once via a bug in its mailcap handling.
Although, technically, this isn't a "Melissa-style virus". Melissa required you to open a word file. The mailcap exploit would have just required you to read your mail.
The first damaging Linux virus won't be spread by infected warez or email trojans run by clueless users. It'll be a simple root exploit that propagates itself.
If you're running a promiscuous system of network daemons (and too many people are: I'd wager the ratio of people running imapd to people who need to be running imapd is 100+), then you're probably susceptible to a new root exploit every year or so. If you don't update your system regularly (and that probably includes every newbie Linux user), then you stay susceptible for a long time. If you fit both those categories, then you're a target; and since most newbies installed a distribution whose default configuration has everything turned on, there is a big pool of targets out there.
There was a worm that used the imapd exploit, something like a year after the exploit was discovered and fixed, and it still managed to do some damage. What happens when an aspiring young virus writer prebuilds the framework for a worm, then starts plugging in the exploit of the month and sending it out each time a new vulnerability comes out? If you're subscribed to a security list, using MandrakeUpdate or up2date, or otherwise keeping current, you're probably fine. If not... well, such a worm would find a lot of food.
And now that Linux is becoming a more tempting target (lots of cocky "Linux viruses are impossible" users out there, lots more cluebies to offend the l33t virus writers with their presence, lots more users on fat, useful cable modems or university connections, and just lots more users total), such a scenario becomes more and more likely.
Let's be realistic here. (Score:3)
I think it would be incredibly ignorant of people here to think that a virus couldn't happen on Linux, even if the system is well-defended against viruses. Personally, I think one of the biggest things Linux has going for it in the anti-virus arena is that it's so non-homogeneous. Everybody talks about how wonderful Windows is because it's consistent from machine to machine, but that's the same type of "feature" that makes it easy to write viruses that spread quickly. The virus automatically "knows" what kind of machine it's on, and it can always assume a base level of functionality. Not so on Linux, where you have everything from diskless workstations, to development boxes that don't have daemons on them, to "production" servers that have daemons but are missing some normal development tools. There isn't a baseline functionality the virus can assume.
Pretty much everybody on slashdot should know that anything is possible when it comes to a coder with too much time on his hands.
I forget the exact wording, but a quote on the l0pht's site comes to mind: "Making the 'theoretical' practical since 1995". Doesn't that say it all? Linux is a great system, and I love it as much as the next guy, but it's blind arrogance to say that it will never be susceptible to viruses. I agree with this poster. Articles like this seem to want to poke the monster and yell "Haha -- you can't crack my box!!!" As far as security is concerned, it's best to keep a low profile.
Under UNIX, the programmer tends to be your friend (Score:3)
I'm not sure why this is, but the record points to several possible reasons:
1) Laziness. Does anyone here remember the history of the format command in DOS? Originally, it would format the current working disk by default -- in other words, if you typed ``format" at a C:\> prompt, the C:\ drive & everything on it was history. This was a Known Problem for several revisions of DOS (I think it was fixed in 4.0, but it could have been as late as 5.0), one that forced the clued to do all sorts of interesting things (e.g., rename the command, delete the command, substitute another binary for this one) to keep the newbies from toasting their data.
2) Marketing Reasons. About the time Melissa first wreaked havoc, someone asked the folks at Microsoft why Active X was turned on by default. ``We consider that an important feature," was the reply. In other words, the questionable usefulness of embedding fonts & animations in a given email outweighed the clear risk of malicious code. Newbies want 3l373 + k3wl stuff, & will pay for the new revision; sysadmins are expected to wade thru the poor documentation to support these purchases.
3) Lack of skill. Microsoft got its start in the world of microcomputers, which barely had the horsepower to run one application at a time. (Yes, there were TSR applications, but they were a bug that creative non-Microsoftie hackers turned into a feature. And were the door that allowed computer viruses to get into the OS.) Programmers at MS wrote their OS & flagship applications before they had learned how to write software that shared computer resources with other applications or users. And as we saw in #1, unless absolutely forced to, MS programmers never went back & rewrote old code, so their flagship applications like Word, Excel & so forth still don't play nice in a multi-tasking, multi-user environment.
Actually, to say they ``don't play nice" is a misnomer: they don't know how to play at all with anything else in that environment. Not only do they fail to share resources, they don't know when these resources are unavailable -- or what to do if the same have been tainted by malicious code. And since the programmers who developed & maintained these older products never learned how to do this, the new programmers -- & the new products in multi-tasking, multi-user environments -- also fail to properly interact with other software in this operating space.
4) All of the Above. Accepting the validity of any one reason above does not exclude the others, AFAIK.
Geoff
Re:Linux virus #1 (Score:3)
But we should give RMS and JWZ et al their due: I have not lost even one byte of data using emacs or xemacs(*). I don't even remember the last time emacs crashed during an editing session. It's easily the most stable large program I've ever used.
Compare that to Microsoft Word, which I use about 1% of the time I use emacs or xemacs, and you'll cry.
D
(*) Okay, a slight exaggeration - I've probably lost 1k or so due to power outages that caused my machine to abruptly stop while I was editing. I can't blame that on emacs!
----
Mandrake (Score:3)
I know of people who run as root all the time... Perhaps I'll write them a virus just to prove to them they are stupid... No, I won't; they might spread it...
--The knowledge that you are an idiot, is what distinguishes you from one.
Are Users Still Clueful Enough? (Score:3)
Of these, I'm skeptical of 1 and 3.
Is 1 still the case, as more and more people are learning Linux at home, with no experience of an actual multiuser UNIX system? Mightn't there be enough people routinely running as root these days to invalidate the barriers of Linux's design?
2 is perfectly reasonable, though--as others have already pointed out--there's nothing to keep that from changing in the future.
As for 3, isn't there a potential (I don't know if it's already been tried yet) for deceptive "open source" software with the binary not actually derived from the provided source? Folks who download and compile the source would be safe, but folks who download the executable get a nasty surprise.
Linux needs capabilities (Score:3)
Linux is not a good environment for viruses, but it's not impervious either. Even a half-assed capabilities system would greatly improve Linux virus security.
For example, how often do you use "su; make install"? That hands over full authority to do anything. It would not be all that hard to hide, say, literal strings of Perl bytecode in a deeply recursive make, that search all *.tar.gz|*.tgz files for just such a deeply recursive make and hide itself in the ones it finds (cryptic nonsense marked with cute yet unhelpful comments is nothing new to free software; if it was obfuscated to look like a cute piece of ASCII art, it might not even need to justify its existence as part of the project). Combine this with infecting key utilities, like gcc and make, and you've got yourself an annoyingly persistent and sneaky virus.
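A toy illustration of how much authority "su; make install" hands over. The "system" directory here is a scratch path, so nothing real is touched, but run as root the same recipe could write anywhere on the filesystem:

```shell
# A Makefile whose install target quietly drops an extra file -- the kind
# of step that hides easily inside a deeply recursive build.
work=$(mktemp -d)
fake=$work/fake-usr-local        # stands in for /usr/local
mkdir -p "$fake/bin"

# Recipe lines in a Makefile must begin with a literal tab, hence printf.
printf 'install:\n\tcp /bin/sh %s/bin/myapp\n\ttouch %s/bin/.payload\n' \
    "$fake" "$fake" > "$work/Makefile"

make -C "$work" install
ls -a "$fake/bin"                # myapp is there -- and so is .payload
```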
Even though it would be more useful to have a full capabilities system, like in EROS [eros-os.org], a good "execute with permissions + limited capabilities" utility could prevent root-mode installation infections.
For example: capsdo -cu -wnf /usr/local/bin -cwd /usr/local/lib -c "make install"
meaning, run "make install" as the current user (-cu), except that it can write new files (-wnf) to /usr/local/bin and create new directories to which it has full write access (-cwd) in /usr/local/lib (of course, it would require your root password to run). Not that this would be easy to write. It would have to sit between the app and the kernel, filtering actions.
Another way safety might be improved (at the admin level) is to create an "installer" group that has access to the "/usr/local" tree, and a new user in the group for each new installation, none of which gives write access for its files to any other user. A root utility could create and manage these pseudousers without bothering the admin. However, this would do nothing for holes like running SVGAlib games.
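Part of that isolation can be approximated today without any new utilities, by giving each installation its own prefix and symlinking it into shared space (this is roughly what GNU Stow automates). A sketch, with /usr/local played by a scratch directory:

```shell
# One directory per package: auditing or removing a package is then a
# single rm -rf of its own tree, and packages can't overwrite each other.
root=$(mktemp -d)                       # stands in for /usr/local
mkdir -p "$root/packages/foo-1.0/bin" "$root/bin"

# "Install" the package entirely inside its own prefix...
printf '#!/bin/sh\necho foo 1.0\n' > "$root/packages/foo-1.0/bin/foo"
chmod 755 "$root/packages/foo-1.0/bin/foo"

# ...then expose it with a symlink instead of copying into shared space.
ln -s "$root/packages/foo-1.0/bin/foo" "$root/bin/foo"

"$root/bin/foo"    # runs via the symlink
```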
Until MS comes along ... (Score:3)
I was wondering; Lotus 1-2-3 and WordPerfect have macros too, why didn't anyone ever write viruses for those?
A point was missed (Score:3)
These factors lead me to believe that we will see virus attacks. They can potentially be nasty, but they will be squashed rather quickly as well. I also have some theories about possible targets for the attacks that I don't want to publically discuss.
Well, well, well (Score:3)
If more "average users" were to turn to Linux, we would see more security holes introduced for the sake of convenience, more binary-only programs, more handy macro-script options, and inevitably more viruses.
Yes, BUT ... (Score:3)
For a single-user desktop environment, it is the less experienced user who goes root to install exciting new packages just downloaded from a not-too-safe site. It would help if he could install 'not-safe' binary packages in 'user space' (e.g., a subdirectory of his home directory) and then, once he trusts them, re-install them in 'root space'.
Even if the virus successfully infects a program owned by the user, its task of propagation is made much more difficult by the limited privileges of the user account.
Even if it cannot (easily) spread using programs owned by root, it can damage the user's files!
My 40 lire (hopefully soon 0.2 Euro): viruses thrive on computer users' ignorance. To fight the viruses, educate the computer users.
Interesting (Score:4)
Anyhow
Don't expect to ever see serious server side Linux virus outbreakes, but end user Linux is a trojan horse waiting to happen, IMHO.
Bad Mojo
It forgot ACLs (Score:4)
There are two threats to that, of course: (a) people start running every silly thing as root (which will rise the more of a "desktop OS" Linux becomes), and (b) folks who are into cracking become virus writers and use exploits to propagate stuff around.
Re:Until MS comes along ... (Score:4)
Writing a macro virus for 1-2-3, Quattro or WordPerfect was well-nigh impossible, because the macro facilities just weren't up to it. I tried, but never succeeded (and I used to write a lot of WordPerfect macros back around 1989).
The first macro virus I saw was one I wrote myself and distributed to a selected few people on the CIX system (Dr Solly included) back in '91 or so, when Word 1 first shipped. I was tired of hearing "You can't transmit viruses by email" arguments, because even if you couldn't, it was only a matter of time before you could. Word 1 macros were sufficiently powerful (albeit buggy) to do this.
When OLE Automation finally started to work right (about '94 ?) and especially when mail user agents (like Outlook or some MAPI clients) started to offer an API that was usable from Word, then things really took off (especially for self-propagation).
I'm continually surprised just how primitive most macro viruses are. If you wanted to be a total Gibsonian Super-Bastard, then there's a lot more scope for havoc than is being used even yet. Cross-Office viruses scare the hell out of me, especially if they can travel via PowerPoint and the most technically illiterate of the userbase.
So where does this leave Linux ? Well Linux already does have two powerful vectors for virus havoc (shell scripting and Perl) that are already reasonably likely to be available to anything executing under the user's shell. It doesn't need a WP macro language to find itself a home.
I'd agree that Linux is generally more secure at present (higher competence, compilation from source, user permissions being sub-root) but isn't the very acceptance of Linux going to be indicated by all 3 of those being eroded ?
Can you imagine your parents running Windows ? Can you imagine them running Linux ? Can you imagine them compiling under a store-bought Linux distro and a "just slap in the CD" install ?
Re:Maybe I'm paranoid BUT (Score:4)
The people and pizza hut have been pissin' me off lately. Anyone know of a virus that will access a users modem and call pizza hut and order a bunch of pizza to people that don't exist?
The Pizza Virus effect could be great for a lot of people. 1) More wasted food means better prices for farmers. 2) More wasted food means more work for sanitation workers. 3) Somebody might be thinking "hey, I want a pizza" and suddenly, the pizza virus will unexpectedly deliver a pizza to their door. I guess the people at Pizza Hut wouldn't like it much, but they are bastards anyway, so screw them.
Linux virus #1 (Score:5)
Oh good, we can all relax now (Score:5)
This is a pretty bad article IMHO. It is clearly meant as a rebuttal against what Garfinkle [slashdot.org] wrote. But it is pretty bad.
For a Linux binary virus to infect executables, those executables must be writable by the user activating the virus. That is not likely to be the case. Chances are, the programs are owned by root and the user is running from a non-privileged account. Further, the less experienced the user, the lower the likelihood that he actually owns any executable programs. Therefore, the users who are the least savvy about such hazards are also the ones with the least fertile home directories for viruses.
This describes the typical Unix situation, which is not the typical Linux situation. There, more people have installed their own system and have root privileges. And the less savvy the user, the bigger the chance that the root user is the only account on the system.
Linux networking programs are conservatively constructed, without the high-level macro facilities....
Very true, but seconds later
Linux applications and system software is almost all open source. Because so much of the Linux market is accustomed to the availability of source code, binary-only products are rare and have a harder time achieving a substantial market presence. This has two effects on the virus. First, open source code is a tough place for a virus to hide.
Yeah right, so first it says that high-level scripts may be a source of viruses, but then when you have source code (e.g. in Makefiles, which are high-level), viruses are all of a sudden less likely. I am still afraid that I will come across a Makefile someday that holds the line:
install: rm -rf /
Is this not a virus? If not, why is it a virus if a similar line is contained in some malicious Word macro?
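One modest line of defense against exactly this: preview what an install target would run before handing it root. "make -n" prints the recipe without executing it (a hostile Makefile can still hide things in sub-makes or run code at parse time, so treat this as a sanity check, not a guarantee):

```shell
# Dry-run the install target of an untrusted Makefile in a scratch dir.
work=$(mktemp -d)
printf 'install:\n\trm -rf /some/path\n' > "$work/Makefile"

# -n prints the commands that WOULD run; nothing is actually removed.
make -n -C "$work" install
```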
No reason to worry about Linux viruses yet, but mostly because the platform is not popular enough to have a widespread effect (and this is the real lesson of zoology, viruses in nature are mostly used by evolution to limit large populations. This is why there are mostly Windows viruses; evolution wants to limit its growth).
App virii and hubris (Score:5)
Open source kills bugs DEAD! But folks who insist on distributing compiled versions of their code apparently do not want the advantage of infinitely shallow bugs, and virus protection to boot.
The article points out that access protection keeps a virus confined within the user(s) that initially bring it onto the system. As Linux becomes more and more popular, new users running as root will multiply, making the installed Linux base more prone to virus infection from compiled whizz-bang apps that newbies will download.
New users may run as root because they don't know any better. They don't have to learn about access protection, chmod, or other UNIX complexity.
rm -rf works and there's no doubt, when you run as root.
Slightly-less-than-new users run as root for the illusion of competency. This is where the danger lies. Arrogance is harmful until you have the experience to back it up. Then it becomes confidence, and pride no longer requires running as root always, just to tweak a config file sometimes.
For the record, Linux DOES suffer from one virus. GPL.
Re:What Viruses are out there? (Lookee here!) (Score:5)
http://virus.beergrave.net
its owner has several interesting (low-level, assembler/C, ELF) documents with Linux viruses and descriptions. Find them here:
http://www.big.net.au/~silvio
Also, there's a linux virus at
http://www.mixter.org
For more low-level linux stuff go to
http://hculinux.cjb.net
A word of warning... (Score:5)
Of course I'm all for writing about virus warnings, technical considerations and the sort, but, IMHO, we must keep our tone down and speak with humility, and not even suggest for a minute that a successful Linux virus is not possible. The ability of humans to do the impossible is a big part of the reason why Linux exists, and to be honest, I started using Linux BECAUSE most people (used to) think it would fail.
I personally think the open source movement, and the whole Linux phenomenon, is a serious and professional one, and unless treated that way it will probably fall for the same reasons other ventures are falling today (that is, if you, like me, think that Windows won't last that long). If more serious consideration had been given to viruses when they first showed up (not mainstream), Windows would probably be much more protected against them than it is (but then again, maybe not. thanks bill).
Anyway, that's just my $0.02.