
First Botnet of Linux Web Servers Discovered 254

The Register writes up a Russian security researcher who has uncovered a Linux webserver botnet that is coordinating with a more conventional home-based botnet of Windows machines to distribute malware. "Each of the infected machines examined so far is a dedicated or virtual dedicated server running a legitimate website, Denis Sinegubko, an independent researcher based in Magnitogorsk, Russia, told The Register. But in addition to running an Apache webserver to dish up benign content, they've also been hacked to run a second webserver known as nginx, which serves malware [on port 8080]. 'What we see here is a long awaited botnet of zombie web servers! A group of interconnected infected web servers with [a] common control center involved in malware distribution,' Sinegubko wrote. 'To make things more complex, this botnet of web servers is connected with the botnet of infected home computer(s).'"
  • by gmuslera ( 3436 ) on Saturday September 12, 2009 @02:28PM (#29399885) Homepage Journal
    "With about 100 nodes". The average Windows botnet (at least the ones that make it into the news) has from hundreds of thousands to millions of nodes. I'm not sure how "automatic" the creation of this botnet was, or how much risk it poses to generic Linux users. Considering how some servers are installed, and how careless some admins are about "security", it's not surprising that a few out there could be rooted.

    In fact, if those servers already ran Apache and some old, vulnerable web application that let an attacker upload and execute binaries, then on kernels (2.4 and later) that haven't been patched recently there are ways to escalate privileges and get root to install whatever is needed. But normal users on modern distributions, and admins who care even a little about security, are probably safe.
  • Reporters Fail (Score:5, Informative)

    by 99BottlesOfBeerInMyF ( 813746 ) on Saturday September 12, 2009 @02:34PM (#29399925)

    The only part of this article that is news is the part that is incorrect. Botnets of Windows machines often have compromised Linux servers working as a control channel or update channel. It is not at all unusual. What would be unusual would be for a worm or virus to actually compromise Linux machines in an automated fashion and make them bots. That does not seem to be what has happened here as the Linux systems seem to have been manually hacked in a normal, directed attack.

    Basically, nothing new or newsworthy happened here, except that someone mistakenly referred to the compromised Linux servers as bots.

  • by Timothy Brownawell ( 627747 ) <tbrownaw@prjek.net> on Saturday September 12, 2009 @02:36PM (#29399937) Homepage Journal

    This isn't technically a botnet: [...] These are simply rootkitted servers and they appear to have been done manually. The unique aspect of this is that it seems to be coordinated,

    Which is what makes it a botnet.

    so the MS astroturf team has decided to call it a "botnet".

    "define: botnet" [google.com] ... I see nothing in there that precludes manually-compromised systems.

  • by KDingo ( 944605 ) on Saturday September 12, 2009 @02:52PM (#29400063)

    If your customers put up vulnerable software on your shared, dedicated, or virtual hosting service and they don't update it or you don't detect it, someone's going to find it and exploit it.

    Had something similar happen to me. If you're monitoring server load, a webserver sending spam will definitely raise an alarm. As for services on odd ports, block everything except the real ones. Blocking outgoing traffic on IRC ports also helps minimize damage. The script kids are already making use of the recent Linux local root exploit (wunderbar_emporium), so make sure you do some yum updates!
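
    The odd-port check described above can be sketched as a quick local scan. This is a rough illustration in Python; the host, the set of expected ports, and the scan range are placeholders, not details from the article:

    ```python
    import socket

    def unexpected_open_ports(host, expected, candidates, timeout=0.2):
        """Return ports in `candidates` that accept TCP connections but aren't expected."""
        found = []
        for port in candidates:
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.settimeout(timeout)
            try:
                if s.connect_ex((host, port)) == 0 and port not in expected:
                    found.append(port)
            finally:
                s.close()
        return found

    # Example: a web host should only answer on 22, 80, and 443; anything else
    # listening (like a rogue nginx on 8080) is worth a closer look.
    # print(unexpected_open_ports("127.0.0.1", {22, 80, 443}, range(1, 10000)))
    ```

    A scan like this only sees what is listening at that moment; pairing it with firewall rules that drop everything unexpected covers the gaps.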

  • Re:Reporters Fail (Score:3, Informative)

    by c6gunner ( 950153 ) on Saturday September 12, 2009 @03:39PM (#29400405) Homepage

    We just know that it exists... Which to me, is news.

    It shouldn't be. Or, at least the general concept shouldn't be. The original IRC bots were written to run on *nix, because they were meant to be used for channel control/moderation, and so needed to run on an always-on server. Which usually meant a shell account on a linux or BSD machine. Small channels only employed one bot, but larger ones used several working in tandem. So, really, the earliest bot-nets were all *nix based - they just weren't malicious.

  • by Anonymous Coward on Saturday September 12, 2009 @03:43PM (#29400437)

    Look at it this way. The servers in question form a network of bots. I don't think that is in any way debatable. Botnet seems to be a good shorthand version of "network of bots". Hence, the servers in question form a botnet.

  • by mysidia ( 191772 ) on Saturday September 12, 2009 @03:47PM (#29400455)

    No. Manually compromising servers and setting up nginx on them to serve files does not by itself make them a botnet. But whether something is a "botnet" has nothing to do with the infection vector.

    It refers to compromised machines that have a certain 'intelligence' so that they form a network of their own, and allow the botmaster to easily deploy new instructions to them all. And all bots will execute the new instructions automatically.

    Manually compromising servers and installing a tool that causes all those servers to rendezvous with or receive commands from a central control point to execute instructions would make them a botnet.

    The key question would be: do the compromised servers also run a program that periodically polls a control station for commands, or does the script kiddie manually command individual compromised servers?

    If the servers only run nginx to serve files, or just periodically pull new files to serve from other servers (even a central one), then no, they're not a botnet, even if they've been backdoored so the blackhat can come back later and upload new malware files.

    To be a botnet, there must be a mechanism by which a botmaster can deploy instructions or code to a control point, and the nodes will automatically perform the instructions as directed.
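
    The distinction drawn above, between nodes that automatically fetch and act on a botmaster's instructions and servers that are each driven by hand, can be sketched in a few lines of Python. All class and method names here are illustrative; nothing is taken from the actual malware:

    ```python
    class ControlPoint:
        """Central point where a botmaster deploys one instruction for all nodes."""
        def __init__(self):
            self.command = None

        def deploy(self, command):
            self.command = command

    class Node:
        """A compromised host that periodically polls the control point."""
        def __init__(self, control):
            self.control = control
            self.executed = []

        def poll(self):
            cmd = self.control.command
            if cmd is not None and cmd not in self.executed:
                self.executed.append(cmd)  # a real bot would act on it automatically

    # One deploy at the control point reaches every node on its next poll:
    # that automatic fan-out is what makes the network a botnet.
    control = ControlPoint()
    nodes = [Node(control) for _ in range(3)]
    control.deploy("serve new payload")
    for node in nodes:
        node.poll()
    ```

    Without the poll step, the blackhat has to log in to each server individually, which is just a collection of rooted boxes, not a botnet in this sense.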

  • by mysidia ( 191772 ) on Saturday September 12, 2009 @04:00PM (#29400523)

    Botnets do not have to be self propagating. The very first botnets were on IRC.

    Where in fact, the machines weren't compromised. The owners of the machines actually ran the code (commonly Eggdrop) and voluntarily joined their bots to the botnet. They weren't even malicious.

    The term "botnet" does not imply a network of compromised hosts, or even malware. It refers to a network of robotic agents that are in communication with each other.

    Botnets were commonly used to form shared "party lines", to allow people to DCC CHAT their Eggdrop bots and communicate with people visiting from other channels, and other IRC networks.

    At first these were used only for communication; people joined the botnets to chat with each other, and there was no way to control other bots.

    At some point, some of the botnets got pretty large...

    Some of the botnets had a feature where a trusted "bot owner" or "bot master", as they were called, could be made a "botnet admin" by bots they were peering with... allowing these botnet admins to command other hosts to do certain things on IRC.

    Some botnets had member nodes run scripts that were able to do things like pingflood a user off IRC.

    This would commonly be used if some bad boy had taken over a popular channel. Ping flooding a user off IRC is unwelcome from the victim's point of view, but at times it may have been used to counter other hacking techniques that the "victim" of the flood had been using to sabotage IRC channels.

    At some point, some IRC botnets started getting formed whose sole purpose was to flood.

    Eventually the term escaped IRC... other types of botnets started forming like Peer to Peer ones, smart ones that automatically added nodes (instead of two botnet admins deciding to interconnect), and botnets whose sole purpose was to accept commands from a central point.

    But the point is, the notion of a "Bot" and a "Botnet" has an origin that causes the term to not imply self replication.

  • by corychristison ( 951993 ) on Saturday September 12, 2009 @04:07PM (#29400571)

    Absolutely. It also mentions that they were FTP passwords. FTP is all in cleartext, no encryption or obfuscation.

    There is SFTP, but I don't know many providers that offer it. I avoid FTP in all cases and use SSH and SSHFS to talk to and transfer files to and from my servers.

    I also use Linux on my home machines (including my laptop).

  • by corychristison ( 951993 ) on Saturday September 12, 2009 @04:15PM (#29400607)

    Actually, the article says that FTP passwords were used. Meaning they were probably sniffed either on the FTP user's personal computer, or on the wire somewhere between the user and the server on one of the hops, which could be dangerous.

    Moral of the story, use SSH!

  • by Atlantis-Rising ( 857278 ) on Saturday September 12, 2009 @04:46PM (#29400793) Homepage

    Did you see 'often' in the definition? 'Often' =/= 'always'.

    The definition you yourself presented suggests that a botnet can be formed of automatically spread programs but does not have to be.

    Moreover, there is no part of the term 'bot' that suggests it requires automatic propagation. I have an IRC bot running right now. It does not go out and spread itself. It is merely a mechanical/electrical agent which operates autonomously in response to higher-level commands from me, just like any robot.

  • by mcrbids ( 148650 ) on Saturday September 12, 2009 @05:04PM (#29400893) Journal

    Back around 2001, I found a "botnet" comprising a perl script that ran on websites. Because it ran as a child of Apache, it showed up as "http" in ps. It would log into an IRC server, and wait for commands which appeared to be little more than arbitrary bash commands that were shelled out.

    Bone-headedly simple. Ran well on any unix website host running perl scripts, installed via an insecure formmail.pl script. I penetrated the IRC network and watched for a few hours while the operator attacked a few hosts. There were some 50 hosts or so. Then I killed the script and updated all copies of formmail.pl hosted on the server...

    Is this new news?

    What's next? "Hammers can be used to smack things, even if they aren't nails." !?!?!

    Truth is this: no operating system is 100% secure. But this "botnet" isn't necessarily even a compromise of the operating system! Port 8080 is above 1024, so non-root processes can open sockets there. This may be nothing more than something like the perl script I mentioned, having nothing to do with the operating system in question. The server wasn't compromised; a bad script was running that had to be deleted, then killed with an Apache restart.
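
    The point about port 8080 is easy to verify: on Linux, any unprivileged process may listen on ports at or above 1024, so a rogue server on 8080 does not by itself imply a root compromise. A small Python check (the port numbers are just examples):

    ```python
    import socket

    def can_listen(port):
        """Return True if the current user can open a listening TCP socket on port."""
        try:
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.bind(("127.0.0.1", port))
            s.listen(1)
            s.close()
            return True
        except OSError:
            return False

    # Port 0 asks the kernel for any free unprivileged (high-numbered) port;
    # this succeeds for any user. Ports below 1024, e.g. 80, normally require
    # root (or CAP_NET_BIND_SERVICE) on Linux.
    ```

    So an nginx answering on 8080 only shows that some process, possibly an unprivileged one, was started on the box, not that root was obtained.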

    Given the parameters I just mentioned, there isn't an Operating System around that would stop this from happening. It's just that the "Mom's basement" fanbois get all riled up because it's gospel that Linux is immune to $allBadThings.

  • by Anonymous Coward on Saturday September 12, 2009 @05:10PM (#29400943)

    In any case, yes, it absolutely has to be a network robot to be a bot, and those are by definition automatically spread, not manually propagated. That's the "bot" part of the term network robot.

    I am sorry, in this case you are wrong. The "bot" in botnet means that there are a lot of robots in a network doing some kind of coordinated task.

    So it is not the propagation method but rather how it works that is referred to when we call it a botnet.

    Of course, the preferred way to set up a botnet is to use some kind of automated approach, since you want numbers for it to be effective, and that is usually achieved more easily with automated attacks like viruses or trojans than with old-fashioned sniffing and cracking.

  • by Anonymous Coward on Saturday September 12, 2009 @05:48PM (#29401141)

    I can't imagine how you came to the conclusion that the fault was with *apt* of all things.. did you think it works by magic? Blame the Debian "It's not moldy, so it's not for us" maintainers instead, or even yourself for using a distribution known to ship ancient software no longer supported by upstream.

  • Re:Reporters Fail (Score:2, Informative)

    by UnderLoK ( 552056 ) on Saturday September 12, 2009 @05:54PM (#29401177) Homepage Journal
    100% agree, man. I was at ThePlanet, and Rackshack and Rackspace before that, and at each of those hosts it was a constant to have tons of boxes on your network brute forcing because they had already been rooted. Granted, this wasn't just brute force; they would often exploit holes in SSH, Apache (the most common, I would say), and similar services. I find this article suspect because I know full well these boxes (mine too, at one point) were part of choreographed DoS attacks perpetrated by an individual. This was in 2000 or 2001, and until I quit running boxes in those locations in 2005 I continued to see it happen.
  • by Bigjeff5 ( 1143585 ) on Saturday September 12, 2009 @11:12PM (#29402645)

    It's been true since after 2000.

    Granted there have been some remote code execution exploits, but the number of those is miniscule compared to someone with a poorly configured box clicking something they shouldn't have clicked, and then saying "yes" when the thing they shouldn't have clicked wanted to install something they shouldn't have installed.

  • Comment removed (Score:3, Informative)

    by account_deleted ( 4530225 ) on Saturday September 12, 2009 @11:43PM (#29402749)
    Comment removed based on user account deletion
  • by vginders ( 521915 ) <sergeNO@SPAMvanginderachter.be> on Sunday September 13, 2009 @06:32AM (#29404071) Homepage

    Ngix or whatever it's called is clearly a bot,

    It's called Nginx (http://nginx.net/) and it's a well known HTTP and proxy server.

    any program that recieves input and performs a task fits that definition

    Isn't that also some kind of definition of every networked software?

  • Use the source, Luke (Score:4, Informative)

    by petrus4 ( 213815 ) on Sunday September 13, 2009 @08:28AM (#29404475) Homepage Journal

    Let this be a lesson to everyone who reads the article. Security is not something that happens by accident.

    I've said for a long time that binary packaging is, fundamentally, a Hell-spawned abomination masquerading as a convenience; incidents like this only prove the point.

    Compile yourself a minimalistic base system, a la Hardened Linux From Scratch [linuxfromscratch.org].

    Then get the absolute minimum number of packages you need for a working system, such that you've got some chance of keeping them updated. Firefox for web browsing, maybe. A single media player; VLC or Xine. Vim/Emacs as an editor. OpenOffice.org if you need that. Whatever servers you need, but keep that list small. A firewall, which is hopefully obvious.

    Use a minimal window manager which doesn't have a dep list as long as your arm, as well. I use Ratpoison. Do not laugh until you've tried it. It is very, very fast, and resource consumption is virtually nil. It's basically an X version of GNU Screen.

    Once you've got this small list of packages, take full, ruthless, practical advantage of the fact that your system is open source. Subscribe to the announce or bug related mailing lists for the apps you've got, and keep local virgin tarballs. This way, whenever there is a bug or potential exploit, and the patch gets posted within a few minutes or hours, you can get it the moment it goes to CVS, patch your own source tarball, and recompile. The same goes for the kernel itself.

    You won't be vulnerable to exploits, because you'll get the solutions to them as they are implemented, and you're also far less likely to end up with a compromised machine as a result.

    Brainless Windows refugees who will sneer at me, and/or complain about how this isn't "user friendly": don't even bother. This post isn't for you. We already know that you've committed yourselves to being servile, unthinking sheep, and you are therefore invited to accept the consequences of your (lack of) actions in that regard.

  • by Wuhao ( 471511 ) on Sunday September 13, 2009 @12:14PM (#29405487)

    Not to pressure one too much, but automaton is rooted in Greek, not Latin. :)
