
First Botnet of Linux Web Servers Discovered 254

The Register writes up a Russian security researcher who has uncovered a Linux webserver botnet that is coordinating with a more conventional home-based botnet of Windows machines to distribute malware. "Each of the infected machines examined so far is a dedicated or virtual dedicated server running a legitimate website, Denis Sinegubko, an independent researcher based in Magnitogorsk, Russia, told The Register. But in addition to running an Apache webserver to dish up benign content, they've also been hacked to run a second webserver known as nginx, which serves malware [on port 8080]. 'What we see here is a long awaited botnet of zombie web servers! A group of interconnected infected web servers with [a] common control center involved in malware distribution,' Sinegubko wrote. 'To make things more complex, this botnet of web servers is connected with the botnet of infected home computer(s).'"
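The article notes the rogue nginx instance serves malware on port 8080. As a rough illustration (not from the article), an admin could probe a host for listeners on ports a plain Apache box shouldn't be using; the port list below is just an example:

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                found.append(port)
    return found

# Ports an admin might not expect a web-only server to be listening on.
suspicious = open_ports("127.0.0.1", [8080, 8000, 6667])
print(suspicious)
```

A real audit would use `ss -tlnp` or `netstat` on the box itself, since a connect scan only sees what a firewall lets through.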
This discussion has been archived. No new comments can be posted.


  • Re:Reporters Fail (Score:4, Interesting)

    by 99BottlesOfBeerInMyF ( 813746 ) on Saturday September 12, 2009 @02:58PM (#29400103)

    Well, you are assuming that calling a machine a bot is dependent on the fact it was infected.

    Not really. Calling a machine a bot or zombie is generally an indication that it is part of the regular "peon" layer of a botnet. Technically, the control channel, the update channel, and the terminal machines the operator uses are all part of the botnet. They just are not generally referred to as bots, because they are part of the system doing the controlling rather than the end systems used to launch attacks.

    My main point was that the summary and title here led readers who use those terms in the usual way to think that is what was happening. The comments from researchers led people to think that. That is why this was news. It's not news to discover that Linux systems hacked by hand are being used to control Windows bots; that happens all the time and is, perhaps, the most common kind of botnet.

  • by eln ( 21727 ) on Saturday September 12, 2009 @03:07PM (#29400167)
    A Windows machine being run by someone who cares about security and updates it regularly won't end up in a botnet either, so I'm not sure what your point is.
  • by CAIMLAS ( 41445 ) on Saturday September 12, 2009 @03:33PM (#29400379) Homepage

    Really, this is a pretty trivial "jump" from the normal way of things.

    You've got manually installed rootkits, and most of them have C&C tools. How is this much different, other than optimizing the C&C mechanisms? There's nothing here to suggest this is anything "new": the mechanisms, whatever was used, still appear to be tightly constrained to "manual rootboxing", a time-consuming process compared to a "real" automated botnet.

    All evidence points to this being more of someone's "pet" botnet than it does any sort of improvement on the malware concept. Same old thing, different implementation. Let me know when there's a polymorphic, multi-OS botnet with a non-distributed model and pluggable payload and vector - which uses traffic heuristics to hide its traffic on a network and runs "quiet" (compared to common botnets/worms). Then I'll start being concerned.

  • by bjourne ( 1034822 ) on Saturday September 12, 2009 @04:00PM (#29400525) Homepage Journal
    Well, it seems that stupid people actually *build* linux too!
  • by drougie ( 36782 ) on Saturday September 12, 2009 @04:06PM (#29400565) Homepage

    It's nice to be able to apt-get yourself the latest stable copies of apache2 and php5 and mysql and postfix with just a command or two, and just as nice to apt-get upgrade them after you've apt-get updated. The people who maintain, clean, and contribute to the large public repositories behind apt and yum and rpm and pkg_add are good people, and they generally do a bang-up job for 99% of the Linux and UNIX and UNIX-like folks. However, when you maintain servers that are not completely hidden behind a NAT with these tools for years, and only once in a blue moon compile something you downloaded in a gzipped tar, you put yourself on admin autopilot, and that can bite you in the ass.

    I'll give you one example: I installed RoundCube, the most badass webmail client there will ever be, ever, with apt (the first time). Ran it for a while without incident. I had my system on weekly cron apt updates, so I figured I was safe. Eventually I discovered someone had made it onto my system and put a malware-installing js line in my web pages. Looking through the guy's bash history, I discovered they got in through a RoundCube vulnerability. I checked RoundCube's site, something I should have done first thing but did not, and it turns out their stable version was much newer than what apt knew about. The vulnerability would not have been on my system had I downloaded straight from their site five months earlier and stayed on the ball with their support resources, things that feel less necessary when you just let apt-get rip.

    Bottom line: apt-get update/upgrading with the default Debian sources.list would not patch a glaring vulnerability in software I originally found with apt, and I doubt it would have on most other distros' package management systems. It wasn't RoundCube's fault; the patched release had been their stable build for a long time, but I was left wide open to anyone who browsed a rootkit site and googled for RoundCube hosts, and I got nailed. I learned my lesson, and I don't fault the repository maintainers for being a bit behind the ball on less popular software in their enormous archives. But if you ask me, software should not be available in a distro's default repositories unless the maintainers are confident they can keep it up to date, or have some way to be quickly and effectively notified by the authors/vendors when a critical upgrade is available and to push it live right quick. Put it on the people who want to install such software themselves; if they can make it past that hump, I'd say their odds of running the software safely are substantially higher than Joe Yum's. And spreading awareness of cvs/svn would be nice too.

    Can't believe I just admitted I got compromised.

  • by bbernard ( 930130 ) on Saturday September 12, 2009 @04:39PM (#29400747)

    Absolutely! There's plenty of stupid to go around.

    1. Where was the firewall admin to prevent external systems from connecting to these webservers over port 8080?
    2. Why did the admins use insecure tools or insecure systems to allow their credentials to be sniffed?
    3. Where was the IDS/IPS to notice the sudden change in traffic?
    4. Where was the load balancer/reverse proxy to intercept this junk?
    5. Where was the routine review of logs to notice the dynamic DNS updates from computers with (presumably) static DNS entries somewhere?
    6. Where was the periodic pen/vulnerability test against these systems?
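    As a sketch of point 1 above (rule names and policy are assumptions, not from the thread), an iptables-restore rules file could allow the legitimate web ports and drop the port the rogue nginx was using:

```shell
# Sketch only: write a minimal iptables-restore rules file that permits
# real web traffic and drops inbound connections to the malware port.
cat > /tmp/webserver.rules <<'EOF'
*filter
# Allow legitimate HTTP/HTTPS to the webserver.
-A INPUT -p tcp --dport 80 -j ACCEPT
-A INPUT -p tcp --dport 443 -j ACCEPT
# Block the port the malware-serving nginx was listening on.
-A INPUT -p tcp --dport 8080 -j DROP
COMMIT
EOF
# Loading them requires root:  iptables-restore < /tmp/webserver.rules
grep -- '--dport 8080' /tmp/webserver.rules
```

A default-deny INPUT policy would be stronger still; this fragment only shows the specific hole the article describes being closed.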

  • by asdf7890 ( 1518587 ) on Saturday September 12, 2009 @05:06PM (#29400915)

    There is SFTP. But I don't know many providers that offer it. I avoid FTP in all cases and use SSH and SSHFS to talk to and transfer files to and from my servers.

    I also use Linux on my home machines (including my laptop).

    SSHFS will most likely be using SFTP or SCP underneath. While you could do the work SSHFS does with clever redirection of stdin and stdout, it would be more complex and error-prone than just using SFTP or SCP, which are both usually implemented as subsystems of SSH and are provided by most SSH servers unless explicitly turned off (so if your provider gives you SSH access, the chances are high that you have SFTP and SCP access too).
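    The "subsystem of SSH" point above can be checked in an sshd_config: OpenSSH enables SFTP via a `Subsystem sftp ...` line. A small sketch (the sample config text is invented for the demo):

```python
def sftp_subsystem_enabled(sshd_config_text):
    """Return True if an uncommented 'Subsystem sftp ...' line is present."""
    for line in sshd_config_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("#"):
            continue  # commented-out directive, ignore
        parts = stripped.split()
        if len(parts) >= 2 and parts[0].lower() == "subsystem" and parts[1] == "sftp":
            return True
    return False

# Invented sample config resembling a stock Debian sshd_config.
sample = """
Port 22
PermitRootLogin no
Subsystem sftp /usr/lib/openssh/sftp-server
"""
print(sftp_subsystem_enabled(sample))  # True
```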

  • by thejynxed ( 831517 ) on Saturday September 12, 2009 @06:10PM (#29401259) Homepage

    There is an argument to be made that plain-text passwords should never be allowed to begin with, never mind which platform, third-party software, or hardware architecture a system is built on.

    That being said, there could be just a wee tad bit of blame laid at the feet of the programmers of the software/hardware for allowing this to be possible in the first place.

    Hindsight is so useless :P

  • by drougie ( 36782 ) on Saturday September 12, 2009 @06:30PM (#29401367) Homepage

    Firstly, it's my fault for running a webmail client I found by browsing apt-cache, installed with apt-get, and configured mostly with dpkg-reconfigure, instead of grabbing the official current build and reading the readme and man pages and FAQ, and for doing this on a somewhat important machine. I did the same thing with Gallery and PHPNuke several years ago, even webmin in my reckless and stupid experimental days. That's painting a target on yourself to get malware on your sites and start running IRC bots or worse. Have you looked at some of these rootkit sites? It's disturbing how finding and proliferating vulnerabilities in Linux, not just MS, is a full-time hobby/living for so many people. Then you install something like snort from apt-get, thinking "Yeah, I'm on top of my security now," with no idea that you're running a six-month-old release with a demo package of ancient rules, when it needs heavy configuration that dpkg doesn't handle, fresh rules from a subscription, and a key in the right place to be effective.

    That said, yeah, Debian has a reputation for waiting a ... conservative amount of time before making new releases of software available in its repositories, whether it's gimp or gaim or kde or nmap. Maybe I assumed that by deliberately (?) lagging a little behind the rest of the world, for the sake of not shipping anything that might contribute to snafus, Debian was actually doing what's best for me. Maybe my RoundCube adventure was anomalous. Regardless, I love Debian, and I certainly love apt (so much that I just tried Debian kFreeBSD to hang onto apt). By naming the package management systems of the other distros/OSes, I was trying to make another point: Linux is becoming too easy. Lower learning curve, more people who may make my mistake and surrender their machines to China, Russia, and 4chan by installing the wrong package.

    It would be great if apt had svn/cvs-like behavior embedded in it, to check whether everything on your system is up to date by consulting not just Debian's repositories but servers maintained by the developers themselves. I wouldn't expect apt to then install the newer version, just to let me know what it found so I could deal with it myself. Maybe such a thing already exists -- guess I should apt-cache search it. :P
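    The kind of check the parent is wishing for boils down to a version comparison between the repo's package and upstream's. A naive sketch (real Debian ordering is `dpkg --compare-versions`, which also handles epochs and non-numeric parts; this dotted-number comparison is only an approximation, and the version numbers are invented):

```python
def version_tuple(v):
    """Very rough version parser: split on dots and compare numerically.
    NOT the full Debian comparison algorithm (no epochs, no ~, no letters)."""
    return tuple(int(p) for p in v.split(".") if p.isdigit())

def upstream_is_newer(installed, upstream):
    """True if the upstream release sorts after the installed one."""
    return version_tuple(upstream) > version_tuple(installed)

# Hypothetical numbers: the repo ships 0.1.1 while upstream stable is 0.2.2.
print(upstream_is_newer("0.1.1", "0.2.2"))  # True
```

In practice a tool like this would pull the installed version from `dpkg-query` and the upstream one from the project's release feed, then only report, never install.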

  • by Darkk ( 1296127 ) on Saturday September 12, 2009 @07:30PM (#29401675)

    Not entirely true about Linux servers. When I build them I usually install Webmin, which lets me manage the server via a web GUI. Yes, I know if I were a real linux geek I'd do everything on the command line, but when I can't remember the proper CLI sequence it's easier to just use the web GUI.

    Recently I built a linux webserver with RAID 5 drives. I'd read the docs on how to create a RAID 5 array, but that took a while; with the RAID 5 module in Webmin I did it in 5 minutes. I've also set iptables rules to only allow certain IPs to access Webmin, and changed the ports.
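    For the RAID 5 array mentioned above, the capacity rule of thumb (one disk's worth of space lost to parity) takes only a few lines to check; the disk sizes are just example figures:

```python
def raid5_usable_gb(disk_sizes_gb):
    """RAID 5 usable space: (n - 1) * smallest disk, with n >= 3 disks.
    The smallest disk caps the per-disk contribution; one disk's worth
    of capacity across the array holds parity."""
    if len(disk_sizes_gb) < 3:
        raise ValueError("RAID 5 needs at least 3 disks")
    return (len(disk_sizes_gb) - 1) * min(disk_sizes_gb)

# Example: four 500 GB disks -> 1500 GB usable, 500 GB spent on parity.
print(raid5_usable_gb([500, 500, 500, 500]))  # 1500
```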

    So there are options for those who want some kind of a GUI. The CLI is also susceptible to trouble for those who don't pay attention to the modules being installed on the linux server, i.e. untrusted programs, etc.

  • Re:Reporters Fail (Score:3, Interesting)

    by 99BottlesOfBeerInMyF ( 813746 ) on Saturday September 12, 2009 @09:48PM (#29402273)

    'Botnet' has never meant 'auto-infected' and if they assumed that, they were careless.

    No, botnet means a network of automatically controlled computers, but in general when you describe a botnet, especially when referring to the OS, you refer to the OS of the bots, which make up the majority, not the OS of the select few control-channel systems.

    The summary makes no attempt to fool them into thinking anything other than the facts.

    The title was, "First Botnet of Linux Web Servers Discovered". It didn't say first botnet of Windows machines controlled by ten Linux Webservers. It isn't the first botnet that includes Linux Web servers, those are actually quite common. Thus the average person who knows what they're talking about assumes it is the regular bots which must be running Linux, since otherwise the title makes no sense. You don't think that is misleading?

    Besides which, at this point, we don't -know- how it spreads. We just know that it exists... Which to me, is news.

    Well, no, we don't know for sure, but we do know what is likely. Given that only a few servers have been hacked, and given that the attack seems to target server operators who FTP files and steals their passwords, it is probably just dumb admins who don't protect their credentials and who use root for FTP operations. In the past, servers have also been compromised by a number of web server exploits. The only thing that differentiates this botnet from any other is that the control channel is networked to load-balance the phishing server. Aside from that, nothing about this botnet is even out of the ordinary. To call it the first botnet of Linux servers is disingenuous by every definition of a botnet that doesn't count multiple Linux web servers controlling a bunch of Windows boxes, but does count multiple Linux web servers that control a bunch of Windows boxes while randomly passing traffic to each other to balance the incoming connections.
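    The credential-sniffing guess above works because FTP sends USER and PASS as cleartext commands, so anything that can observe the traffic can harvest logins. A sketch of pulling credentials out of a captured command stream (the capture text is invented):

```python
def harvest_ftp_credentials(stream):
    """Pick (user, password) pairs out of a cleartext FTP command stream."""
    user = None
    creds = []
    for line in stream.splitlines():
        cmd, _, arg = line.strip().partition(" ")
        if cmd.upper() == "USER":
            user = arg
        elif cmd.upper() == "PASS" and user is not None:
            creds.append((user, arg))
            user = None
    return creds

# Invented capture showing why sniffed FTP gives up root credentials.
capture = "USER root\r\nPASS hunter2\r\nCWD /var/www\r\n"
print(harvest_ftp_credentials(capture))  # [('root', 'hunter2')]
```

SFTP or FTPS closes exactly this hole by encrypting the command channel.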

  • by Bill, Shooter of Bul ( 629286 ) on Sunday September 13, 2009 @02:42AM (#29403299) Journal

    I was involved in investigating a compromised Linux-based web server. Basically it all went down like this:

    They hired a stupid guy to install some ad banner on their site. His Windows-based computer was infected with several viruses. He downloaded the full site, at which point the virus inserted some malicious code into the website's code. He re-uploaded the whole thing, and bang, the website was infected. The Linux server was still the thing infected, but Windows was the primary infection vector, combined with the stupidity of the computer's owner.

  • by Matz0r ( 324905 ) on Sunday September 13, 2009 @06:42AM (#29404105)

    Manually compromising servers and installing a tool that causes all those servers to rendezvous with or receive commands from a central control point to execute instructions would make them a botnet.

    The key question would be: do the compromised servers also run a program that periodically polls a control station for commands, or does the script kiddie manually command individual compromised servers?
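    The polling behaviour that question hinges on can be sketched harmlessly with an in-memory command queue standing in for the control station (every name here is invented for illustration):

```python
import queue

def poll_for_commands(control, handler, max_polls):
    """Periodically ask the control point for work; give up after
    max_polls checks. A benign stand-in for the rendezvous loop a bot
    would run, where `handler` would be the payload dispatcher."""
    executed = []
    for _ in range(max_polls):
        try:
            cmd = control.get_nowait()
        except queue.Empty:
            continue  # nothing queued this cycle; a real bot would sleep here
        executed.append(handler(cmd))
    return executed

control = queue.Queue()
control.put("report-status")
control.put("update-self")
print(poll_for_commands(control, str.upper, max_polls=5))
# ['REPORT-STATUS', 'UPDATE-SELF']
```

A manually driven compromise skips this loop entirely: the attacker connects in and issues commands by hand, which is why the distinction matters for calling something a botnet.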

    I actually encountered this a few years ago: a Red Hat box had been carelessly placed on the internet with a poor DBA username/password combo. The attacker had not gained root access, but he did manage to install zombie software on the computer in /var/tmp, which consisted of a small web server serving malicious code and a custom SSL IRC client configured to connect to the botnet owner's IRC server.

    Curious, I took a copy of the software he had installed before I wiped the server, then connected to his IRC server using the credentials found in the zombie software. I ended up in an IRC channel with the actual owner of the botnet sitting there. Because I had kept my server's original IRC name, he started prodding me with DCC commands to find out the status of his returning zombie. After a while I responded and told him he had been discovered; we had a brief chat before he banned me from the IRC server. He seemed like a script kiddie, using "LOL" in every sentence and lots of numbers, and the net seemed to be run manually, with some 30 "clients" in it. I gave his client IP to his ISP in Romania together with the logs, though I doubt anything came of it.
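    Spotting a payload like the one above in /var/tmp often comes down to noticing executables where none belong. A sketch of that check (run here against a scratch directory rather than the real /var/tmp, with invented file names):

```python
import os
import stat
import tempfile

def executables_in(directory):
    """List regular files in `directory` with any execute bit set."""
    hits = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        mode = os.stat(path).st_mode
        if stat.S_ISREG(mode) and mode & (stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH):
            hits.append(name)
    return hits

# Demo against a throwaway directory standing in for /var/tmp.
demo = tempfile.mkdtemp()
open(os.path.join(demo, "notes.txt"), "w").close()        # harmless data file
with open(os.path.join(demo, "httpd-fake"), "w") as f:    # planted "zombie"
    f.write("#!/bin/sh\n")
os.chmod(os.path.join(demo, "httpd-fake"), 0o755)
print(executables_in(demo))  # ['httpd-fake']
```

Real incident response would also compare file hashes against package databases, since attackers routinely name payloads after legitimate daemons.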
