Red Hat Advanced Server Gets DoD COE Certification

DaveAtFraud writes "CNET is reporting that Red Hat Advanced Server has been certified as a 'Common Operating Environment' (COE) when running on an IBM server by the U.S. Department of Defense. Red Hat Advanced Server is the first version of Linux to receive this certification. The certification clears the way for broader use of Linux in government computer systems. It's interesting to note that the certification effort was made for the more proprietary (and costlier) Red Hat Advanced Server and not the basic Red Hat distribution." This despite the best efforts of certain lobbyists.
  • could anyone who knows their stuff about redhat tell me the level of security it's got in relation to other distros and OSes ?
    • Re:Security? (Score:5, Interesting)

      by terraformer ( 617565 ) <tpb@pervici.com> on Wednesday February 12, 2003 @08:33AM (#5286920) Journal
      Well for example, I just installed the latest Mandrake distro and any service I installed was turned on by default. In RH 8.0 you can install any service/package available but nothing is enabled unless you choose to after install. That is one of the cornerstones of security. Only turn on what you need. Just because I want something installed does not mean I want it turned on right now. I should not have to remember to go through and turn off everything just to have stuff sit on my drive until I am ready to configure and harden it.

      Anyhow, all these distros really have in common is the kernel code, which makes them Linux. The rest of the bundled software (FTP, wm's, editors) is up to the bundler. It is these choices that can make one distro more secure than another. E.g., ssh vs. telnet, std ftpd vs. vsftpd, vi vs. emacs (sorry, I just had to ;-}), et al. The DoD is going to certify the whole bundle and not just individual pieces. Basically, they don't trust their admins (contractors mostly) to pick the right pieces on their own, so they will find a good bundle and certify that with special instructions.
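The "only turn on what you need" policy above is easy to audit from a script. A minimal sketch that filters `chkconfig --list`-style output for a given runlevel (the `enabled_in` helper is a hypothetical name, not a Red Hat tool, and the sample data is invented):

```shell
# Print services enabled in a given runlevel, reading
# `chkconfig --list`-style lines on stdin.
enabled_in() {
    level=$1
    # Each line looks like: "sshd  0:off 1:off 2:on 3:on 4:on 5:on 6:off"
    grep "${level}:on" | awk '{ print $1 }'
}

# Self-contained demo on sample output (on a real box, pipe in
# `chkconfig --list` instead):
sample='sshd            0:off   1:off   2:on    3:on    4:on    5:on    6:off
telnet          0:off   1:off   2:off   3:off   4:off   5:off   6:off
postfix         0:off   1:off   2:on    3:on    4:on    5:on    6:off'
echo "$sample" | enabled_in 3
```

Anything this prints that you didn't deliberately enable is a candidate for `chkconfig <service> off`.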

      • by Anonymous Coward on Wednesday February 12, 2003 @08:47AM (#5286998)
        vi v. emacs
        I thought Linux could only address 4GB of memory. If this is the case, how is it that emacs can run on one of these computers? What changes did Redhat make to allow this to occur?

        Thanks in advance.
        • http://www.gnu.org/software/emacs/emacs.html#Platforms

          and

          ftp://ftp.redhat.com/pub/redhat/linux/8.0/en/os/i386/RedHat/RPMS/emacs-21.2-18.i386.rpm

        • Redhat switched to the 2.4 kernel which can address 64GB of memory. So now you only need 4GB of swap space to run emacs :)
      • Re:Security? (Score:3, Interesting)

        by Judebert ( 147131 )
        I work on a government project using, and interfacing to, DII COE machines. They're all Sun/Solaris, and when you install the DII COE software you actually modify the kernel. Other DII COE software (called "segments") depends on the DII COE kernel. So I doubt Linux is going to get any real play for a while.

        As far as security goes, I doubt the government will worry much about the bundled software; they generally disable everything they're not interested in and install their own segments for the functionality they need. While that does mean that the production systems probably won't have my favorite applications (because they haven't been ported to DII COE segments), at least my development systems can have what I want and still closely match the production systems. Heck, I could even develop at home.

        That said, getting *any* version of Linux certified is great for me. I expect most of the Solaris segments will run with very little modification, so my development environment can very closely match my production environment. And the performance benefits I get from running on x86 hardware -- not to mention cost benefits -- will be phenomenal. (Given the recent revelations concerning Java and Solaris [slashdot.org], running under a different OS is welcome as well, since a large part of our software is affected.) I might even get to use bash! And vim! (And emacs, for the heathens. Or your editor of choice.) And gcc!

        I expect Linux will win its place in the DII COE hierarchy, and sooner rather than later. In fact, at least one very important DII COE segment is already adding Linux support. My job is about to get a whole lot easier.

      • That deal with all services being on by default is kinda misleading. The reason is that several steps later in the install you get a list of the possible startup services, most set on, and you can now deselect the ones you don't want running. If someone just blows by this very obvious screen, then yes, selected services will be running upon reboot.


        Kinda hard to miss this screen however. I always turn off everything except sshd, postfix, and a few other nice services.

    • could anyone who knows their stuff about redhat tell me the level of security it's got in relation to other distros and OSes ?

      It is more than just a security concern to become a Common Operating Environment. Coding custom applications is always risky business because the OS can be a moving target. Coding custom to Linux can be nightmarish because it is not "a floating target" but rather "multiple bogies". RH Advanced has a feature freeze on the distribution; every RHAS server has the same hooks and APIs as every other. That, more than anything, is what did it for them. RH will not contact you next week and tell you that they changed their mind on the kernel version, etc. Instead, in a year or so you might get RHA 2.0.

      For development purposes this is good news.
  • by Anonymous Coward
    It is near impossible to use most open-source software in a cost-effective way under those regulations. Give it a read and then move on to their understanding of software verification.

    The whole open-source model just doesn't fly.
  • The Notion (Score:1, Funny)

    by Anonymous Coward
    The obvious notion: "Not that I really care about military level security for my home computer, but it would be kind of cool to have."
  • by jht ( 5006 ) on Wednesday February 12, 2003 @08:03AM (#5286793) Homepage Journal
    It's interesting to note that the certification effort was made for the more proprietary (and costlier) Red Hat Advanced Server and not the basic Red Hat distribution

    Why is this even worth noting? Certification efforts aren't especially cheap. If you're going to expend time and resources getting a version of your product certified, why not put the effort into the version that is likeliest to generate enough revenue as a result of the certification to pay for the effort?

    After all, while RedHat is in relatively good financial condition, it's not like they have around $40 billion in the bank (unlike some operating system companies). Certifying Advanced Server is a good use of limited resources.

    That said, any government security certification is a Good Thing in the commercial marketplace, too - it helps when the engineers need to make a positive case to their PHB's, and gives one more "checklist item" that can get marked in their favor when comparing RH to other vendors.
    • by Jim Hall ( 2985 ) on Wednesday February 12, 2003 @08:13AM (#5286846) Homepage

      It's interesting to note that the certification effort was made for the more proprietary (and costlier) Red Hat Advanced Server and not the basic Red Hat distribution

      Yes, it costs more. But it's about the same as (or less than) support & licensing costs for "big UNIX" like Solaris.

      I think it's incorrect to label RHAS as "proprietary". It's based on a Red Hat Linux boxed set, but I believe they bundle in software from partners.

      Each release of RHAS has a longer lifecycle (something like 14-18 months) so you don't have to upgrade every 6 months when the new Red Hat Linux comes out. You do get a "stepped-up" version of their Red Hat Network support, which we currently use on their boxed sets to stay up to date with errata.

      • I think it's unfair to compare RedHat AS to Solaris. First of all, Solaris has gobs of system management tools, a kernel with many tricks up it's sleeve and full UNIX98 compliance. And second, Solaris has a lifecycle of 11.5 years, while RedHat AS has only a 3-year lifecycle.
        • Have you actually used RedHat? Having been a sysadmin for both platforms I can tell you that RedHat's stuff is a bit nicer because they tend to sift the best of the best from the OpenSource community.

          Besides, last I checked Sun was hawking Linux.

          • OH yes, I am using RedHat 6.2 and occasionally RH 7.1 every day for several hours, and I started using RedHat with RH 4.1.

            Having managed both environments for several years, I came to the conclusion that Linux in general has a broken development model. RHAS will hopefully stabilize that. As for RH Linux, I like Slackware much better.

            (but I already said what I think re. Solaris vs. Linux)
            • Ouch, I've stepped into that one.

              Allow me to grovel a bit and take my lashings for doubting your experience. You get so many folks who bitch about the one time they installed it on their mom's old 486 and try to extrapolate out.

              I too have some issues with RedHat. I personally don't use the graphical tools. I hack the config files with my own home-rolled Tcl/Tk scripts. I just like having a stable and supported set of binaries to build on. That, and what project these days doesn't put out a RedHat-compatible RPM?

              Where I run into trouble is downloading the source and compiling it. I must have 4 different copies of Tcl installed on my system between the Tcl that comes with Linux, the development version I compile myself to write extensions, the version ActiveState puts out, and the somewhat self-contained one that is bundled with Tcl/Tk. (Not to mention a few other applications.)

              I have also managed to shoot myself in the foot trying to do package management myself. I have an automated process that downloads the patches and distributes them to my Linux cluster for installation. The problem is that a few of the patches have royally crufted my network.

              I also have to apologize for confusing Solaris with SGI. I have a bunch of O2's that are gathering dust because they are obsolete and a bitch to keep running. We have a pair of Solaris boxes for our Weather system and I have rather liked working on them.

        • by ewilts ( 121990 ) on Wednesday February 12, 2003 @01:09PM (#5289020) Homepage
          RHAS does not have only a 3-year lifecycle. It's 5 years from initial release, based on this official document: http://www.redhat.com/apps/support/errata/rhlas_errata_policy.html [redhat.com]

          Comparing that to Solaris, I have no idea where you pulled out the 11.5-year life cycle. According to Sun's web page, it's 5 years from last ship date. Reference this page: http://wwws.sun.com/software/solaris/fcc/lifecycle.html [sun.com]

          I will admit that 5 years from last ship is greater than 5 years from initial ship, but there's no way in hell it's an 8.5 year delta like you're trying to claim.

          Ya know, "gobs of system management tools" and "a kernel many tricks up it's[sic] sleeve" don't exactly add up to much of a review :-). I believe I can honestly claim that Red Hat Linux Advanced Server has "gobs of system management tools" and "a kernel with many tricks up its sleeve". Of course, this claim holds true for Windows too.

          How you got moderated to 2 on your post is beyond me...

          • I had not yet seen the document you pointed me to. I had read another one, which I guess has been updated since, that implied a total lifecycle of 3 years (from GA to end of maintenance support). According to the document you pointed to, it's indeed an interval of 5 years, and the following text spells that out most clearly:

            Red Hat Linux Advanced Server 2.1:
            General Availability: May 17, 2002
            Full Support (including hardware updates): May 17, 2002 -- November 30, 2004
            Deployment Support: May 17, 2002 -- May 31, 2005
            Maintenance Support: June 1, 2005 -- May 31, 2007
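Those dates can be sanity-checked with a quick calculation (a sketch assuming GNU `date`; the `-d` flag is a coreutils extension):

```shell
# Lifecycle of RHAS 2.1: GA (2002-05-17) through end of
# maintenance support (2007-05-31).
ga=$(date -u -d 2002-05-17 +%s)
eol=$(date -u -d 2007-05-31 +%s)
days=$(( (eol - ga) / 86400 ))
echo "$days days, roughly $(( days / 365 )) years"
```

This prints 1840 days, i.e. the 5-year interval the errata policy describes.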


            However, the Solaris lifecycle, in the same terms (general availability to end of maintenance support) is 10 years [sun.com] which is twice the joy.

            I will admit that these terms look much more favourably on RHAS, though. Thanks for the link.
          • Solaris last ship date (LSD) happens some time after the second release following the version in question. Solaris releases come more than two years apart. That means that typically a Solaris release ships for about five years. Add to that another five years of promised support -after- LSD, and you can see that most Solaris versions are supported for at least 10 years after the first ship date.

            For example, Sun is still shipping Solaris 7 which first was shipped in 1998. At the LSD date of Solaris 7, it will be about five years since the first ship date.

      • It is also worth noting that you don't patch to the latest kernel with AS2.1. When RHAS comes out with a version (currently v2.1), it comes with a slightly patched kernel (of course, patches are available over the 'net in accordance with GPL). v2.1 currently comes with 2.4.9, but woah! 2.4.20 is out, right?

        RedHat might patch their 2.4.9 kernel to fix serious bugs, but they have only certified against the version that came out of the box.

        I've been using AS2.1 for several months now, and I haven't been disappointed. If anything, now my employer "has someone to sue" if the OS doesn't work right. Wasn't that part of the hesitation for larger corporations in adopting Linux?

    • by Pharmboy ( 216950 ) on Wednesday February 12, 2003 @08:18AM (#5286865) Journal
      Why is this even worth noting? Certification efforts aren't especially cheap. If you're going to expend time and resources getting a version of your product certified, why not put the effort into the version that is likeliest to generate enough revenue as a result of the certification to pay for the effort.

      After all, while RedHat is in relatively good financial condition, it's not like they have around $40 billion in the bank (unlike some operating system companies). Certifying Advanced Server is a good use of limited resources.


      Amen. Their "more expensive" version is what makes them money, not the free version. Certification of Advanced Server doesn't take away from the benefits of their downloadable version, or other distros, in any way.

      If Linux is going to take hold, SOMEONE has to make money with it. People just miss the point: OS software is free as in speech, NOT as in beer. OSS doesn't mean everyone just walks around and works for free. It means programmers contribute code for "free", but make money when they support this code (and the code others contributed "free") to end users. When they add value to it.

      If the GPL did not allow anyone to make any money, in any way, we would not be here talking about Linux.

      • "OS software is free as in speech, NOT as in beer."

        Yes, that sometime is true of "open-source" software but free software [gnu.org] is free as in speech, AND as in free beer.

        • by Pharmboy ( 216950 ) on Wednesday February 12, 2003 @08:30AM (#5286912) Journal
          Yes, that sometime is true of "open-source" software but free software [gnu.org] is free as in speech, AND as in free beer.

          And it is not likely to ever get certified because there is no way to recover your costs.

          The point being made here is that Linux is being certified, making it better able to compete with Microsoft in the marketplace. The point isn't to argue over semantics.

          • It's only "unlikely", as you claim, if the US govt never changes the method of getting OS certification. Although it is currently expensive to get the certification for an OS in the US, the method may change and the costs may fall.

            In some other countries, there is no certification process to go through, and OS software and free software are already used in applications which in the US would normally require certification.

        • Not true: from their site [fsf.org]: . . .

          ordering [fsf.org] manuals, t-shirts and especially CD-ROMs from the FSF. Most of the FSF's funds come from selling copies of things that everyone is free to copy.


          • Despite the fact that free software is sometimes sold, e.g. by the FSF, the Gnu General Public License [gnu.org] guarantees that free software has the legal property that nobody is allowed to prevent anyone from distributing any free software completely free-of-charge, even free software that is being sold by someone else.

    • As a Redhat shareholder, I certainly appreciate the decision to certify the more expensive system.
    • It's interesting to note that the certification effort was made for the more proprietary (and costlier) Red Hat Advanced Server and not the basic Red Hat distribution
      You're right, it does cost more... for the first copy! After that, it's free.

      However if you want support for it, it will cost you about $1200 per machine per year. This is cheaper than most other OS's.

      Personally, I think you would be better served developing in-house resources for the support, but that's just me.

      I'm also not necessarily happy with RH's choices on some packages to include in AS. The one that jumps out at me is choosing to use a beta version of an ntp4 [ntp.org] release as opposed to simply using whatever was the stable version at the time.

      And yes, I work somewhere that is probably going to implement hundreds of copies of RH AS, and pay for the support.

    • by salimma ( 115327 ) on Wednesday February 12, 2003 @08:47AM (#5287001) Homepage Journal
      Not to mention that the certification is only valid for a specific version of the OS (what Microsoft neglected to say back when they were selling NT 4.0 was that it's NT 3.5 that is C2-certified).

      The Advanced Server is released every year and a half or so - the desktop OS every six months. Personally I find it a very agreeable deal - the free users get faster releases and contribute towards bug testing, the paying customers get what they want: slower but longer-supported (and now certified too) releases.
    • It's worth noting because it's fun to be somewhat of a troll when you post an article. It definitely stirred the conversation.

      Oops. Did I say that?
    • by Anonymous Coward
      Do you realize how much money the military is spending on Solaris and NT systems to run their apps? Do you realize that the ONLY reason, in many instances, they aren't running Linux instead is because it has never been COE compliant/certified?

      This isn't about some moron IT guy in a green/navy jumpsuit deciding to use NT because he likes it, it's about the moron IT guy having no choice in the matter because he HAS TO deploy a COE compliant system.

      This is big news for Linux.

    • It's interesting to note that the certification effort was made for the more proprietary (and costlier) Red Hat Advanced Server and not the basic Red Hat distribution ... jht responds: Why is this even worth noting? Certification efforts aren't especially cheap. If you're going to expend time and resources getting a version of your product certified, why not put the effort into the version that is likeliest to generate enough revenue as a result of the certification to pay for the effort?

      You are too kind to these people, jht! If they want the $40 version certified, Slashdot whiners should start a fund to PAY $$$$$$ for the certification (I am sure RedHat would be overjoyed that the Linux community would donate so much money to them), and while they are at it, I would suggest a fund to get Debian certified too. Do I hear silence from the whiners... I thought so.

  • by i_want_you_to_throw_ ( 559379 ) on Wednesday February 12, 2003 @08:07AM (#5286815) Journal
    I use it on a box to run apps that I developed that our M$ monkeys haven't matched (or can't match). Mainly a lot of situations where one line of code does what would take several more in M$ (Scheduler vs. cron).

    In our case it comes down to services. I work for the Commanding General and all he wants is "services not platforms".

    I think maybe that has helped to bring in open source in our little corner of the military more than anything. IM talks about how they are M$-certified, blah blah, and I just bring out a new app coded in Perl that the green suiters can't live without.

    Or better yet create one and let it run on one of my own outside servers and then demo it to them with a "Oh by the way, we need Linux to do this".

    It's like heroin, get 'em hooked. They gotta have it. Superior services, not platforms.

    As far as it being the more expensive version of RH that's certified, have you seen RH's stock price? You're still saving the military a lot more in the long run by getting the more expensive version.
    • by syle ( 638903 ) <syle AT waygate DOT org> on Wednesday February 12, 2003 @08:37AM (#5286945) Homepage
      I use it on a box to run apps that I developed that our M$ monkeys haven't matched (or can't match).

      ...I just bring out a new app coded in Perl that the green suiters can't live without.

      How do these things relate to Linux? No one's arguing that it isn't a good development environment, but perl runs in Win32 fairly easily.

      You say superior services, not platforms, but it sounds like you're taking programs that could otherwise be cross-platform and using them to push Linux for its own sake. Or are you doing something with perl that would tie it to Linux?

      (Ready to be modded into oblivion for implying that Linux should exist just for its own sake...)

      • by syle ( 638903 )
        shouldn't exist for its own sake. bleh.
      • Because linux (or any unix really) is a far superior application development and execution environment for the kinds of applications the Original Poster is probably talking about. The Unix toolset is *available* in some cases on NT (I've developed with perl et al on both platforms), but that should not be mistaken for it being *optimized*. Further, Unix/Linux is far easier to admin (speaking as someone who has admin'd both), has lower hardware costs (for x86 Unix), and generally higher throughput. This is of course leaving aside that Unix is *far* easier to secure than NT...

        Don't mistake me for a Unix zealot, if MS came out with something better I'd use it in a heartbeat. But I live in the real world, and I solve real problems under real time and budgetary constraints. Unix lets me solve those problems on spec, on time, and under budget... NT doesn't.

      • How do these things relate to Linux? No one's arguing that it isn't a good development environment, but perl runs in Win32 fairly easily.


        Have you tried to use perl on Windows?
        It just isn't the same. Perl proggies typically make heavy use of syscalls such as "fork" and "pipe".

        Performance of these under Windows is atrocious, not to mention that the whole Windows filesystem/exec path is shockingly low-performance.
        (It's not designed to be used in the way perl programs typically use it.)

        perl is seemingly perfect for Linux, with its low forking overhead (comparable to creating a thread or LWP on other OSen) and its I/O subsystem performance.

        Programming, even in high level languages, is a totally different ballgame under windows, if you want performance. You have to do it differently.
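The fork-overhead point can be probed directly. A rough sketch (the helper name is invented, `date +%s%N` assumes GNU coreutils, and absolute numbers vary enormously between systems, so treat the output as illustrative only):

```shell
# Average the cost of n fork+exec round trips, in microseconds.
fork_cost() {
    n=$1
    start=$(date +%s%N)
    i=0
    while [ "$i" -lt "$n" ]; do
        /bin/true        # each iteration forks and execs one process
        i=$((i + 1))
    done
    end=$(date +%s%N)
    echo $(( (end - start) / n / 1000 ))
}

echo "avg fork+exec: $(fork_cost 100) us"
```

On Linux the per-fork cost is typically well under a millisecond, which is part of why fork-heavy perl idioms feel cheap there.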

  • by sczimme ( 603413 ) on Wednesday February 12, 2003 @08:09AM (#5286829)

    Read the RH press release here [redhat.com].
  • by TheMidget ( 512188 ) on Wednesday February 12, 2003 @08:09AM (#5286830)
    ... isn't that the same certification as the one we scoffed at [slashdot.org] when Windows 2000 got it?
    • by nemaispuke ( 624303 ) on Wednesday February 12, 2003 @08:23AM (#5286885)
      You are talking about two different things: Common Criteria is about security, and Common Operating Environment is a military standard for mission-critical applications (Command and Control, Intelligence, etc.). What it means is that if you use applications designed for Motif/CDE and use COE as a standard, they can run on RedHat Linux Advanced Server. This is more about functionality than security.
    • Actually no, this is a lesser certification. Linux has never achieved any security certifications of any kind, while MS has, starting with NT4. NT4 and W2K have also held this certification for some time. So, once again, Linux playing catch-up. Next thing you know, various distributions will even try to match the look and feel of Win-- oops, already happened.
      • ....security certifications of any kind while MS has starting with NT4.

        Are those only valid if NT is NOT connected to any network? Isn't that the only configuration that was certified? Or do I have some facts incorrect?
        • You are correct

          Micro$oft's marketroids have been making a Big Deal out of their C2 certification for years, but have never bothered to mention that their systems only pass C2 if they're not connected to a network, are in a locked room with armed guards outside the door, and are powered off.

          OK, just kidding about the last two criteria. But the part about not being connected to a network is no joke.

        • You are incorrect. At one time the first certification NT4 had was one that didn't involve being connected to a network. Current NT4 certifications are fully networked. Windows 2000 Server is also certified to a higher level with networking.
    • Of course this rating has no intrinsic value.

      It's simply a barrier to entry that has to be dealt with. This only means that there is one less bullsh*t excuse for someone to not use Linux.
  • Egads, I'm afraid one of our WMDs will be shot at DC!


    This program has committed a General Protection Fault and will fire ICBMs at DC. If the problem persists, quit calling Microsoft a monopoly.

  • Not seeing it. (Score:2, Informative)

    by smcdow ( 114828 )
    I've been tracking the status of COE compliance for Linux for a while -- I have several projects in the works that would benefit greatly from an "official" designation of COE compliance for Linux from DISA.

    I can find only one relevant page [disa.mil] on DISA that pertains to Linux/COE. This page has a link to a draft of COE Compliance Criteria for Linux. The information on this page hasn't changed in several months, AFAICT.

    So, what's new here? Can anyone point me to a place on DISA that substantiates the claims made by the news.com article? Where is the "real", final COE Compliance Criteria for Linux?

    • Not only this, but COE has something called a segment that allows pieces of software to be installed in the base OS in a common manner. My question is where are the COE segments for pieces of software like Oracle 9i (to be a segment in late March I hear, but still no mention of a Linux segment) and others that run under Linux?! If your project wants to be COE compliant (and there are various levels, but we are shooting for 5), you need to use COE compliant software. What good is an OS if you can't run some common apps on it?!

      If we could find segments like that, we could actually consider running Linux in our project. Until then, it will have to be Solaris and a 280R.....
      • COE Segments (Score:3, Informative)

        by zaytar ( 139318 )
        Disclaimer - I work for the DoD but i don't speak for them.

        "Segments" are basically customized software installs for COE. This includes Government produced software (Government Off the Shelf, GOTS) and commercial software (Commercial Off the Shelf, COTS). For instance there is a "segment" that installs Netscape.

        These segment installs basically install the software such that it conforms to the COE environment. For example, applications must live in a certain path, follow a certain naming scheme, use certain environment variables to find things, only put user data in a certain place, etc. Think "rpms" or FreeBSD packages - segments are just big tarballs with a standardized format and install scripts :)

        The segments are available via DISA to those programs that are developing COE software - you have to show proof of need and sponsorship (i.e. somebody has to pay somewhere along the way for you to have access). Basically if you are developing applications for the DoD, you can get them - we have to get them through a certain chain of command. I think vendors can get access, but you have to talk to the DISA folks about how that works.
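As a rough picture of what a segment amounts to mechanically (every name, path, and descriptor field below is an invented illustration, not the actual DISA format):

```shell
# Build a toy "segment": a tarball with a fixed layout, a descriptor,
# and an install script, loosely mimicking the scheme described above.
make_segment() {
    name=$1; out=$2            # $out must be an absolute path
    tmp=$(mktemp -d)
    mkdir -p "$tmp/$name/bin" "$tmp/$name/SegDescrip"
    # Descriptor: says what the segment is and where it must live
    printf 'NAME=%s\nPREFIX=/h/%s\n' "$name" "$name" \
        > "$tmp/$name/SegDescrip/SegInfo"
    # Install script: unpack into the agreed-upon location
    printf '#!/bin/sh\ncp -r %s /h/\n' "$name" \
        > "$tmp/$name/SegDescrip/PostInstall"
    chmod +x "$tmp/$name/SegDescrip/PostInstall"
    ( cd "$tmp" && tar cf "$out" "$name" )
    rm -rf "$tmp"
}

make_segment demo /tmp/demo-segment.tar
tar tf /tmp/demo-segment.tar
```

The point is the standardization: an installer can treat every segment identically because the layout and hooks are fixed.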
  • by imag0 ( 605684 ) on Wednesday February 12, 2003 @08:16AM (#5286857) Homepage
    Here's a better link to story, sans linkspam:

    http://news.com.com/2102-1001-984202.html [com.com]

    COE? Here's the link to their homepage:

    http://diicoe.disa.mil/coe/ [disa.mil]

    Admins! Get your fucking heads out of your asses and check to see if something is linkspam before posting it. This isn't the first time. Someone is making money from the click through.
    Fuck them.
    • Sometimes I wish that Taco et al were as vicious as irc ops, you could use a kick you whiner.

    • This is a major achievement for Linux, seeing that the only UNIX-based system that is DII-COE compliant is Solaris. However, anyone who has ever had to read the DII-COE compliance documentation knows that it is ambiguous and very hard to follow. It's easy enough to make any OS installation noncompliant by adding non-DII-COE-approved software, or by accidentally opening up a port or two on the system.
  • How to get it? (Score:3, Interesting)

    by haggar ( 72771 ) on Wednesday February 12, 2003 @08:21AM (#5286874) Homepage Journal
    RH Advanced Server has generated some ill-will in our company when we realized the only way to "have a peek" was to shell out 800 buxors. We did that, but the venom dented some people's enthusiasm.

    Is there a way to get the .iso image, under a non-commercial license of some sort? I mean, shit, even Solaris 9 is available for 20 bux as a non-commercial, and 100 bux for commercial license.
    • I was wondering this just the other day. I found a "developer's" version that costs under $100, but I can't find the damn page anymore! Grr.

      They also have AS running in the HP Test Drive site (http://www.testdrive.compaq.com/). Unfortunately, it appears that Test Drive registration is disabled until the end of this week.

      -fp
    • Re:How to get it? (Score:4, Informative)

      by Anonymous Coward on Wednesday February 12, 2003 @08:40AM (#5286954)
      Nonsense.

      Anyone can download it for free from Red Hat.

      You just don't get the support for free.

      Mirrors: http://www.redhat.com/download/mirror.html

      Check the "enterprise" directory.
      • Re:How to get it? (Score:3, Interesting)

        by fuzzyping1 ( 266783 )
        Are there any download sites with the binary RPMS? Everything I've seen is SRPM only.

        -fp
        • If you want to save money to evaluate this product, you need to build it yourself.

          RedHat is under no obligation to provide free binaries, just free source files.

          Hey, they even helped you a bit by providing SRPMs instead of tar files.
      • SRPMs only (Score:1, Informative)

        by Anonymous Coward
        Anyone can download it for free from Red Hat.

        Have you actually tried this? There's nothing but source RPMs.

      • Anyone can download it for free from Red Hat.
        Can someone mod the parent down? As pointed out in several other replies to the parent, there are NO .iso files available for the Advanced Server. Just the SRPMs.
      • Re:How to get it? (Score:3, Informative)

        by Nohea ( 142708 )
        I compiled the SRPMS myself and installed. Not easy, but it worked.

        - Download
        - rebuild all the SRPMS on Red Hat Linux 7.2 (seemed to be the closest)
        - look at the errors from missing devel packages
        - install *-devel rpms
        - rebuild again
        - rpm -Fvh *.i386.rpm
        - rpm -ivh the redhat-release package

        No installer seemed to be included.

        Then repeat every time a patch SRPM is released!

        Maybe it's worth the $800.
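        (A rough sketch of that rebuild loop, with paths assumed; on 7.2-era systems the rebuild command was `rpm --rebuild`, which later moved to `rpmbuild --rebuild`:)

```shell
# Rebuild every Advanced Server SRPM; failures usually mean a missing
# *-devel package -- install it, then re-run the loop.
for srpm in *.src.rpm; do
    rpm --rebuild "$srpm" || echo "FAILED: $srpm" >> rebuild-failures.log
done

# Freshen: upgrade only the packages that are already installed
rpm -Fvh /usr/src/redhat/RPMS/i386/*.i386.rpm

# Finally install the release package that marks the box as AS
rpm -ivh /usr/src/redhat/RPMS/i386/redhat-release-*.rpm
```

        And, as the parent says, repeat for every patch SRPM.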
    • by d3xt3r ( 527989 ) on Wednesday February 12, 2003 @09:07AM (#5287100)
      All the source is right there on Red Hat's FTP servers. Download it and build it for yourself.
    • Re:How to get it? (Score:3, Informative)

      by cowbutt ( 21077 )
      Not for gratis, but a US$60 download as the Advanced Server Developer Edition [redhat.com]

      --

  • This is great (Score:5, Insightful)

    by hackstraw ( 262471 ) on Wednesday February 12, 2003 @08:27AM (#5286901)
    And impressive considering the other certified OSes (Solaris, AIX, HP-UX, and NT). I first used the Advanced Server a couple of months ago while evaluating some Itanium2s, and I was pleasantly surprised. I really like RH's decision to make the Advanced Server their "Enterprise" class distro with about an 18 month release cycle. Makes my job easier (TM).

    I never thought I would say this, but I've gotten accustomed to using RH. I was a die hard Debian fan, and in philosophy still am. But when it comes to 3rd party support, and announcements like this, I have to say that RH is the distro right now, and probably will be for some time to come (at least in the US).

    For all of the advancements that RH has made for Linux, and in spite of itself, including RPM, I would like for them to get a better package system. Yes, I know there's apt-rpm or whatever it's called, but I'm talking about something that already comes with the distro and works on all architectures supported by RH. Someday...
    • Re:This is great (Score:4, Interesting)

      by EvilTwinSkippy ( 112490 ) <yoda@NosPAM.etoyoc.com> on Wednesday February 12, 2003 @08:53AM (#5287035) Homepage Journal
      Ack. Short of passing around source tarballs and having them compiled on demand, I don't think an ideal package system exists for all platforms.

      That said, why DON'T we just package the source tarballs instead of the binaries? I mean, back in the day it took forever to compile something on a beat up old 486. But today I can build Tcl/Tk in a little under 7 minutes, and the Linux Kernel in 20 or so. As the machines get faster and the compilers get more efficient tracking the binaries is going to seem downright silly after a while.

      My US$0.02

      • But then you get something like OpenOffice. I run Gentoo, started from Stage 1, so I've compiled everything, and when it came to OpenOffice, the compile took me 16 hours on a P3/1GHz.
        I have no problem with stuff being compiled from source, it's just that in some cases it's more time-efficient (OO, for example) to have a binary, and in others, it prevents having to worry about the subtle differences between systems that prevent code from compiling.
      • I've got 62 Alpha machines, 2 Intel 32bit, and 3 Itaniums on the way plus my laptop and this is only a 3 day a week job.

        You want me to compile what?

        I have a hacked version of PBS [openpbs.org], a stock version of Maui [supercluster.org], and a number of scientific libraries/applications that are compiled from source. I think that's enough :)
      • Re:This is great (Score:4, Insightful)

        by A Masquerade ( 23629 ) on Wednesday February 12, 2003 @10:14AM (#5287525)

        That said, why DON'T we just package the source tarballs instead of the binaries?

        Source doesn't fix the packaging problem - it just moves it around a little. You still have basically the same problem removing, replacing or upgrading a source-based package as you do with a binary one.

        The killer of this idea for me is that I produce service systems which are designed for a particular (set of) function(s). Part of the philosophy I use is that the systems have only the software I need on them - which makes them more secure (fewer packages to have security bugs, easier to audit). Service boxes do not have compilers or tool chains on them - no one needs to be fiddling with stuff; if you need to do fixes, those are done on a development machine, moved to a test machine and then deployed. Adding a compiler, the associated tool chain, and the development libraries needed to make stuff build (the run-times are probably already there) makes my package set much bigger and consequently increases the maintenance task.

        • by dmaxwell ( 43234 ) on Wednesday February 12, 2003 @10:37AM (#5287695)
          A more sane way to manage source packages on production boxes is to have a machine similar to the production boxes but with the developer toolchain installed.

          The production boxes will still use debs or rpms, but the compilation boxes can easily use something like checkinstall to make packages. This won't work in a potpourri environment, but it would be fine if there are lots of identical machines. You mentioned that you wanted only particular software on your machines. With source compilation, you can even specify that the software only have certain options compiled in.

          Since the dev toolchains are confined to a few boxes, maintaining those shouldn't be onerous either.
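          For instance (the package name, hosts, and configure flags below are made up, and checkinstall's flags vary by version, so treat this as a sketch):

```shell
# On the build box (same distro/arch as production):
./configure --prefix=/usr      # compile in only the options you want
make
checkinstall -R                # wrap "make install" into an RPM (-D for a .deb)

# Push the resulting package to the identical production hosts:
scp /usr/src/redhat/RPMS/i386/mytool-1.0-1.i386.rpm prod01:
ssh prod01 "rpm -ivh mytool-1.0-1.i386.rpm"
```

          The production hosts never see a compiler; they only ever install packages.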
      • Short of passing around source tarballs and having them compiled on demand, I don't think an ideal package system exists for all platforms.
        One word: Gentoo [gentoo.org].
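        (Portage in a nutshell - it fetches the source and compiles it locally with your CFLAGS and USE flags; the package below is just an example:)

```shell
emerge --pretend app-editors/vim   # show what would be built, and why
emerge app-editors/vim             # fetch, compile and install it
```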
  • I too am a die-hard Debian-ite who works with Red Hat a good bit (it is the flavor du jour at work). RPM is a little ugly compared to debs, but up2date combined with an RHN subscription (a whopping five bucks a month per system entitlement, and you can float that across as many systems as you want, to the limit of your patience) takes a *lot* of the pain of RH package management away. Yeah, yeah, it costs a little bit of money, but it'll save you so much time as an admin that it's worth it, and it's a way to kick back a little to the people that put in hard work on the distro you use.

    (On that thought, it'd be interesting if Debian set up some sort of subscription/donation system where you could set it up to donate, say, $5/month from your account like a subscription. Nobody's budget is hurt by that, but spread across N many users that's a good, reliable revenue stream for the project that could be used to fund development, bandwidth costs, etc.)
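    (The day-to-day RHN workflow the parent is describing is short enough to show; this assumes the box already has an entitlement:)

```shell
rhn_register     # one-time: register the system against Red Hat Network
up2date --list   # show the errata/updates RHN has for this box
up2date -u       # download and apply them, resolving RPM dependencies
```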
  • http://tinfoilhat.shmoo.com/
  • ... this coming when we are nearing war with Iraq and simultaneous with the release of Command and Conquer Generals [slashdot.org]? Coincidence? I think not!
  • by Anonymous Coward on Wednesday February 12, 2003 @09:29AM (#5287223)
    There was a LOT of bureaucratic inertia standing in the way of this effort inside the DoD. In the office this little initiative started in within ESC, the push for this cost two program managers and one engineer their positions, with extra effort made to derail their careers. Another person had to keep his head down and toe the line for a long time. The replacement for the second program manager was frustrated and constrained and a little scared, having entered the arena of combat by stepping over the corpses of the previous two (figuratively).

    The efforts by DISA and Red Hat were started because the little program that those people worked on provided the customer for the product. Sure, there was a lot of "anecdotal" demand for Linux, but this was the first formal acquisition program that was committed to it. The guinea pig, so to speak.

    Let's give proper respect to RH (those involved know who he is) at Red Hat, who took that first call and pitched it to his management, even though it looked like all the risk was on Red Hat.
  • Well, you yankees spend $400 billion of your $79X Billion discretionary budget on Weapons and Military -- a few thousand to RHAT is zero for the US Military.
  • by Stalcair ( 116043 ) <stalcair.charter@net> on Wednesday February 12, 2003 @10:06AM (#5287465)
    Here is a clue for those not "in the know": government contracting is based on a business model that does not factor in things like the quality, efficiency or effectiveness (actually works AND provides what the end user needs) of the product or service in question. What matters more than anything on the contracting side is the ability to schmooze your way to the top and bring in that business, while on the government side what matters is that you equally schmooze your way up the ladder by repeatedly demonstrating an amazing lack of care, concern or knowledge about the impact of poor development practices, confusion of personal and professional relationships, buzz words versus any actual understanding of the systems (and the systems' objectives), or any sort of ethical concern for actually being a good steward of both the taxpayer's money and of the warfighter.

    In a free market economy the consumer has the option of making choices based on any number of factors including price, quality, speed/efficiency, convenience, and just plain old personal taste. However, in any system that shuts out all but the most deep pocketed (and well connected personally) companies then you had better be willing to pay more for less. Furthermore if the weights of the value of a product, service or the company that renders it has moved from the above factors (price, quality, etc) to that of the prettiest proposals, the slick talkingest (reverting to my Yosemite Sam mode) company personnel and the prettiness of words and documents presented then you will inevitably end up with less quality. Competition has then moved completely to the realm of draft picks for the cheerleader squad. It doesn't matter if they do nothing but look pretty and say stupid repetitive cheers... hey! they look pretty.

    Bullshit artistry is _THE_ factor in government contracting, as a track record of proven quality does not factor in. Now, to be fair, there is the SEI maturity-level system in place (from the Software Engineering Institute), which mostly inherits from the ISO 9001 system. With five levels (1-5, no zero... 1 is granted to anyone, whether they can find their ass with either hand or not) you have a criterion of process quality by which to judge an organization. However, with all the money and obvious effort that went into creating and maintaining this system, its Achilles' heel is no different than in any other of the "best laid systems and plans" to date. That, my friend, is non-compliance with the very processes that define who is granted what level. In other words, they don't use it as it was intended, thus rendering it just another acronym. The ironic thing (but typical in an entrenched bureaucracy) is that even though pretty much anyone will admit (if you ask them lightly in the break room over coffee) that the system is rather broken, most of those same people will still puff up with pride (if a contractor) at being a talking head of an organization rated higher than SEI Level 2, or will speak with awe and wonder (if government) of an organization with SEI Level 2 or higher.

    What I fail to understand is why some will defend this bastardization on the grounds that those organizations with an undeserved SEI level are "working towards it." Well, that is good... really; however, it is illogical when you look at the fact that an SEI level is not a projection but a grant of current operational status. I somehow doubt there would be much validity in being granted a good bill of health after being shot 10 times on the grounds that the surgical staff would "soon fix me up good." No, instead I should be labeled "in critical condition" and any other status be viewed as such. (Hmmm, is THAT where STAT comes from... meaning right NOW? I sure don't know.) Back to IT work: if I were the customer, I would not care one damn bit for a system that is not consistently applied. The minute it becomes acceptable practice to arbitrarily award SEI levels is the same instant that such levels lose their meaning.

    Now some might say (who lack working neurons) that this is exactly what happens with capitalist Evil Corporations (TM), yet in reality it is the government itself that creates this system. If the government would place individuals in decision-making roles who had both a sense of ethics and refined professionalism, then you would find that requirements would soon show a dramatic shift towards the quality of the products and services rendered. Networked people are important, to that there is no question. Yet a professional organization will correctly view those connected personnel as just one of the many factors involved in doing business. ("Professional" defined here not just as "they get paid to do X" but referring to the ethical and motivational set of standards and practices they employ.) Some actually believe that without business developers sliming their way through the system, charming the customer and confusing them when they question bad quality, there would be no business. Perhaps in some cases there would be less, but there have been entirely too many cases in history (large and small) showing that if there is a need on one end and a supplier on the other, then things can work out just fine. The middle man is nothing more than a facilitator of this process... a catalyst, but since they themselves do not do any real work they are, in reality, expendable. Before them business happened at perhaps a slower rate. Without them business adapts. But without those providing the actual product and service, there is nothing to be made of the best of deals. Take out the bullshit artists in the government and soon you will find that their contractual counterparts will begin to vanish as well.

    On a different but very much related note: Has anyone ever done a study of the percentage of commercials split up by radio, television and print (including the net) that actually advertise the uniqueness of the product, its advantages over competitors and why you should buy it? Don't get me wrong, I LOVE those beer commercials usually. However when so many commercials have become little sitcoms or tools of the "arteest" then I really fail to see how I as a consumer am supposed to do anything but ignore them and focus on doing research (to include ratings). I rarely see any commercial that is useful however that could just be where I live.

  • If the Green Berets use Redhat as part of a war (borg like) body suit... will they still be the Green Beret? Or the Red Hats?
  • Please take a look at the RH-AS license. Tell me that it does not conflict with the GPL, and don't be lying about it. I think it does. It specifically states that I have to buy another copy to put it on another machine. Isn't this against the GPL? I bought the software; it's mine to do with as I please as long as I give out copies of the source along with it?
    • by lal ( 29527 )
      IANAL, so take your own read of the EULA:

      http://www.redhat.com/licenses/rhlas_us.html [redhat.com]

      It looks like each copy of RHAS installs a proprietary client for the Red Hat Network. This client is not GPL. It is "RedHat Intellectual Property". That's apparently what's licensed.

    • The GPL has a "mere aggregation" clause, which basically states that you can distribute GPL code alongside proprietary code, without affecting the licensing of the proprietary code, as long as the proprietary programs are separate programs. The distributor has to make source available to all of the GPL components, but can apply traditional rules to the proprietary components.

      So no, the RH-AS license does not conflict, and Red Hat follows the GPL. The same is true of the other Linux distros that include proprietary components: supplying source to the GPL and LGPL components is all that is required, and you can forbid people from copying the proprietary components.

    • The GPL requires that we make GPL and GPL derivative source code available to recipients of the binaries. We do that, AND post the source on ftp for anyone to use, which we don't have to do for this or any other of our products which are posted on ftp. We feel we should adhere to the spirit of the GPL as much as to the letter.

      AS has a stack of support and services that require a fee for use; the reality is that no one will stop you from building your own or installing on multiple machines. But you won't get full support, services, RHN, and in some cases ISV/IHV support.

      Only part of the value of AS lies in the bits.
  • Does anyone have information on other Linuxes on their way to COE certification? SuSE Enterprise? (Can't think of any other commercial "enterprise/advanced server" type distros...)
  • I don't get it. Linux is great for desktop and hardware-oriented things (soundcards, PCMCIA cards, etc.).

    But for the most stable servers running free Unix, how can you beat the BSDs? And with CVSup et al., you can be sure you're really, really up to date and secure.

    I've been a Linux user since the 1.x kernels and a FreeBSD user since maybe 2 yrs ago. These days I use Linux on the desktop and BSD on my servers. So I know and love both for the right purpose.

    Linux has name recognition, but for ultimate stability, I'm just not sure it's the right choice here...
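    (For reference, the CVSup update routine being alluded to is the standard FreeBSD make-world dance; the supfile path here is the stock example shipped with cvsup:)

```shell
# Sync /usr/src against a CVSup mirror
cvsup -g -L 2 /usr/share/examples/cvsup/stable-supfile

# Rebuild and install userland and kernel from the fresh sources
cd /usr/src
make buildworld
make buildkernel KERNCONF=GENERIC
make installkernel KERNCONF=GENERIC
# reboot into the new kernel, then:
make installworld
```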
