Linux Software

Ask Slashdot: Heterogeneous Network Backups w/Linux? 124

drix asks: "Like many, I'm running a Linux gateway between my home network and cable internet connection. Naturally, I'd like it if my Linux server, 4 Win 98 boxes, and iMac could do a nightly backup. The problem is I only have one tape drive, which is, of course, situated in the server. So my question is: what software exists that a) runs on Linux, b) comes in a client-server form, where the server runs a "backup daemon" that each client connects to in order to back up its respective hard disk each night, and c) has clients available for Windows and optionally Mac? I guess the analogous NT ware would be Seagate Backup Exec, which runs on an NT server, polls NT workstations nightly for any changed files, and then uses delta file compression to zip those changes to the server's tape drive. I don't need anything that complicated, but I must have the basic ability to move the files."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    Using CD-R for backups is just a bad idea; this is what tape is for. CDs make better archives, but we'll let that go. You could tweak Amanda to do this: just change the taper to do mkisofs and then send it to cdrecord. The problem here is that you're only going to put 650 MB per disc, which usually isn't enough, so you could "flush" the dumps, or you could buy several CD-Rs and burn an array of discs. But again, this isn't very practical. Just get a tape drive.
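    The guts of such a CD "taper" would be something along these lines (untested sketch; the holding-disk path and the cdrecord device numbers are just placeholders, not Amanda's actual config):
    # build an ISO image from the night's dumps, then burn it
    mkisofs -R -o /tmp/dumps.iso /var/lib/amanda/holding/daily
    cdrecord dev=0,6,0 speed=4 /tmp/dumps.iso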
  • by Anonymous Coward
    Burt also supports multiple OS's. It can support more OS's than Amanda can.
  • by Anonymous Coward
    From a security standpoint, you're asking for trouble by using the Internet gateway/proxy as a tape backup host.
  • by Anonymous Coward
    It's not as nice as Cygwin, but check out MacMiNT at ftp://ftp.linuxppc.org/users/harry/MacMiNT_PowerPC_Cross.bin [linuxppc.org]. At least some basic support for shell scripting, makefiles, gcc, etc... Andrew Beyer
  • by Anonymous Coward on Thursday July 15, 1999 @02:58PM (#1800221)
    I cover several, the ones I liked are:
    Backups
    http://www.amanda.org/ - Amanda
    ftp://ftp.zn-gmbh.com/pub/linux/ - afbackup
    http://www.cs.wisc.edu/~jmelski/burt/ - Burt
    http://www.estinc.com/features.html - BRU
    http://www.estinc.com/qsdr.html - Quickstart
    http://www.unitrends.com/bp.html - Backup Professional
    http://www.unitrends.com/ctar.html - CTAR
    http://www.unitrends.com/ctarnet.html - CTAR:NET
    http://www.unitrends.com/pcpara.html - PC ParaChute
    Commercial:
    http://www.arkeia.com/ - Arkeia
    http://www.legato.com/Products/html/legato_networker.html - Legato Networker Linux client
    http://feral.com/networker.html - Legato Networker server

    Now, not all handle multiple OS's, etc. But of the free ones it was afbackup or Amanda (or both) that did.

    good luck.

    http://www.seifried.org/lasg/

  • by Erich ( 151 )
    You can use smbmount for the Windows machines, possibly some sort of AppleTalk client (netatalk? I haven't used the Mac tools) for the Macintosh. And then you can just use tar or cpio (rough sketch at the end of this comment)...

    One thing you should remember is that you may not want to do full backups of the machines that don't have the compatible tape drive -- after a disk crash you won't be able to restore the data from the server to the client until the OS is installed again anyway!

    What I'd do is use netatalk and Samba to have the same directory (home, maybe) mounted on all the machines... that way you have a centralized point that holds all your important data. Additional data could be left only on the server.
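    A bare-bones version of the smbmount-plus-tar idea (share name, credentials and tape device here are placeholders, and the exact smbmount/smbfs syntax has shifted a bit between Samba releases, so check your version):
    # mount the Win98 box's C: share read-only, stream it to tape, unmount
    mount -t smbfs -o ro,username=backup,password=secret //win98-1/c /mnt/win98-1
    tar cvf /dev/st0 /mnt/win98-1
    umount /mnt/win98-1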

  • The grapevine has told me that Veritas will be releasing a NetBackup client and then later a server. It's expensive, but it's the BEST backup system out there. The only thing that I know of that comes close is HP OmniBack, but there's no Linux for that one. There may also be a version of NetBackup Lite (Seagate Backup Exec was bought by Veritas recently; this will become Lite).

  • Well for one thing, 9 GB of data can fit on a $14 12/24 gig DDS-3 tape.

    These tapes were made to be rewritten over 1,000 times, aren't prone to scratching, don't require swapping for only 9 gigs of data, and are very small.

    Not to mention tape is quite a bit faster at writing than CD-RW. While CD-RWs write at 300-600 KB/s, tapes can write at about 10-20x that.

    You pay a little more for a tape drive, $700-$900 for a drive depending on make/model and internal or external.

    --
  • There is an SMB networking client for the Mac named Dave. I don't believe that it is free (though I could be wrong), but I do know that you can at least mount SMB volumes from a Mac. Can you share out folders? Well, I'm not too sure about that one.

    I have never used Appletalk, and I don't really know what it's about, but I assume that it is yet another file sharing system. Linux kernels have support for AppleTalk. Perhaps you could copy files from your Mac via the AppleTalk daemon.

    Good luck to you!
  • Posted by iamnarf:

    I heartily second the motion for Retrospect.

    I got a freebie copy with a DAT drive. It rules!

    Put the tape drive on your Mac, mount everything to be backed up on it, and Just Do It. It is Really Easy, and it works Very Well.
  • According to our backup server (and the number of tapes we go through), Backup Exec has done incremental backups since version 6.5. I have no experience with prior versions, so I can't comment on them.

    If you can't get it to work, that's not Backup Exec's fault.

  • Mount the filesystems of the systems being backed up and then simply archive to tape on the server side. The clients don't need any special software to connect to a "backup daemon" on the server. The problem then boils down to supporting the various filesystems.
  • I do the same thing with my users. Mind you, it's 4 servers and 20 clients, but the idea is the same.

    anything you want to keep is to be kept on the server.

    I fully reserve the right to come in on the weekend and change your workstation around (bigger/faster/whatever). Anything on the local drive is temporary. Only two have learned the hard way. The rest all do as I say and keep important stuff on the server.

    If that makes me a BOFH, then I accept. But I'll be goddamned if I'm gonna worry about each and every workstation we have, rescuing some program or data that I have repeatedly told them to keep on the server if it was important!
  • Linux supports smbfs, so you can mount Windows drives alright. I'm clueless on the Apple side of things, but I've seen the term 'AppleTalk', so maybe Linux can mount Macintosh drives too. From there you could just run a simple 'tar c /mnt/*' and problem solved, right?
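    Pretty much - once the mounts are in place, a single cron entry covers the nightly run (the tape device and mount points here are just examples):
    # root's crontab: dump everything under /mnt to tape at 2am
    0 2 * * * tar cf /dev/st0 /mnt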
  • Ummm, not wanting to incriminate myself in any way, I will say that what you are referring to is the Alias Manager, a good bit more powerful than simple path-based links. However, your statements about moving those specific applications between machines, post-install, aren't accurate... aliasing is used more to allow such behavior easily (moving files, maintaining the link, even across AppleTalk networks if available and, with privs, via Program Linking). And aliasing isn't maintained, strictly speaking, as a simple number; it is a delicate but determined waltz between the file manager, the alias manager, and the OS.

    In any case, most Mac-friendly apps do use aliases for many reasons, both for locating resources like libraries, the preferences folder, other applications, etc. They don't store a number per se (at least not any more than any other binary data is really a number) but rather an alias object, easily created with the current path/reference of the target object (aliases are for more than just files).

    Just so you know... and if Macs really operated that way I wouldn't think it was too nifty. In fact I love the way I can move installed apps between volumes or machines and NOT have to reinstall (at worst, on a different machine, you normally have to move only a single pref file or folder, except for MS products, which feature dozens of libraries and folders).

    Also, I don't know what you think is funky about Mac pathnames; they simply use ":" as a delimiter instead of "/"... the leftmost item is the volume name, just like in most other OSes, and the rest is the path, again like other OSes. All that's different is the delimiter.

    Just so we don't have any misunderstandings...

    Bb.
  • Arkeia is cool, but it can't cascade drives which would be a major PITA (for me anyway).

    I don't know if you know this or not, but Seagate Backup Exec is now Veritas Backup Exec, and last I heard, a new Linux client is included with the latest binaries. Of course, you're still stuck with NT as your manager, but maybe that will change if people comment to Veritas enough.

    #define cascade: linking multiple drives into one pool that makes the pool look like one giant tape drive.


    -l
  • Granted, Legato is not open source or anything. But it is a pretty nice piece of backup software, and supports backing up almost everything out there to a single tape. I've been running the Linux client for months and have had zero problems - it just works.


  • A nice package, albeit a bit "foreign" when it comes to their terminology.

    You don't know the half of it. Arkeia was written in France and then translated for the US market. I found one place in their PDF manuals where one of the tables is still in French. Elsewhere in the docs, there are awkward bits of phrasing and grammar errors.

    Still, the package does work well, and it's cheap. For home use it's free, but you're limited to 2 clients. But for about $150, you can get a 5-client license which works nicely.

    We use this package at work every day. It's not perfect, but it beats the hell out of installing a tape drive in every box.
  • You can run Samba and use it to talk to your Windows boxes, and pull files that way.

    Sorry, that won't work except for the simplest setups. (I.e. where you don't want complete backups, but rather just a selected subtree or two.)

    The problem is in file locking. Many files (the registry and paging files are a prime example) are locked all the time. Typical backup programs like tar will just sit there patiently, forever waiting for the files to be unlocked when they come across them.

    This is, of course, a result of Windows sucking, but bitching about it won't change the fact that sometimes you've just gotta back up a Windows box.

    Elsewhere people have suggested Arkeia. I second this, as it not only handles true network backups, it does incrementals, keeps an off-tape log of files backed up so it's very easy to restore things, and it handles the registry on a Windows box appropriately.

    It has other nice features, too: multiflow, so it can back up more than one machine at once, for efficiency; separate GUI and backup processes, so you can start it and then shut down the GUI; a Java GUI that will run anywhere; and it's cheap.
  • If you take the route of Samba, how can you back up the so-crucial registry files of Windows?

    You don't. This is one of many reasons why specialized network backup programs exist: they know to handle the registry and pagefiles as special cases.
  • Arkeia won't do the Mac, I think, but it will be fine for the Windoze machines. I use this software to back up my network and it's a great solution.
  • Where can I download a Linux client?? (I forget which version of NB I'm using.)
  • I don't know too much about this subject, but I will be following along. My network is much like yours, except the server is my G3 Mac. I know... I'm building a Linux PC for that task so I don't have to reboot my Win98/Linux box every now and then.

    Unfortunately for the Mac there is no free UNIX-like shell, like Cygwin on the PC, so you miss some scripting tools.

    Have you set up Netatalk and the piece that lets Linux talk back to the Mac? You might be able to use Retrospect on the Mac, then AppleScript midnight copies to the "Appleshare" running on Linux.

    I don't have a solution, but maybe some of this helps. Cheers,

  • This is simply a URL-enabled version of the informative posting, all URL's verified and typo-checked. :-)

    Freeware

    http://www.amanda.org/ [amanda.org] - Amanda

    ftp://ftp.zn-gmbh.com/pub/linux/ [zn-gmbh.com] - afbackup

    http://www.cs.wisc.edu/~jmelski/burt/ [wisc.edu] - Burt

    http://www.estinc.com/features.html [estinc.com] - BRU

    http://www.estinc.com/qsdr.html [estinc.com] - Quickstart

    Commercial

    http://www.unitrends.com/bp.html [unitrends.com] - Backup Professional

    http://www.unitrends.com/ctar.html [unitrends.com] - CTAR

    http://www.unitrends.com/ctarnet.html [unitrends.com] - CTAR:NET

    http://www.unitrends.com/pcpara.html [unitrends.com] - PC ParaChute

    http://www.arkeia.com/ [arkeia.com] - Arkeia

    http://www.legato.com/Products/html/legato_networker.html [legato.com] - Legato Networker Linux client

    http://feral.com/networker.html [feral.com] - Legato Networker server

  • Get Retrospect Express v.4.1 for the iMac. It is capable of scheduled backups and what not. It can even turn the machine on, back it up, and shut it down.

    And it will back up to an FTP server too... so just create a MacBackup directory on the Linux box and have the iMac do its backups to that directory, then just back those files up to tape from the Linux box locally.

    I'm pretty sure it will do the same for Windows, and it's pretty cheap (cheaper than Backup Exec).

  • I disagree. We use Amanda to back up a mixed Solaris/Linux/Win95/NT network, and it works flawlessly. I've performed restores to Solaris, Linux and Win95 without problems.

    Oh yeah, we use a 12-cartridge 4mm stacker, a single 4mm drive, and an old Conner "floppy tape" on 3 different backup servers...

    Richard.
  • Please tell us your url when it is ready.

    Thanks.
  • I *highly* recommend Ghost [ghost.com] by Symantec. It is a very nifty program. (It was written by Binary Research Ltd. and sold to Symantec.)
  • Well, since we're on the topic of backups.. does anybody know of a backup program that supports multi-volume CDRW backups and backup levels?

    Linux seems sadly lacking in the area of backups - it's all based around backup-to-tape ... CD's are an unexplored area for unix in general...



    --
  • Legato NetWorker will do what you describe, but it's pretty darned expensive - it isn't meant for small installations. I don't know what I'd do without it, though. Check it out at www.legato.com
  • Well, I've seen a few of my thoughts already said earlier, but I figured I'd tell you what I've been doing. I have a dedicated server that does nothing but backup. Over 400 gigs go through it to a 15 tape DLT library per week. We have ~60 workstations and 4 servers.

    My policy on it irritates people, hence the title of my post. With the number of workstations I have to manage, I consider workstations expendable. If a Win9x/WinNT4 station eats itself, oh well... reinstall and get on with it. I tell everyone to keep anything they care about ON THE SERVER. Some don't listen, and they lose stuff when Windows blows up completely.

  • I hit TAB/SPACE and it submitted :P. So I suck. Anyway what I was going to say was....


    My setup is this: on my dedicated backup server I have NFS mounts to the main Linux servers. I run BRU from a crontab that does an incremental backup daily and a full backup weekly. I used a rather ugly hack on the MTX driver for the DLT library control, but it works very well. I chose BRU because of its verify pass on backups. Makes me feel better about the amazing amount of data I've been asked to manage. For the database, I just dump the contents of what it backed up into different /var/log files. Cat and grep let me find what is where.
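    I don't have BRU's exact flags in front of me, so here is the shape of the crontab with plain GNU tar standing in (times, paths and the tape device are placeholders, not my real setup):
    # full on Sunday, incremental (files changed in the last day) other nights;
    # the verbose listings land in /var/log for the cat-and-grep searches
    0 3 * * 0 tar cvf /dev/nst0 /home /etc > /var/log/backup.full 2>&1
    0 1 * * 1-6 find /home /etc -mtime -1 -type f > /tmp/incr.list && tar cvf /dev/nst0 -T /tmp/incr.list > /var/log/backup.incr 2>&1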

    My point is that if you keep everything on servers, locating, backing up, and restoring is easier.

    However, if you must keep stuff on the Windows machines, smbmount should work fine allowing you to keep everything mounted in one place. Someone already said this, but for the Mac, use one of the utils that lets you use SMB and "share" the directories.

    One last note, if you are doing this over the internet, I would hope you're using a vpn of some sort :) If not, please do! Encryption is our friend, even if El Reno says it's evil.
  • 1)My network is 100 base switched. My servers have hardware RAID 5. My setup is three times the speed of a hard drive on a workstation. There is no delay for saving to the network. None.

    2)I don't back up my workstation. I keep my stuff on the server.

    3)I have drop-in replacements sitting in the back. The customizing for the individual user takes maybe an hour. Restoring from tape takes FOREVER. Besides, the user can shift over to another workstation while I'm setting up Outlook and the other stuff. Big deal.

    4)Backing up all the workstations would make my already enormous backup even bigger. This costs more money in tapes, takes longer, and is harder to sift through when there is a problem.

    5)I don't know if you noticed, but I have not been using the term "we". Guess why! Because I'm the only one who does internal support. With it being only me, I _need_ everything centralized.
  • I've got a setup at my work where the computer with the tape drive (FreeBSD) FTPs to any machine on our network, downloads files, then tars the downloaded files onto the tape. All done within a couple of easy-to-understand shell scripts (rough sketch below).

    It's probably not the fastest way to do backups, but it's simple, fast to implement, and easy to extend.
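    The guts of each script are nothing fancier than this (host, login, paths and the tape device name are made up for illustration):
    # ftp command file /backup/ftp-cmds contains:
    #   user backup secret
    #   binary
    #   get docs/letter.doc
    #   get docs/budget.xls
    #   quit
    cd /backup/staging/win98-1
    ftp -n win98-1 < /backup/ftp-cmds
    tar cf /dev/rst0 /backup/staging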
  • I just went through 4 days of compiling/recompiling kernels to get my new CD-RW working under Linux because the braindead software that came with it refused to burn a "*.tar.gz" file. It complained that there were too many "."s and insisted on converting the first one to a "_"!

    I perused MANY HOWTOs/sites/dejanews posts before I finally got mine working (and I've been using Linux/compiling kernels since 1994!). I've had similar problems getting my Hauppauge TV card, printer, and Diamond Stealth II G460 video card going under 2.2.x. I'm currently roughing out a series of webpages under the working title "It Doesn't Have to be This Hard" that will, hopefully, collect all the info I needed to get these "odd duck" pieces of hardware going into one place. A lot of it was painfully obvious once I figured it out.

    I'm starting with the CD-RW... email me if you think I can help. (Am I gonna need a new ISP tomorrow?)

    davewalker@zebra.net
  • Can someone please explain the advantage, if any, of paying for backup software for computers that are on the same network? Given that you can have the Linux box mount almost any filesystem in existence, even in read-only mode, and then tar that filesystem to a tape, or automagically burn it to a CD, or even just make the ISO filesystem image... someone please tell me WHY there is ANY market for software to let you do this under Linux? Are there any advantages to it? Because I sure don't see any.
  • For the Win95/98/NT boxes at least. First, share your Windows HDs in read-only mode. Use smbmount to mount the Windows volumes (actually the whole HD) across the network and then use any Linux backup utility you like. Better yet, I believe the automounter can handle SMB mounts - set up the automounter so that

    /mnt/smb/win01/c
    /mnt/smb/win02/c
    ...

    get mounted to //win01/c, //win02/c etc. Then all you have to do is set up your backup tool or cron to back up /mnt/smb/*

    I don't know about the Mac - but if you can find a way to mount a Mac volume from Linux you're set.

    If autofs cannot handle SMB mounts - just write a perl script that will parse a text file with this information and have your backup tool/cron run this script first.
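    If autofs won't play along, the pre-backup script really only needs a few lines of shell rather than perl (the host-list format, user and password here are made up):
    # /etc/backup-hosts contains lines like "win01 c"
    while read host share; do
        mkdir -p /mnt/smb/$host/$share
        mount -t smbfs -o ro,username=backup,password=secret //$host/$share /mnt/smb/$host/$share
    done < /etc/backup-hosts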

  • So where is the GPL'd Ghost for Linux, anyway? There should be a way to do what Ghost does in Linux. Imaging the disks is excellent, provided that the restore function can resize partitions and volumes, and there's some way to peer into image files and grab individual files.

  • It's better to use the Linux box as a real server, then configure the other machines to store *important* files on the server. Don't back up the clients, just the server. Since you have to re-install Windoze every 10 minutes, this is nice too because you can trash the Windows box, re-install everything and still have all your important files accessible. Restoring backups on Windoze is a nightmare (without something like Arcada (even then, I bet it's a mini-nightmare)) because of their brain-dead long filenames.

    -=Julian=-

  • The ssh family of commands is designed as a drop-in replacement for the r* family of commands. If you use ssh, you can still do this.
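    For example, the classic tar-over-rsh pipe carries straight over (host, directory and tape device are placeholders):
    # copy a tree to a remote directory
    tar cf - yourDir | ssh remoteHost "cd /dropZone && tar xvBpf -"
    # or stream it straight onto the remote tape drive
    tar cf - yourDir | ssh remoteHost "dd of=/dev/st0 obs=20b"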
    -earl
  • Arkeia from Knox Software will do this.
    Alternately you could use BRU and do the same thing (via scripts and shares...)

    HTH
  • A client already exists... it was released with NetBackup 3.2. As far as a server goes, I'll leave that to the rumor mill for now.

    Backup Exec isn't going anywhere for a while either.

    NetBackup is a cool product... yes, it's spendy, but it does everything under the sun.


    (yes, I work for veritas, so I may be biased)
  • Take a couple of hundred Windows NT/95/98 servers/workstations, Macintosh, Linux, and Solaris boxes all thrown together. Using tar and cpio REALLY SUCKS when you need to restore one file from your 350 GB tape changer, especially when you have to manually figure out where the file is supposed to be.
    Backup clients DO help, and make work easier.

    Besides, why? Because it exists for other Unix platforms.
  • Do you mean to say that BRU isn't commercial, and that there is a free version?
  • We just purchased licenses of this at work. Wonderful product IMHO. We managed to back up at about 20Mb/s over a 10BaseT network thanks to its compression.

    Very cool, and it supports the nifty tape library we just purchased too.
  • The registry is kept in two files: %WINDOWS%\user.dat and %WINDOWS%\system.dat. These files are hidden by default. And you shouldn't need to back up the pagefile; it should be as dispensable as %TEMPDIR%.
  • I agree.

    Retrospect is excellent; the only problem is that the server only runs on the Mac (a beta for NT exists but is not usable yet). We use it to back up about 100 Macs and some PCs. We have done backups to tapes and to CDs. No problems.

    On my machines (some 10 Unix machines) we only use some scripts that search for files that have changed since the last backup, and then afio these files to the tape. The main problem is that we create too many files and the tape station is not reliable enough. We will change to CD or DVD soon (when someone has time to change the script).
  • Backup Exec does do incrementals... it just really sucks at doing restores...

    don't blame me... I didn't pick NT
  • Has anyone found a good package that supports AS/400? I'd like to recommend to my boss something Linux-based that will back up our NT boxes and the AS/400 over the network...
  • Backup Exec does support those OS's, but there is a problem with some of them. We bought it intending to use it at least partially to back up an AIX system. Unfortunately the client s/w is only supported on version 4.1 of AIX; we can get it to work on 4.2 OK, but it refuses to do so under 4.3.2. There has been a request for an upgrade to this in for about six months!
    The moral? Check the support documentation for supported OS versions.

    Graduate of the Mad Max school of defensive driving.
  • Look into VERITAS NetBackup. It's better than Legato and doesn't corrupt its catalog. ADSM is not in the same league (it is limited and awkward).
  • Oops... actually, the config file *is* protected with mode user read-only.... my bad. Sorry Veritas guys.
  • Actually, if you're talking about a Linux "agent" for Veritas (formerly Seagate) Backup Exec, it exists currently. I've just finished installing it on 5 Linux machines we have here. It's (obviously) not open source.

    To do this I had to buy the 7.3 version of Backup Exec for NT... (I upgraded from 7.0... it was like $400 for the upgrade). On the CD is a Linux "agent". It installs pretty nicely if you have Red Hat 5.2... otherwise it gives you a warning about "unsupported UNIX platform", but it worked OK on RH 5.0 and RH 6.0 (we don't have any other distributions, call me a follower... my guess is that it won't be pleasant to get it to work under another distro). It is also, so far, a real pig... Linux clients get backed up to the NT BE server at about 4.8MB/min (what is that? 500kbps?). In any case, 4.8MB/min across an Ethernet during a backup is not good. By way of comparison, our Windows clients get backed up at about 170MB/min.

    Additional problems: the agent requires a password for access by the backup server... you specify the password during the install of the agent on the Linux machines.

    The password is:

    1. shown in the clear as you type it in (the install is just a batch script; I guess they didn't know how to change the tty settings during the password entry).
    2. stored in the clear in a config file
    3. exposed to all users, as the directory the agent install creates and the agent config file are permitted rx for other. (I changed it, of course, to be unreadable by anyone but root.)

    Some things that tip you off that the guys in this division of Veritas don't write much UNIX software:

    1. the speed
    2. the default security problems I mentioned
    3. the agents come in a tar file that, when untarred, expands files into the current directory instead of creating a directory under your current one (not a big deal, but it's a tip-off that they're used to PKZIP).
    4. the post-agent-install tip that you can start the daemon by "restarting your workstation". So funny. Not just "/etc/rc.d/init.d/agent.init start", but "restart your workstation". It's classic.

    In any case, I'm glad the agent exists, even with its faults, because it makes my life much easier... rather than having to administer 2 backup systems (one for NT, one for *NIX), I can do it all from one console. Of course, I don't know if it actually completely *works* yet, as the backup is running as we speak, but hopefully the verify results work out...

    Bottom line observation: commercial software running on Linux is sort of scary. None of the niceties of community-produced stuff. I don't think I'd *want* to see the code...
  • Another drawback of using Seagate Backup Exec is that it can't do incrementals. It just can't. My boss should have kicked my ass for recommending it.

    Try it...

  • Of course, I meant incrementals of UNIX systems. It doesn't do those.
    I've been all over it, up and down.

    It will do either a "NORMAL" (full) or a "DAILY", which is only files changed in the current day, meaning between the start time and midnight the previous day. All the other "backup types" depend on the Windows archive bit and so will always perform fulls. Try it...
  • On Sunsite/Metalab (and many other sites) there's a package called MacDump. I looked into it, but never actually ran it. I believe it's a simple daemon-like process on the Mac that allows a Unix client to suck information off the hard drive for backup purposes.

    Not that this is necessarily the best solution out there, but I mention it for completeness.
  • I have almost the same setup - a Linux server, another Linux machine, 2 Windows NT boxes (blame my wife), and a Mac.

    I bought the personal edition of Retrospect for the Mac, and back up via FTP to the Linux server. Hard disks are cheap enough for this to be the easiest solution for me (I use an old slow one for the backup).

    For the Windows machines, the Linux box mounts their disks with smbmount.

    Then I can run BRU Personal Edition (came with RedHat) on the Linux machine to back all this up.

    When I got a second Linux box (firewall) I NFSed that up to the main server, and bought the normal version of BRU which will do NFS drives.

    There are probably better ways to do all this, but this method works well enough for me, and I'd rather spend my time messing with other things :-)


    Simon
  • There is an unsupported Linux OmniBack client (there has been for more than a year). I don't know if it is publicly available.
  • by coyote-san ( 38515 ) on Thursday July 15, 1999 @07:08PM (#1800280)
    One approach is to redefine the problem. This approach will take some time, but you may find it worthwhile.

    Instead of asking how your Linux server can back up Windows and Mac clients, why not ask how much you can move from your Windows and Mac systems to the Linux server?!

    After my Windows system crashed yet again, I reinstalled the system (which, thanks to Toshiba, formats the disk, so I lose any files which survived the crash). I set it up to use a network login - my "profile" and personal files are stored on my Linux system in an ext2 partition... and are backed up nightly. Likewise, I reinstalled all of my applications to a SAMBA "network" drive. I then changed the permissions so most of the files were read-only - no more Word viruses.
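    The Samba side of that is only a few lines of smb.conf (the share names and paths here are made up, not my actual config):
    [apps]
        path = /export/apps
        read only = yes
    [profiles]
        path = /export/profiles
        read only = no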

    This isn't perfect, but I'm a lot more comfortable with my Windows system mounting network drives from my Linux box than my Linux box trying to SMBMOUNT the Windows system for backups.

    P.S., I use Amanda.
  • Maybe you should do what they pay you the big bucks for, instead of cry-babying about how your job is going to be harder. Because YOU don't want YOUR network-administrator job to become harder - which is what they pay you for in the first place - you are making everyone in the company who works on documents experience tripled delays for disk access to their documents? I think the needs of the many outweigh the needs of the few. And backing up only YOUR workstations is a load of shit, and that proves it. It shows that you realize, as well as everyone else, that it is very preferable to have the customizations/etc. backed up, rather than dicking around on company time restoring them and "getting back up to speed" once they are lost.

    PS: I'm probably a better network admin than an employee, although my employee-bias probably makes me appear otherwise.
  • I know that r* daemons ought to be disabled as soon as you install your OS, but one of the nicest uses for them is:

    tar cf - yourDir | rsh remoteHost cd /dropZone \; tar xvBpf -

    or tar it to /dev/yourTapeDrive. I did this for my *nix machines.

    As for desktop OSs: Servers get backed up. Desktops don't. Clients pull files. Servers serve them. Therefore clients keep files on servers or risk being LARTed hard.

    --jpg
  • Here's what we do to handle back-ups for an entire LAN of mixed machine types. First, we have a SCSI DAT II drive hooked up to the Linux box and a cron task that runs nightly and backs up all the contents of a single directory (/home/shared in our case.)

    We also have samba and netatalk installed on this machine so both Macs and PCs can mount the shared volume as a network volume. We then use platform-specific back-up tools (Retrospect from Dantz in this case for both machine types, but you can mix and match) on each machine to write back-up archives to the shared Linux volume.

    The cron task fires up tar at 4 AM and streams the whole pile of archives off to tape. First person in in the morning swaps tapes.
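    In crontab terms that is about one line (device and path are examples); mt can eject the tape afterwards so the morning person only has to drop in a fresh one:
    0 4 * * * tar cf /dev/st0 /home/shared && mt -f /dev/st0 offline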

    One advantage of using Retrospect is that it can perform Mac and PC back-ups via FTP, without the need to mount a shared volume.

    C.
  • I have to speak up for Amanda - it's good for you. - http://www.amanda.org/ [amanda.org]
  • Well... if your quest for commercial alternatives fails, what I would do is set up ftpd on your Linux box, then either cook up some little Java clients that do exactly what you want when you want them to, or find some kind of scriptable FTP client. If you roll your own, you can easily FTP what you want to your Linux server, and then have the server tar it up and roll it off onto tape.

    Alternatively, there are NFS classes from Sun, so you could get your Mac and Windows boxes to NFS-mount via Java and move the files that way.
  • Just one word:

    Cycles.

    Tapes (even cheap DDS tapes) can take several tens of thousands of read-write cycles (DLT tapes have the most: >1 million cycles).

    AFAIK, CD-RWs can take far fewer cycles.

    Oh, and another word:

    Lifetime.

    CD-Rs become a pile of dirt after a few years in bad conditions (non-constant heat and humidity) and need a constant climate and darkness to last decades.
    I don't know the numbers for CD-RW (since I'm not interested in them), but I would guess they last even less than CD-Rs.

    Tapes last for decades.

    my 2 cents (euro)

    --
    Jor
    --

  • I like my setup:
    All my data is on my server. If a workstation goes down I don't care - I have not lost anything. I back up my server.

    I do fiddle around a lot with my PC's. I like to try out different OS's, different Apps.

    I create base OS installs with all updates and current drivers, then I use Ghost or DriveImage Pro to create images of these base OS installs on my server. That way I can restore them whenever I want. With compression, even an NT installation takes under 150 Meg on my server.

    Another trick for rapid restores: my primary workstation runs NT (don't flame me - I am already in a world of pain!!!!). I have a 1 Gig boot/system partition and a 7 Gig partition. I frequently back up the 1 Gig partition that holds the OS by Ghosting it to my server. I install apps to the D: drive. If my system crashes, I restore the C: drive and I am OK - that restores the OS, registry, and most system files. A nice rapid restore (not 100% foolproof, but very convenient).

    Good luck.

    BTW: There is a free imaging app available at:
    http://cuiwww.unige.ch/info/pc/remote-boot/howto.html

    It is part of their great remote-boot and imaging solution!
  • There is a free imaging app available at:
    http://cuiwww.unige.ch/info/pc/remote-boot/howto.html

    Last time I checked it is free for non-commercial use. I have tested it with Win9X clients. The image creation process is slow, but the restore speed was fine.

    The link has a terrific solution!

    Hope it helps!
  • We use this to back up 200GB every night to an Exabyte 210.
    It has clients for most everything; the server is similar.
    We run it on Linux.
    I don't remember a Mac client; however, you can use netatalk and have the Mac copy itself to a volume on the Linux box... then back up that volume.

  • I've been using smbtar to back up nine or ten Windows NT machines for about a year onto a DAT tape on the Linux machine. I wrote some simple scripts to do incremental backups every night and a full backup once a week (the guts of them are sketched below). It has worked very well for us.

    smbtar comes with samba.
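    For the curious, the scripts boil down to calls like these (machine name, share, password and tape device are placeholders - and I'm going from memory on the flags, so check the smbtar man page):
    # full dump of the C$ share on one NT box, straight to tape
    smbtar -s ntbox1 -x 'c$' -u backup -p secret -t /dev/st0
    # incremental: only files with the DOS archive bit set
    smbtar -s ntbox1 -x 'c$' -u backup -p secret -t /dev/st0 -i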

    I'm also running netatalk to share files on the linux box with a Macintosh, but I don't have a clue how to back up the Mac.
  • I had this problem in my workplace. Some WinNT workstations needed to be running 24x7 (oh my...!!) and had to be replaceable in minutes if (when) something went wrong. I've made a Java server that makes copies of some shares through smbtar to a Linux machine every X minutes. When we replace the HD of the failed machine with a fresh copy of an empty WinNT, we touch a "token" file in the shares of the PC and the server restores the information automatically with the latest copy. It works fairly well.
    Ah!! I have another question: is it possible to use CVS as a backup server? I think it could be amazing. Has anyone tried this?

  • Try Amanda [amanda.org]. It uses dump or tar on UNIX, and I think there may be some clients for Windoze. Also look for BURT on Freshmeat. That one looked interesting, requires TCL.
  • What we did was put our Macintosh on the PC/Samba network with a product called "Dave Client". It is a NetBIOS client program for the Mac and lets you network it like a Windows PC. See http://www.thursby.com/ for details. We use smbtar (a Samba tool for backups) on our Linux box to back up both the Windows PCs and Macintoshes that way.
  • IBM (I know, I know) makes a product called ADSM. It is a client/server enterprise backup solution. There are both client and server for Linux, as well as clients for AIX, Mac, WinXX, OS/2, AS/400, etc...
  • I once attempted to back up a Mac to a Windows machine using an AppleTalk stack for W9x. If the Mac put files on the server directory, they remained intact, but if the PC moved them (to tape, for example) the resource forks got trashed. Does anyone know if Netatalk will allow the Unix system to move the files around without clobbering them?

    Jason
