On Sat, Mar 02, 2002 at 03:56:21PM -0600, Mike Hicks wrote:
> Thanks to Scot for his great presentation today.  

	I second that. Many kudos to Scot for standing up there and giving
his presentation! That trick with ssh and dump was just the thing I was
trying to figure out the day before. Good job, Scot!
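	For the archives, the ssh-and-dump trick, as I understood it, boils
down to something like the following (the hostname and paths are
placeholders, and the exact flags may differ from what Scot showed):

```shell
# Pull a level-0 dump of /home off a remote client over ssh,
# compressing as it lands on the backup host.
# (client.example.com and the paths here are placeholders.)
ssh root@client.example.com "dump -0uan -f - /home" \
    | gzip -c > /backups/client-home-0.dump.gz
```

	The -u flag updates /etc/dumpdates on the client, so later
incremental dumps know where level 0 left off.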

> I just figured I'd
> mention this, since a lot of people believe Amanda is overly complex.

	It depends how you want to use it. If you're trying to make it into
a tool for packaged deployment on multiple systems (which is against the
maintainers' advice; they want you to compile it on each system), and to
document how you do that, it's a different matter than just using it to back
up a couple of machines at home. (I was doing the former; you're probably
doing the latter.)
	For better or for worse, I usually try to RTFM a lot before dinking
with a piece of software... this can make me much slower to get things done
than the people who just try it first, see what breaks, and then RTFM from
there. I'm not going to say my way is better (indeed, it seems less
efficient at times); it just fits my non-reckless personality.
	For these reasons, I advised that it would take a week to figure out
amanda -- because that's what it took me. (I also realize that I'm not the
sharpest tool in the shed, so if you're smarter than me, you can probably
figure it all out faster.)

> Perhaps you guys missed the link on the Amanda website pointing to an
> online chapter about it from "Unix Backup and Recovery" (one of the books
> Scot mentioned).
> 
>   http://www.backupcentral.com/amanda.html
> 
> It helped me quite a bit when I first opened up an amanda.conf and the
> reaction of, "What the--?!"

	Yeah, that chapter is really an important doc to go by when learning
amanda; it covers a lot of the compile-time options, the file permissions,
and how to debug amanda. (File permissions will bite you on the ass multiple
times, and you'll spend a bunch of time staring at debug files in
/tmp/amanda, figuring out why a particular host isn't responding properly.)
	
	Amanda was developed in a highly heterogeneous university
environment, by old-school admins (not to derogate old-school admins... it's
just that some of them learned modern security principles, and a lot of them
didn't), and it has been in development for many years; all these things
show.
	The main problem I have with amanda is that *so much* of it is
hard-coded at compile time. There are a ton of compile-time options which
in most other software would be options in a config file (things like what
your dump program is, whether tar is available, whether to make debug files,
where the debug files get dumped, where the index directory goes, where
the holding disk is, what ports it should listen on, what ports it should
try to send to, and dozens of others). Every time you want to tweak one of
these settings, you have to recompile... and it's not a small program to
recompile. :(
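	To give a flavor of it, here's roughly what a build ends up looking
like. (Flag names are from my memory of the 2.4.x series, and the hostname
is a placeholder -- check ./configure --help before trusting any of these.)

```
./configure \
    --with-user=amanda \
    --with-group=disk \
    --with-gnutar=/bin/tar \
    --with-index-server=backuphost.example.com \
    --with-debugging=/tmp/amanda \
    --with-portrange=50000,50100
make && make install
```

	Change your mind about any one of those, and it's configure-make-
install all over again, on every host.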
	Also, security was not handled in a modern manner. It uses at least
3 different port ranges for communication (both high ports and low ports),
and while in theory you can restrict the ports that it uses (a compile-time
option), it turns out to be such a PITA that trying to change the defaults
will mostly just break things. As such, it doesn't work worth squat through
firewalls (let alone a NATing firewall). (And even if you think you have it
working for backups, try doing a restore... that uses a different set of
ports.) The only way we found to do backups through a firewall was to set up
a VPN tunnel to the client to be backed up. The VPN also handles amanda's
lack of encryption. In theory amanda can be kerberized; but for those of us
without an existing kerberos infrastructure, fsck that noise...
	Authentication to the client is done via an .rhosts workalike (the
.amandahosts file), with the usual degree of non-security.
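	For reference, the format is one trusted host/user pair per line on
the client, e.g. (hostname is a placeholder):

```
backuphost.example.com amanda
```

	Just like .rhosts, anything that can claim to be that hostname gets
to talk to the client.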
	Also, amanda doesn't deal well with anything but tapes as backup
media. I have just learned that there is preliminary support for backup to
files in amanda 2.4.3b2 (instead of specifying a tape device [e.g.
/dev/nst0], you specify file:/path/to/dir/), but it's not very mature yet:
you need to either write a script to move files out of that dir, or else
set up multiple dirs and treat them as multiple tape drives, which is a pain
as well.
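	For the curious: with 2.4.3b2 you'd point amanda.conf at a directory
(something like tapedev "file:/var/amanda/vtapes/slot1"), and then the
"script to move files out of that dir" is along these lines. This is just a
sketch with placeholder paths; the setup lines at the top exist only to make
the example self-contained.

```shell
#!/bin/sh
# Sketch of the sweeper the file: driver needs -- move finished dump
# images out of the virtual-tape dir so the next run has room.
VTAPE_DIR=/tmp/demo-vtape
ARCHIVE_DIR=/tmp/demo-archive

# --- demo setup: pretend amanda just wrote two dump images ---
mkdir -p "$VTAPE_DIR" "$ARCHIVE_DIR"
touch "$VTAPE_DIR/client-home-0.dump" "$VTAPE_DIR/client-var-1.dump"

# --- the actual rotation: sweep completed images into the archive ---
for f in "$VTAPE_DIR"/*.dump; do
    [ -f "$f" ] && mv "$f" "$ARCHIVE_DIR"/
done

ls "$ARCHIVE_DIR"
```

	In real use you'd run something like this from cron after amdump
finishes, pointed at wherever your vtape directory actually lives.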

	On the upside, I will admit that amanda:
	- deals well with remote systems. It's about the only free backup
tool that does, unless you write your own scripts, like the estimable Mr.
Jenkins. :)
	- supports bandwidth-throttling, so you don't crush your network or
the client you're backing up.
	- supports lots of clients, from just about any UNIX out there. It's
very portable.
	- lets lots of options be passed to/about the client, like how much
to compress the data, where to compress the data (on the client or the
server), whether to index the files, whether to record the backup, and
whether to delay the backup until a more suitable time.
	- makes nice indexes, and gives you a nice tool with which to do
restores remotely. (Not the most intuitive tool; but it works, and works
better than most other free tools.)
	- deals well with tape libraries and robots (very little else does).
	- sends you nice reports every day about how last night's backup
went.
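	For anyone who hasn't seen the restore tool: an amrecover session
goes something like the transcript below (hostname and filenames are made
up, and the exact prompts may vary by version):

```
# amrecover DailySet1              (run as root on the client)
amrecover> sethost client.example.com
amrecover> setdisk /home
amrecover> cd lost-files
amrecover> add important.doc
amrecover> extract
```

	You browse the index like a filesystem, mark what you want, and it
tells you which tape(s) to load.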

	So yes, if you want to back up several terabytes of data, on several
different machines, to a tape library, amanda is still likely the way to go.
:) Its appropriateness for smaller systems must be evaluated in light of
other tools, like Mondo (http://www.microwerks.net/~hugo/) or your own
home-brewed backup scripts (some should be available at
www.backupcentral.com).
	That said: yes, we use it at Real-Time and at a number of our
customer sites, and will likely use it for some time to come, because of the
advantages listed above.
	
Carl Soderstrom.
-- 
Network Engineer
Real-Time Enterprises
www.real-time.com