TCLUG Archive

RE: [TCLUG:607] Mixing Alphas and x86s



On Mon, Jul 13, 1998, whisper@bitstream.net wrote:
>  ... if more than one copy of a program is run, ...

Programs basically use a few different memory areas: .text (the actual
machine instructions), .data (initialized global and static variables),
.bss (uninitialized globals, zero-filled at startup), the heap (memory
allocated by malloc() and friends), and the stack (memory for keeping
track of function calls and local (automatic) variables).  When multiple
copies of a program are run, they can share only the read-only parts,
essentially the .text segment.  All the rest is unique to each process.
This can and does reduce memory usage, but generally not as much as you
would expect or like.  Most programs use a fairly small subset of their
instructions and the rest gets swapped out anyway.
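On Linux you can see this sharing directly in /proc: executable mappings
(the .text of the binary and its libraries) are read-only and so get
shared between processes, while writable mappings are per-process.  A
quick look, assuming a Linux /proc filesystem:

```shell
# Memory map of the current process.  "r-xp" mappings hold code; they
# are marked private but, being read-only, are physically shared by
# every process running the same binary or library.  "rw-p" mappings
# (.data, .bss, heap, stack) are copied per process.
grep 'r-xp' /proc/self/maps | head -5
```

Each line shows an address range, permissions, and (for file-backed
mappings) the binary or shared library it came from.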

>  ... how much ram?

For multi-user systems (single-user too, actually), you want to keep   
all the commonly used code and data in memory.  Once you start thrashing   
from disk to memory and back to disk, performance gets "bad" fast.  To   
figure out how much memory is needed, take the number of simultaneous   
users multiplied by the amount of memory used by the programs they will   
be using and then add an amount for the OS and the daemons you're   
running.  Then add in any additional large uses such as Oracle, Sybase,   
... .  Example:

20 Netscape users x 10 MB (1 MB for login shell, 9 MB for Netscape data   
and cache) = 200 MB
5 gcc users x 4 MB (1 MB for login shell, 1 MB for gcc, 2 MB for emacs)   
= 20 MB
...
48 MB for OS and daemons
Total: 200 MB + 20 MB + 48 MB = 268 MB
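The arithmetic above as a tiny shell sketch (the per-user figures are,
of course, just the estimates from the example):

```shell
# RAM estimate from the example above (all figures in MB)
netscape_users=20; netscape_mb=10   # 1 MB shell + 9 MB Netscape
gcc_users=5;      gcc_mb=4          # 1 MB shell + 1 MB gcc + 2 MB emacs
os_mb=48                            # OS and daemons

total=$(( netscape_users * netscape_mb + gcc_users * gcc_mb + os_mb ))
echo "Estimated RAM: ${total} MB"   # Estimated RAM: 268 MB
```

Plug in your own measured per-program numbers instead of these guesses.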

It adds up in a hurry.  This is often more of a limit than CPU speed.   
 You can use top or another system monitor to see how much memory your   
applications actually use and then use those numbers for your estimate.   
 10 MB may be a bit high, but I've always heard that Netscape is a memory   
pig.  For this reason, you may want each of the machines to run Netscape   
locally since it's pretty easy to set up clients with 32 - 64 MB of RAM   
and then keep the main server at 128 - 256 MB for people doing shell   
access.  If you don't expect many logins and just want to run sendmail,   
apache, and such, you can probably get away with quite a bit less.  I've   
seen lightly used servers with just 16 MB that gave good response.  It   
all depends on how you want to use the system and how heavy the use will   
be.
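One way to get those numbers, assuming a procps-style ps (Linux):

```shell
# Sum resident set size (RSS, in KB) per command name.  Note: RSS
# counts shared pages (.text, shared libraries) once in every process,
# so these totals overestimate somewhat -- treat them as upper bounds.
ps -e -o rss= -o comm= \
  | awk '{ sum[$2] += $1 } END { for (c in sum) printf "%8d KB  %s\n", sum[c], c }' \
  | sort -rn \
  | head -10
```

This lists the ten biggest memory users by command, which makes a good
starting point for the per-user estimates above.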

For comparison, some major ERP vendors estimate usage of 6 - 10 MB per   
user on the server + 64 MB for the OS + a fairly large amount for a   
database (512 MB and up).  So seeing 50-user systems with more than 1 GB   
of RAM is not unusual.  The HP web site shows a system with 3 GB of RAM,   
8 x 200 MHz Pentium Pro CPUs, and 36 SCSI disks (7200 RPM) as able to   
support about 400 concurrent ERP users.



ERP - Enterprise Resource Planning - applications such as general ledger   
accounting, accounts payable, accounts receivable, inventory management,   
point-of-sale, human resources, payroll and benefits tracking, ....  Big   
names in the ERP business include SAP, Baan, PeopleSoft, Lawson, and J.D.   
Edwards.