Hey guys,

Ok, this is a silly -- and somewhat newbie-ish -- question... but I have a 
huge headache and just can't think straight, and also I'm a bit paranoid 
(;


Ok. I have a home network. ALL the NICs can do 100M, and they are all set to 
100/full duplex. The switches all have the "100" light on, and the ones that 
have an "fdx" light have that lit too.

Now, according to my headachy calculations, under ideal conditions I 
should be getting (100/8=12.5) 12.5 megabytes per second when I transfer 
data, which means that under non-ideal conditions, with actual protocol 
overhead and actually writing the data to disk, maybe 7-10 megabytes per second.
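
Just to spell that arithmetic out, here's a quick Python sketch (the 
efficiency factors are only my rough guesses for protocol/disk overhead, 
not measured numbers):

    # Rough throughput estimate: link speed in megabits/s -> megabytes/s,
    # scaled by a fudge factor for protocol and disk overhead.
    def expected_mbytes_per_sec(link_mbits, efficiency=1.0):
        return link_mbits / 8.0 * efficiency

    print(expected_mbytes_per_sec(100))        # ideal 100baseT: 12.5 MB/s
    print(expected_mbytes_per_sec(100, 0.6))   # pessimistic guess: ~7.5 MB/s
    print(expected_mbytes_per_sec(100, 0.8))   # optimistic guess: ~10 MB/s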

I'm getting more like 800 KB/second, though, which according to my math 
looks like the non-ideal version of a 10baseT network rather than the 
100baseT that I actually have.
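
Working backwards from what I'm actually seeing (again just a rough sketch, 
the numbers are approximate):

    # What does 800 KB/s look like in link terms?
    observed_kbytes_per_sec = 800
    effective_mbits = observed_kbytes_per_sec * 8 / 1000.0
    print(effective_mbits)  # ~6.4 Mbit/s on the wire, i.e. roughly what a
                            # loaded 10baseT link delivers after overhead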

Now, given that I'm tired, have a headache and am actually horrible at 
math even on a good day, it's not unlikely I've got something wrong here. 
So, anyone wanna tell me that I'm wrong and that I don't need to rip the 
network apart tomorrow morning?


-Yaron

--