I've gotten confused by this problem way too many times, so I'd like to get some definitive starting points.
When you see network adapters online they are always rated like 10/100 or 10/100/1000. How does one turn that notation into megabytes? I think those numbers stand for megabits per second, right? But still, when I'm trying to measure how much data is moving into a certain directory, I want to compare it to what the adapter is supposed to do, in some easy homeboy way. I vaguely remember something about 8 bits to a byte, or maybe it's the other way round.

My homeboy transfer measurement: I measure the incoming megabytes with `du' inside a while loop that iterates at a settable interval (I've pasted a rough sketch of the loop at the end of this post). With the interval set to 60 seconds, I know how many megabytes arrive in 60 seconds, but I'd like to know how to convert that to the other notation. I'm seeing between 222 and 237 MB transferred in a full minute, and that seems quite slow for what is supposed to be a gigabit network.

This is just between two computers on my home LAN. Both have gigabit adapters and they connect through a gigabit switch, or at least I hope they do. My setup looks like this in brief (simplified). The transfer is between h4 and h5 (Windows XP and Windows 7). You'll note there is a 10/100 router between the whole LAN and the internet, and both of the subject machines use that 10/100 router as their default route. The gigabit switch has no address.

                        internet
                           |
                           |
                           |
        10/100 NETGEAR ROUTER (inside address 192.168.0.20)
        (netgear router is the lan `default route')
            |               |               |
            |               |               |
    (192.168.0.5) h1        |        h3 (192.168.0.7)
                            |
                     gigabit switch
                       |         |
                       |         |
         (192.168.0.9) h4       h5 (192.168.0.17)

So I guess I worked a whole second subject into this, but really I would like to know how to make the conversion mentioned above, and also whether I should expect h4 and h5 to be able to use gigabit transfer speeds.
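In case it matters, here is roughly what my measurement loop looks like. The directory and the 60-second interval are just examples from my setup; the real path is different.

    #!/bin/bash
    # Rough sketch of my measurement loop.
    # DIR is just a placeholder for the directory the files land in.
    DIR=/data/incoming
    INTERVAL=60     # seconds between samples

    prev=$(du -sm "$DIR" | awk '{print $1}')    # current size in MB (du -m blocks)
    while true; do
        sleep "$INTERVAL"
        cur=$(du -sm "$DIR" | awk '{print $1}')
        echo "$(date '+%H:%M:%S')  $((cur - prev)) MB in the last $INTERVAL seconds"
        prev=$cur
    done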
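And here is my back-of-the-envelope attempt at the conversion, using 237 MB/minute from my own measurements; somebody please tell me if I've got the factor of 8 the wrong way round.

    # What I *think* the conversion is (assuming 8 bits per byte):
    #   1000 Mbit/s = 1000 / 8 = 125  MB/s   (gigabit ceiling, ignoring overhead)
    #    100 Mbit/s =  100 / 8 = 12.5 MB/s
    #
    # Going the other way with my own numbers:
    #   237 MB in 60 s = 3.95 MB/s, times 8 = ~32 Mbit/s
    #
    # As a one-liner:
    awk -v mb=237 -v secs=60 'BEGIN { printf "%.1f Mbit/s\n", mb / secs * 8 }'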