I may have my wires a little crossed (I'm not an electrical engineer), but I was always under the impression that manipulation of the physical characteristics like that, from heat or dampness, didn't reduce the "speed" but the "quality" of the line (line noise, errors, etc.).
Whether it's old telco lines or newer data lines, it's all about electrical signal and bit error rates. More errors = more retransmissions = slower perceived throughput. Just my thinking.

Scott

Joe Greco wrote:
>> http://www.wired.com/gadgets/miscellaneous/magazine/17-10/ts_burningquestion
>
> It used to be that we would notice this, except that it had everything to
> do with temperature *and* dampness. In the '90s, it was still quite
> common for a lot of older outside plant to be really only "voice grade",
> and it wasn't unusual for copper to run all the way back to the CO,
> through a variety of taps and splice points. Even though Ma Bell would
> typically do a careful job handling their copper, the sheer number of
> potential points of failure meant that it wasn't unusual for water to
> infiltrate and penetrate. If I recall correctly, the worst was usually
> a long, hard cold rain (hey, we're in Wisconsin), after which people who
> had been getting solidly high-speed modem connects would see a somewhat
> slower speed.
>
> ... JG
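P.S. The "more errors = more retransmissions = slower perceived throughput" point can be sketched with some back-of-the-envelope math. This is a toy model, not a real line simulation: it assumes independent bit errors and that every corrupted frame is simply resent, and the bit rates and frame size below are purely illustrative.

```python
# Toy model: how a rising bit error rate (BER) eats into perceived
# throughput when every corrupted frame must be retransmitted.
# Assumes independent bit errors; all numbers are illustrative.

def frame_error_rate(ber: float, frame_bits: int) -> float:
    """Probability that at least one bit in a frame is corrupted."""
    return 1.0 - (1.0 - ber) ** frame_bits

def effective_throughput(raw_bps: float, ber: float, frame_bits: int) -> float:
    """Expected goodput: only the fraction of frames that arrive
    clean on a given transmission counts as useful throughput."""
    return raw_bps * (1.0 - frame_error_rate(ber, frame_bits))

# A nominal 33.6 kbit/s modem link with 1500-byte (12000-bit) frames:
if __name__ == "__main__":
    for ber in (1e-7, 1e-6, 1e-5):
        bps = effective_throughput(33600, ber, 12000)
        print(f"BER {ber:.0e}: ~{bps:,.0f} bit/s effective")
```

Even a BER that sounds tiny in isolation gets multiplied across every bit in the frame, which is why a marginally wetter line can show up as a noticeably slower connection rather than an outright failure.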