Confusing observation: I am using a Pentium II 350 MHz with 128 MB RAM, running Windows 98, for intensive algorithms (coherent integration) -- approximately 175 billion calculations per work unit. (The [EMAIL PROTECTED] project, a fascinating project in itself.) On my system the average CPU work time is over 40 hrs per unit. I have noticed that platforms reporting as i386-pc-linux-gnu-gnulibc2.1 are averaging 18 hrs per unit!
I know very little about operating systems. Question 1: How can Linux help a CPU perform these calculations at twice the speed, or am I missing something here? Question 2: Given my admitted ignorance of operating systems, should I even be considering abandoning Windows in favor of Linux? Any help greatly appreciated. -- Art Brown

