> i get the feeling that i will need to read the entire file as i used to read
> it, taking each record and doing the following:
> convert the string record to a bignum record
> convert the bignum record into a byte string
> write the byte string to a new data file
> 
> does that seem right?

never mind. this is indeed what i needed to do. the new file is 438.4 MB, and
the time to read, hash, and write is now 317 seconds, a processing rate of
83818 records/sec. the hash still uses bignums; the speed change comes entirely
from reading and writing bytes instead of strings.
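
in case it helps anyone else, here is a rough Racket sketch of that conversion
loop. it is only a sketch: the file names, the one-decimal-record-per-line
input layout, and the 4-byte length prefix in front of each record's big-endian
bytes are assumptions for illustration, not necessarily what my code does.

#lang racket

;; sketch only: "records.txt"/"records.dat" and the record framing
;; (4-byte unsigned big-endian length prefix, then the payload bytes)
;; are assumed for illustration.

;; minimal big-endian byte-string encoding of a non-negative integer
(define (bignum->bytes n)
  (if (zero? n)
      (bytes 0)
      (let loop ([n n] [acc '()])
        (if (zero? n)
            (list->bytes acc)
            (loop (arithmetic-shift n -8)
                  (cons (bitwise-and n 255) acc))))))

;; inverse: big-endian byte string back to an integer
(define (bytes->bignum bs)
  (for/fold ([n 0]) ([b (in-bytes bs)])
    (+ (arithmetic-shift n 8) b)))

;; convert the text file of decimal records into a binary data file
(call-with-input-file "records.txt"
  (lambda (in)
    (call-with-output-file "records.dat" #:exists 'replace
      (lambda (out)
        (for ([line (in-lines in)])
          ;; assumes every line is a valid decimal number
          (define payload (bignum->bytes (string->number (string-trim line))))
          ;; each record: 4-byte unsigned big-endian length, then the bytes
          (write-bytes (integer->integer-bytes (bytes-length payload) 4 #f #t)
                       out)
          (write-bytes payload out))))))

reading a record back is just the reverse: read-bytes 4, integer-bytes->integer
to get the length, read-bytes that many bytes, then bytes->bignum.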

the drop in file size is about 30% and the gain in speed is about 15%.
