Maybe we could have a challenge: finding the base with the best reduction factor up to base 100,000,000? One probably does not want to store more vectors than that in memory, I guess.

Reducing composites in the integer field
http://jt.node365.se/composite.html


I've made my own kind of primality sieve that works by reducing the number field into composite "legs" and prime "legs". The legs that contain only composites in the number field are thrown away, i.e. sieved out.
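If I read the "legs" right, a leg for a given base b is the arithmetic progression s, s+b, s+2b, ... for a start number s in 0..b-1, and such a leg can only contain primes beyond (possibly) s itself when gcd(s, b) == 1. Here is a minimal sketch of that classification in Python; the function name is mine, not something from the page above:

from math import gcd

def classify_legs(base):
    """Split start numbers 0..base-1 into legs that may still contain
    primes (gcd(start, base) == 1) and legs that are composite-only
    past their first element."""
    prime_legs = [s for s in range(base) if gcd(s, base) == 1]
    composite_legs = [s for s in range(base) if gcd(s, base) > 1]
    return prime_legs, composite_legs

prime_legs, composite_legs = classify_legs(30)
print(prime_legs)           # [1, 7, 11, 13, 17, 19, 23, 29]
print(len(composite_legs))  # 22 of the 30 legs can be thrown away

For base 30 that already removes 22 of the 30 legs, i.e. roughly 73% of the integer field.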

I asked the people at sci.math if there was a known upper limit for the reduction using this type of counter in the integer field, and they said no.

I wonder if there is a name for this type of sieve/counter within information 
theory?

My idea is to find a base with a very high percentage of composite legs, remove all start numbers that produce only composites in their legs, and then store the other legs, the ones that contain primes, in an array.

That array will then serve to model the new integer field, using the searched base with the highest composite reduction.
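Sticking with that reading, the array of surviving start numbers can be walked leg by leg to enumerate the candidates that make up the reduced integer field. A rough sketch (again my own function names, and it assumes the residue-class interpretation above):

from math import gcd

def prime_leg_array(base):
    """Start numbers whose legs can still contain primes beyond the
    prime factors of the base itself."""
    return [s for s in range(1, base) if gcd(s, base) == 1]

def candidates(base, limit):
    """Numbers above base that lie on a surviving leg; every prime
    greater than base is among them, but not every candidate is prime."""
    legs = prime_leg_array(base)
    k = base
    while k <= limit:
        for s in legs:
            n = k + s
            if n <= limit:
                yield n
        k += base

print(list(candidates(30, 100)))
# [31, 37, 41, 43, 47, 49, 53, 59, 61, 67, 71, 73, 77, 79, 83, 89, 91, 97]
# 49, 77 and 91 show the array only removes legs; survivors still need a test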


The script searching the bases takes a few seconds, so there could be a lot of improvements. I have a feeling that using bignum bases as sieves will be out of range for us mere mortals, but who knows, maybe for computer theorists.
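For what it is worth, if the reduction factor of a base b is taken to be the fraction of composite-only legs (ignoring the handful of legs that start at a prime factor of b itself), it equals 1 - phi(b)/b with Euler's totient phi, so the base search does not need to list the legs at all. A quick sketch of that search over a small demo range; the full challenge range of 100,000,000 would want a smarter search, e.g. over primorials only:

def reduction_factor(base):
    """Fraction of composite-only legs: 1 - phi(base)/base, computed
    from the distinct prime factors of base."""
    frac = 1.0
    n, p = base, 2
    while p * p <= n:
        if n % p == 0:
            frac *= 1 - 1 / p
            while n % p == 0:
                n //= p
        p += 1
    if n > 1:
        frac *= 1 - 1 / n
    return 1 - frac

best = max(range(2, 100_000), key=reduction_factor)
print(best, reduction_factor(best))   # 30030 0.808...

If that is the right measure, the winners are the bases built from as many distinct small primes as possible (the primorials 2, 6, 30, 210, 2310, ... and their multiples), since each additional distinct small prime factor shrinks phi(b)/b further.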
