On Sunday, 17 January 2016 at 10:44:47 UTC+1, jonas.t...@gmail.com wrote:
> Maybe we could have a challenge finding the base with the best reduction 
> factor up to base 100 000 000? One probably does not want to store more 
> vectors than that in memory, I guess.
> 
> Reducing composites in the integer field
> http://jt.node365.se/composite.html
> 
> 
> I've made my own kind of primality sieve that works by reducing the number 
> field into composite "legs" and prime "legs". The legs that contain only 
> composites in the number field are thrown away (sieved out).
> 
> I asked the people at sci.math if there was a known upper limit for the 
> reduction using this type of counter in the integer field; they said no.
> 
> I wonder if there is a name for this type of sieve/counter within information 
> theory?
> 
> My idea is to find a base with a very high percentage of composite legs, 
> remove all start numbers that produce only composites in their legs, and then 
> store the other legs that contain primes in an array.
> 
> That array will then serve to model the new integer field, using the searched 
> base with the highest composite reduction.
> 
> 
> The script searching the bases takes a few seconds, so there is a lot of room 
> for improvement. I have a feeling that using bignum bases as sieves will be 
> out of range for us mere mortals, but who knows, maybe not for computer 
> theorists.

At least I am impressed by how well it reduces the composite integer field, but 
of course you need a big array...

The math guys said to just construct the base using the primes 
2*3*5*7*11*13*17... and so on.
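That construction is just the primorial of the first few primes. A minimal 
Python sketch of it (my own counting: a leg is "composite" when its residue 
shares a factor with the base, which may differ from how the linked script 
counts, so the percentage need not match the output below):

```python
from math import prod

primes = [2, 3, 5, 7, 11, 13, 17]
base = prod(primes)                  # the primorial: 510510

# Residues coprime to a squarefree product: Euler's totient
# phi(p1*p2*...) = (p1-1)*(p2-1)*...
phi = prod(p - 1 for p in primes)

# Fraction of residue classes ("legs") that can never contain a prime
# larger than the base's own prime factors
composite_legs = 1 - phi / base
print(base, phi, f"{composite_legs:.4%}")
```

Every extra prime multiplies the surviving fraction by (p-1)/p, so the gain 
per prime shrinks quickly while the base grows multiplicatively.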

NEWBASE = 510510 ------------------------------------------
BASE= 510510 Composite legs= 92.91551585669234% ===============
One could say it is a bit clumsy, but I think it is neat to be able to peel off 
numbers that do not even need to be considered for a primality test.
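For what it's worth, that peeling can be sketched with a small wheel in 
Python. The toy base 2*3*5 and the plain trial-division tester are my own 
assumptions for illustration, not the linked script:

```python
from math import prod, isqrt

primes = [2, 3, 5]           # small wheel for illustration
base = prod(primes)          # 30
# Legs that can still contain primes: residues coprime to the base.
# Note the base primes 2, 3, 5 themselves fall outside these legs and
# must be handled separately, as in any wheel sieve.
legs = [r for r in range(base) if all(r % p for p in primes)]
# only 8 of 30 residues survive: [1, 7, 11, 13, 17, 19, 23, 29]

def is_prime(n):
    """Plain trial division; enough for a demo."""
    if n < 2:
        return False
    return all(n % d for d in range(2, isqrt(n) + 1))

# Enumerate only candidates base*k + r; everything else is pre-sieved away
candidates = [base * k + r for k in range(4) for r in legs]
found = [n for n in candidates if is_prime(n)]
print(found)
```

Numbers like 49 = 7*7 still land in a surviving leg and get tested, which is 
exactly why the reduction has a limit: the legs can only encode divisibility 
by the primes built into the base.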
-- 
https://mail.python.org/mailman/listinfo/python-list
