Indeed-- I wasn't very specific, and have been corrected by Bill Stewart
on this. According to the November issue of Scientific American (to cite one
source), Shor's factoring algorithm makes the resources needed to factor a
number grow only polynomially with its size, as opposed to exponentially
(strictly speaking, the best known classical algorithms are sub-exponential,
but still super-polynomial). Their example was that a 500-digit number would
take a classical computer about 100 million times as many resources to factor
as a 250-digit number, whereas a quantum computer running Shor's Algorithm
would need only 8 times as many. As was pointed out, I spoke too soon: this
would only affect asymmetric algorithms whose security rests on the
difficulty of factoring large numbers. Hehe, but as Leitl pointed out, this
is all theoretical, so I'm kind of shooting blind on this one.
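
For the curious, here's a quick sanity check of those two ratios in Python.
It's only a sketch under assumed cost models: Shor's algorithm taken as cubic
in the number of digits, and classical factoring costed with the general
number field sieve's sub-exponential running time (the exact exponents are my
assumptions for illustration, not figures from the article):

    import math

    def shor_cost(digits):
        # Assumed model: Shor's algorithm is polynomial in the input
        # size; a cubic exponent is used here purely for illustration.
        return digits ** 3

    def gnfs_cost(digits):
        # Assumed model: the general number field sieve, the best known
        # classical factoring algorithm, runs in roughly
        # exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3)) for n ~ 10^digits.
        ln_n = digits * math.log(10)
        return math.exp((64 / 9) ** (1 / 3)
                        * ln_n ** (1 / 3)
                        * math.log(ln_n) ** (2 / 3))

    # Resource ratios: 500-digit number vs. 250-digit number
    print(shor_cost(500) / shor_cost(250))  # 8.0 -- the article's "8 times"
    print(gnfs_cost(500) / gnfs_cost(250))  # ~2e8 -- same order as "100 million"

The cubic model reproduces the factor of 8 exactly, since (500/250)^3 = 8,
and the sub-exponential model lands within the same order of magnitude as the
article's 100-million figure.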
~SAM

> From: David Howe <[EMAIL PROTECTED]>
> Date: Thu, 17 Oct 2002 13:45:01 +0100
> To: "Email List: Cypherpunks" <[EMAIL PROTECTED]>
> Subject: Re: One time pads
> 
> at Thursday, October 17, 2002 2:20 AM, Sam Ritchie
> <[EMAIL PROTECTED]> was seen to say:
>>     ACTUALLY, quantum computing does more than just halve the
>> effective key length. With classical computing, the resources
>> required to attack a given key grow exponentially with key length
>> (a 128-bit key has 2^128 possibilities, a 129-bit key has 2^129,
>> etc.; you all know this...). With quantum computing, however, the
>> complexity of an attack grows only polynomially.
> Is this actually true, or is it that it can scale proportionally in
> time and in the number of qubits required? If you assume that a
> classical machine takes x^2 operations to break a key, but a quantum
> machine takes x operations with x qubits, that would have the same
> effect, provided you can create that many qubits. I haven't seen any
> papers saying it is polynomial, though - can you provide a reference
> or two?
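
On the symmetric side (the "halve the effective key length" figure in the
quoted exchange above), the usual reference is Grover's search algorithm:
brute-forcing N = 2^k keys takes on the order of sqrt(N) = 2^(k/2) quantum
queries, so a 128-bit key offers roughly 64-bit effective strength. A minimal
sketch in Python (the key sizes are just examples):

    import math

    def classical_ops(key_bits):
        # Exhaustive search tries on the order of 2^key_bits keys.
        return 2 ** key_bits

    def grover_ops(key_bits):
        # Grover's algorithm searches N items in on the order of
        # sqrt(N) quantum queries: sqrt(2^k) = 2^(k/2).
        return math.isqrt(2 ** key_bits)

    for bits in (128, 256):
        print(bits, math.log2(classical_ops(bits)), math.log2(grover_ops(bits)))
        # 128 -> 128.0 vs 64.0; 256 -> 256.0 vs 128.0: effective length halved

That's where the "halving" comes from; for factoring, Shor's algorithm does
qualitatively better, which is what the polynomial-vs-exponential claim at
the top of this message is about.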
