Quantum Supremacy — Factoring Large Primes


Quantum Computing Supremacy Fight — Google Versus IBM by Chris Roberts


Multiply 1,048,589 by 1,048,601, and you’ll get 1,099,551,473,989.
Does this blow your mind? It should, maybe! That 13-digit prime number
is the largest-ever prime number to be factored by a quantum computer,
one of a series of quantum computing-related breakthroughs (or at least
claimed breakthroughs) achieved over the last few months of the decade.

An IBM computer factored this very large prime number about two months
after Google announced that it had achieved “quantum supremacy”—a clunky
term for the claim, disputed by its rivals including IBM as well as others, that
Google has a quantum machine that performed some math normal computers
simply cannot.


What would really blow my mind?
Factoring complex numbers like quaternions.


P. S. Obligatory math joke:

Q: How do you solve the Ramen Hypothesis?
A: Use your big noodle.
 
:confused: How can 1,099,551,473,989 be a prime number if 1,048,589 and 1,048,601 are two of its factors?

Good catch. The article should have said 1,099,551,473,989 is the largest product of two primes to have been factored by a quantum computer.

Being able to factor the product of two large primes in a reasonable amount of time would allow you to break the RSA cryptosystem and some similar cryptosystems. That is one of the main reasons why the US government is so interested in funding quantum computing.

For perspective: 1,099,551,473,989 is roughly 40 bits. These days, RSA keys are usually at least 1024 bits, or as many as 4096 bits.
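To make that concrete, here is a minimal sketch in Common Lisp (to match the CLISP session further down) of why knowing the factors gives away the private key. Textbook RSA only, no padding; e = 65537 is just the conventional public exponent, not anything from the article.

Code:
;; Once n = p*q is factored, the RSA private exponent d is simply
;; the inverse of the public exponent e modulo phi = (p-1)*(q-1).
(defun egcd (a b)
  "Extended Euclid: returns (values g x y) with a*x + b*y = g."
  (if (zerop b)
      (values a 1 0)
      (multiple-value-bind (g x y) (egcd b (mod a b))
        (values g y (- x (* (floor a b) y))))))

(defun mod-inverse (e m)
  "Inverse of E modulo M; assumes (gcd e m) = 1."
  (multiple-value-bind (g x) (egcd e m)
    (assert (= g 1))
    (mod x m)))

(let* ((p 1048589)              ; the article's two primes
       (q 1048601)
       (phi (* (1- p) (1- q)))
       (e 65537)                ; conventional choice, assumed here
       (d (mod-inverse e phi)))
  (format t "n = ~d, d = ~d~%" (* p q) d)
  (format t "check: e*d mod phi = ~d~%" (mod (* e d) phi)))  ; => 1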
 
Good catch. The article should have said 1,099,551,473,989 is the largest product of two primes to have been factored by a quantum computer.


I must be missing something
Code:
  i i i i i i i       ooooo    o        ooooooo   ooooo   ooooo
  I I I I I I I      8     8   8           8     8     o  8    8
  I  \ `+' /  I      8         8           8     8        8    8
   \  `-+-'  /       8         8           8      ooooo   8oooo
    `-__|__-'        8         8           8           8  8
        |            8     o   8           8     o     8  8
  ------+------       ooooo    8oooooo  ooo8ooo   ooooo   8

Welcome to GNU CLISP 2.48 (2009-07-28) <http://clisp.cons.org/>

Copyright (c) Bruno Haible, Michael Stoll 1992, 1993
Copyright (c) Bruno Haible, Marcus Daniels 1994-1997
Copyright (c) Bruno Haible, Pierpaolo Bernardi, Sam Steingold 1998
Copyright (c) Bruno Haible, Sam Steingold 1999-2000
Copyright (c) Sam Steingold, Bruno Haible 2001-2009

Type :h and hit Enter for context help.

[1]> (* 1048589 1048601)
1099551473989
[2]>
 
As I understand it, they went the other way.

Given 1099551473989, calculate its prime factors.
Even that doesn't require a quantum computer. A list of primes from 2 to (say) 2,000,000 could be generated on an ordinary PC in no time. (You only need primes up to the square root of the target, which is about 1,048,595.)

Using LISP, factorizing 1099551473989 is also a quick operation.
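Something along these lines; a completely naive trial-division sketch, nothing clever:

Code:
;; Naive trial division: test candidate divisors up to (isqrt n).
;; isqrt of 1099551473989 is 1048594, so at most ~1M iterations.
(defun smallest-factor (n)
  (loop for d from 2 to (isqrt n)
        when (zerop (mod n d)) return d))  ; NIL if n is prime

(let* ((n 1099551473989)
       (p (smallest-factor n)))
  (format t "~d = ~d * ~d~%" n p (/ n p)))
;; prints: 1099551473989 = 1048589 * 1048601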
 
I can't read the original article, but I think the point is not that the feat of calculation was amazing, more that it had been done on a quantum computer at all, a kind of machine that has, up to now, been more theoretical than actual.
 
As I understand it, they went the other way.

Given 1099551473989, calculate its prime factors.
That's still not a significant problem, as psionl0 says. In fact one of the two most obvious starting points for that search (the square root of the target number) is right on top of the answer.
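To illustrate: Fermat's method starts from the square root and, for this particular number, lands on the answer on the very first step, because the two primes sit just 6 on either side of it.

Code:
;; Fermat's method: write n = a^2 - b^2 = (a-b)(a+b), starting
;; from a = isqrt(n) + 1 and walking upward.  Here isqrt(n) is
;; 1048594, and a = 1048595 gives a^2 - n = 36 = 6^2 immediately,
;; so the factors are 1048595 -/+ 6.
(defun fermat-factor (n)
  (loop for a from (1+ (isqrt n))
        for b2 = (- (* a a) n)
        for b = (isqrt b2)
        when (= (* b b) b2)
          return (values (- a b) (+ a b))))

(multiple-value-bind (p q) (fermat-factor 1099551473989)
  (format t "~d * ~d~%" p q))   ; prints: 1048589 * 1048601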
 
Even that doesn't require a quantum computer. A list of primes from 2 to (say) 2,000,000 could be generated on an ordinary PC in no time.

Using LISP, factorizing 1099551473989 is also a quick operation.

It's pointed out in the OP that this is only about a 40-bit number. Actual encryption uses 1024-bit numbers (or higher).

Even a slight speed gain over conventional computers at 40 bits can add up to a major advantage in cracking 1024-bit encryption, which I understand LISP cannot do in a practical amount of time.

And even LISP is going to be slower than optimized machine code, which is going to be slower than embedded logic, which is the level at which quantum computers operate. So it's not really a good comparison.

Also, "a quick operation" in human terms is on the order of a few seconds. In terms of process-intensive computing, timeframes are actually in milliseconds or microseconds. Speed variances can add up to major gains over millions of operations, even if they aren't perceptible to humans in small-scale tests.
 
I don't think psion understands the problem.
Well, clue us in then. The problem cited in the OP isn't difficult. It requires much larger numbers to become difficult. I found the factors of that number before I'd finished the program to do the factoring.

ETA: Got to the article now. Pretty sure it's the author of the article that doesn't understand the problem. There are multiple mistakes indicating that.
 
Google says that its 54-qubit Sycamore processor was able to perform a calculation in 200 seconds that would have taken the world’s most powerful supercomputer 10,000 years. That would mean the calculation, which involved generating random numbers, is essentially impossible on a traditional, non-quantum computer.
Link
 
Note that IBM disputes Google's claim:

Unsurprisingly, IBM, the company that operates the supercomputer that Google claims to have beaten, and a key quantum computer competitor, is disputing Google’s claims. In a blog post published pre-emptively on Monday, the company said that the same task could be performed on a classical system in just 2.5 days, rather than the 10,000 years that Google is claiming. IBM says that Google “failed to fully account for plentiful disk storage” when estimating how long its traditional supercomputer would take to perform the calculation.
And for reference, the problem described in the OP took 40 milliseconds to solve on my desktop with a program written in under a minute.
 
Well, clue us in then. The problem cited in the OP isn't difficult. It requires much larger numbers to become difficult.

Yes. But if quantum computers can do 40-bit factoring now, there may not be much preventing them from doing 4096-bit (or larger) factoring in the future.
 
Note that IBM disputes Google's claim:


And for reference, the problem described in the OP took 40 milliseconds to solve on my desktop with a program written in under a minute.

Yes, but that is NOT the problem that Google claims would take 10,000 years or that IBM says it could do in 2.5 days. Neither of those numbers refers to how long it takes a classical computer to do the factorization. Rather, they refer to how long it would take a classical computer to accurately simulate the quantum computer running the factorization. That requires calculating the time evolution of a very complex wave function, and it's much, much more difficult to do that than to just do the factorization directly.
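Some rough numbers on why the simulation side is so hard (my own back-of-the-envelope, assuming a plain state-vector simulation at 16 bytes per complex amplitude):

Code:
;; A direct state-vector simulation of k qubits stores 2^k complex
;; amplitudes.  At 16 bytes each (two double-floats), Sycamore's
;; 54 qubits need about 288 petabytes -- which is also why IBM's
;; 2.5-day estimate leans so heavily on disk storage.
(defun state-vector-bytes (k)
  (* (expt 2 k) 16))

(format t "54 qubits: ~,1f PB~%"
        (/ (state-vector-bytes 54) 1d15))
;; prints: 54 qubits: 288.2 PB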
 
Yes, but that is NOT the problem that Google claims would take 10,000 years or that IBM says it could do in 2.5 days.
Yes, I get that. Still leaves me baffled as to what the actual advance is that the article in the OP seems to be mangling.

You sure you're correct about what Google's claim was?
 
Yes, I get that. Still leaves me baffled as to what the actual advance is that the article in the OP seems to be mangling.

You sure you're correct about what Google's claim was?

I believe the advance is that for the very first time a quantum computer has solved this (trivial) factorization problem. Sure, a capable desktop can do this in a few seconds.

But this is another step toward getting quantum computers to factor much larger numbers that are the product of two primes, something they can do theoretically but not yet in practice. However, getting the prime factors of very large numbers is something that modern desktops and even exaflop supercomputers can't do.

Modern public/private key encryption is built on very large numbers that are the product of two large prime numbers. The underlying premise is that because even supercomputers can't take the product and recover the two primes that generated it, such encryption is unbreakable. As soon as we get quantum computers that can recover the initial primes, public/private key encryption suddenly becomes breakable.
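A toy version of the idea in Common Lisp, reusing the article's primes (illustration only; real RSA uses vastly bigger primes plus message padding, and e = 65537 is just the conventional choice, not anything from the article):

Code:
;; Anyone can encrypt with the public pair (n, e).  Decrypting needs
;; the private exponent d, and computing d needs phi(n) = (p-1)(q-1),
;; i.e. the factorization of n.  That asymmetry is the whole premise.
(defun mod-expt (base exp m)
  "Square-and-multiply modular exponentiation."
  (loop with result = 1
        while (plusp exp)
        do (when (oddp exp)
             (setf result (mod (* result base) m)))
           (setf base (mod (* base base) m)
                 exp  (ash exp -1))
        finally (return result)))

(let* ((n 1099551473989)   ; = 1048589 * 1048601, the public modulus
       (e 65537)           ; conventional public exponent, assumed
       (m 271828)          ; a toy "message", any integer below n
       (c (mod-expt m e n)))
  (format t "ciphertext: ~d~%" c))

Recovering m from that ciphertext without d means factoring n, which is exactly what a large enough quantum computer running Shor's algorithm threatens.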
 
What would really blow my mind?
Factoring complex numbers like quaternions.


P. S. Obligatory math joke:

Q: How do you solve the Ramen Hypothesis?
A: Use your big noodle.
It would impress me if people understood the difference between 'prime number' and 'prime factor'...
 
Being able to factor the product of two large primes in a reasonable amount of time would allow you to break the RSA cryptosystem and some similar cryptosystems. That is one of the main reasons why the US government is so interested in funding quantum computing.

For perspective: 1,099,551,473,989 is roughly 40 bits. These days, RSA keys are usually at least 1024 bits, or as many as 4096 bits.


Setec Astronomy.
 
Even that doesn't require a quantum computer. A list of primes from 2 to (say) 2,000,000 could be generated on an ordinary PC in no time.

Using LISP, factorizing 1099551473989 is also a quick operation.
I remember when it was a big deal for a quantum computer to find the prime factors of 21. The significance lies in being able to do it at all. Some people were saying we would never even reach 91 with a quantum computer.
 
Any company that could bust RSA encryption and hold patents on the next generation of quantum encryption would be able to charge for a lot more than just advertisements.
 
Busting RSA encryption wouldn't be the end of the world, though; there are other encryption schemes, based for example on elliptic curves, for which no faster quantum algorithm exists. Sure, the world would have to move away from RSA encryption, but it's not like alternatives aren't available.
 
Busting RSA encryption wouldn't be the end of the world, though; there are other encryption schemes, based for example on elliptic curves, for which no faster quantum algorithm exists. Sure, the world would have to move away from RSA encryption, but it's not like alternatives aren't available.

Minor correction: it appears that only some particular key-exchange algorithms based on elliptic curves are quantum-proof; general elliptic-curve encryption schemes are just as vulnerable as RSA. There are still plenty of quantum-proof encryption schemes, though.
 
Moving away from RSA would be a pain in the ass, though.

It's already in progress, at least for TLS. While RSA is still in use, it's a minority of the allowed ciphers in NIST's TLS guidelines now; check NIST SP 800-52r2, Section 3.3.1 (https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-52r2.pdf). The recommendations now include a LOT more ECDSA, DSA, DH, and ECDH. A lot of the public certificate providers are issuing ECDHE certs now as well. RSA is still present, but the move is already starting.

I can't speak much outside TLS, though, so you may have more insight in that area (TLS is primarily what I deal with).
 
