Project Eleven just awarded 1 BTC for "the largest quantum attack on ECC to date", a 17-bit elliptic curve key recovered on IBM Quantum hardware. Yuval Adam replaced the quantum computer with /dev/urandom. It still recovers the key.
> takes each shot's (j, k, r) and accepts d_cand = (r − j)·k⁻¹ mod n iff it passes the classical verifier
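That post-processing loop can be sketched in a few lines. Here a discrete log in the multiplicative group mod a small prime stands in for the elliptic curve (the verifier's shape is identical), and every parameter is illustrative rather than the challenge's actual values:

```python
import math
import secrets

# Toy stand-in: discrete log mod p instead of an elliptic curve.
# All values below are illustrative, not the challenge's parameters.
p = 131101                        # ~17-bit prime
g = 5                             # fixed base
n = p - 1                         # order used in the mod-n arithmetic
d_secret = secrets.randbelow(n - 1) + 1
Q = pow(g, d_secret, p)           # the "public key"

def verify(d_cand: int) -> bool:
    """Classical check: does the candidate reproduce the public key?"""
    return pow(g, d_cand, p) == Q

def recover(shots):
    """Accept d_cand = (r - j) * k^-1 mod n iff the verifier passes.
    The sampler (quantum hardware or /dev/urandom) only supplies the
    (j, k, r) triples; everything decisive happens in this loop."""
    for j, k, r in shots:
        if math.gcd(k, n) != 1:   # k must be invertible mod n
            continue
        d_cand = (r - j) * pow(k, -1, n) % n
        if verify(d_cand):
            return d_cand
    return None
```

Fed uniformly random triples, d_cand is itself uniform mod n, so each shot hits the key with probability roughly 1/n; that per-shot baseline is exactly what the /dev/urandom replacement exploits.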
Judging by the fact that the original code does more classical work than the PRG solution, and, in more practical terms, that it makes network calls, I'd say the quantum-integrated code is a lot slower for this set of problems.
Just to point it out: this isn't a jab at QC but rather at Project Eleven and possibly the submission author. They failed to validate the submission properly, and the code proves that the solution is classical.
Recovering a 17-bit ECC key isn't a challenge for current classical computers via brute force.
OK, so what I don't get is that, from the GitHub page, the statement seems purposely misleading. For the 17-bit key, the quantum computer correctly recovered the key in its single run, while urandom succeeded in 2 of 5 runs. At 5 runs, I don't think one could say with any confidence that the quantum calculation is definitely better, but the reverse should also be true; he hasn't actually proven that urandom performed at a rate equivalent to the quantum calculation. The only thing I can think of is that he is saying the original group should have done more runs on the quantum computer to prove it. But from the framing he is using, it seems like he is disingenuously declaring that the quantum computer is equivalent to a random number generator.
Don't they report an advantage based on simulating quantum effects every other year? I was promised a quick way to decrypt my old hard drives decades ago; can we have that at some point before the sun burns out?
I expect they're just banking on getting their investment back with some fat returns by licensing it to the NSA to decrypt their hoovered-up encrypted comms, with their data storage now reaching the yottabyte level. That's a lotta byte.
You know you're blowing your reputation when such claims are met by scientific articles with the headline, "Google claims 'quantum advantage' again." [1]
I think there is potential in it but it is absolutely going to become the next stock market slop after AI goes bust. You'll see everyone and their mom significantly overpaying for $10 billion random noise generators.
A 17 bit key has 131072 possibilities, which is trivially easy to brute force. Defeating it with a quantum computer is still very much a physics demonstration, and not at all attempting to be a useful computing task.
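For scale, "trivially easy" here means a loop a laptop finishes in well under a second. A sketch, again with a multiplicative group mod a ~17-bit prime standing in for the curve (all values illustrative):

```python
import time

# Exhaustive search over every 17-bit candidate key. A real 17-bit ECC
# break would run the same loop with curve point multiplications; the
# multiplicative group below is just a self-contained stand-in.
p = 131101                  # ~17-bit prime, illustrative
g = 5
d_secret = 77777            # the "unknown" key
Q = pow(g, d_secret, p)     # the "public key"

start = time.perf_counter()
found = next(d for d in range(2**17) if pow(g, d, p) == Q)
elapsed = time.perf_counter() - start
# `found` reproduces Q; on typical hardware this takes a fraction of a second.
```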
The point here is that the quantum computer component of the original solution is not doing anything - that the algorithm being run overall is not actually a quantum algorithm, but a classical probabilistic algorithm.
If the quantum computer were a key component of the solution, replacing it with an RNG would have either no longer yielded the right result, or at least would have taken longer to converge to the right result. Instead, the author shows that it runs exactly the same, proving all of the relevant logic was in the classical side and the QC was only contributing noise.
This was exactly the premise of my sigbovik April Fool's paper in 2025 [1]: for small numbers, Shor's algorithm succeeds quickly when fed random samples. And when your circuit is too long (given the error rate of the quantum computer), the quantum computer imitates a random number generator. So it's trivial to "do the right thing" and succeed for the wrong reason. It's one of the many things that make small factoring/ecdlp cases bad benchmarks for progress in quantum computing.
I warned the project11 people that this would happen. That they'd be awarding the bitcoin to whoever best obfuscated that the quantum computer was not contributing (likely including the submitter fooling themselves). I guess they didn't take it to heart.
You wrote that? Nice piece of work! Came here to post exactly this, that's the sigbovik paper in practice.
I'm still waiting for the Quantum Bogosort version of this "factorisation". For those not familiar with the algorithm, it relies on the many-worlds interpretation and is:
Shuffle the list randomly
If the list is sorted, stop
If it isn’t sorted, destroy the entire universe
Adaptation of this algorithm to factorisation is left as a homework exercise for the student.
Pasting my comment from the other article here - curious to understand the degree to which I'm understanding this.
----
The article itself is maddeningly vague on exactly what happened here.
At first blush, it looks like the quantum computer was just used to generate random noise? Which was then checked to see if it was the private key? Surely that can't be.
The github README [0] is quite extensive, and I'm not able to parse the particulars of all the sections myself without more research. One thing that caught my eye: "The key insight is that Shor's post-processing is robust to noise in a way that raw bitstring analysis is not."
"This result sits between the classical noise floor and the theoretical quantum advantage regime. At larger curve sizes where n >> shots, the noise baseline drops below 1% and any successful key recovery becomes strong evidence of quantum computation."
So... is one of the main assertions here simply that quantum noise fed into Shor's algorithm results in requiring meaningfully fewer "shots" (this is the word used in the README) to find the secret?
Someone help me understand all this. Unless I'm missing something big, I'm not sure I'm ready to call this an advancement toward Q-Day in any real-world sense.
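The "classical noise floor" the README describes can be put in numbers. If each random shot yields a uniformly random candidate key mod the group order n, the chance that S shots contain at least one hit is 1 − (1 − 1/n)^S. A quick check at the 17-bit scale versus a larger (still toy) curve, with illustrative shot counts:

```python
import math

def p_hit(n: int, shots: int) -> float:
    """1 - (1 - 1/n)**shots, computed stably for very large n."""
    return -math.expm1(shots * math.log1p(-1.0 / n))

# ~17-bit group order vs. a 64-bit one, at 10,000 shots:
print(p_hit(2**17, 10_000))   # about 0.07: random guessing is a live option
print(p_hit(2**64, 10_000))   # about 5e-16: a hit would actually mean something
```

That matches the README's framing: at 17 bits random guessing succeeds a few percent of the time, so a single successful run proves little, while at n >> shots the baseline collapses and success becomes evidence.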
Commit history looks vibe-coded, doesn't it? Don't read too much into anything you see there. It's what Claude or Codex wrote after being asked to solve the challenge.
"quantum grifting" has hit the cryptocurrency space brutally.
Scammers can take an old defunct coin or create a new one, buy up/create supply, strap ML-DSA on to it, and pump their shitcoin claiming it's quantum safe, then they can unload.
Eventually low-information retail will get wise to this; I honestly don't know who this even works on right now.
They are missing the point though. The point is not even to be faster but to show that the QC is QCing. It can be slower than random search, and in fact might be expected to be. It’s kind of like early fusion plasma experiments that required vastly more energy than you got from fusion.
We are still doing science and engineering experiments, not making production anything.
QC relies on the observed output being statistically significant. This rebuttal is pointing out that Project Eleven only ran the algorithm once. At this point, there is no proof the IBM QC platform is generating anything statistically significant, especially more significant than the performance of feeding it /dev/urandom.
Basically, there is no proof this was real quantum computing instead of random noise picked up by the hardware inside the QC.
Now, to show against this rebuttal that the QC is doing anything, they should run it a significant number of times and show that it breaks the key more often than feeding the same post-processing a uniformly distributed random noise source like /dev/urandom.
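A minimal version of that comparison, using an exact binomial tail from the stdlib; the counts below are made-up placeholders for whatever the reruns would actually produce:

```python
from math import comb

def binom_tail(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the chance that pure guessing
    does at least as well as the observed run."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical rerun tallies (placeholders, not real data):
n_runs = 100          # independent key-recovery attempts on the QC
qc_successes = 9      # attempts where the QC pipeline found the key
p_random = 0.04       # measured success rate of the /dev/urandom baseline

p_value = binom_tail(qc_successes, n_runs, p_random)
# Small p_value: the QC beats guessing. p_value near 1: it's just noise.
```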
"dequantization" is a thing and it's a very legitimate part of quantum information research. It's useful to probe if something was truly quantum or just smokes and mirrors, because it helps us understand where the boundary between quantum and classical lies. Another dequantized result from the past days: https://arxiv.org/abs/2604.21908
The person who won the challenge with this apparently misleading code seems to have absolutely no quantum computing background. He writes this as background about himself:
> Technology leader with 10+ years in enterprise software, full-stack architecture, and cloud-native development. Background in computer science with hands-on experience across .NET, Python, Rust, and Cloud ecosystems. Currently working as Cloud GTM Specialist focused on solution architecture and sales engineering.
src: https://github.com/GiancarloLelli/quantum/blob/7925f6ec5b57f...
swiftcoder | 18 hours ago
He's not making such a declaration; he is saying that the program is constructed in such a way that the quantum computer is irrelevant to the solution.
iberator | a day ago
weakened algorithms to the extreme (17 bits in 2026 LOL).
wasting_time | a day ago
https://blog.google/innovation-and-ai/technology/research/qu...
IshKebab | a day ago
At least for breaking crypto, which seems to be its headline feature. Maybe there are other useful things it can do?
[1] - https://www.nature.com/articles/d41586-025-03300-4
arcfour | a day ago
If the results are statistically identical to guessing then it seems like you've just built a Rube Goldberg contraption.
[1]: https://sigbovik.org/2025/proceedings.pdf#page=146
NooneAtAll3 | a day ago
perfection
pseudohadamard | 5 hours ago
You're blocking my periscope.
0: https://github.com/GiancarloLelli/quantum
Looking at the commit history, this looks vibe coded: https://github.com/GiancarloLelli/quantum