
The mass of Earth is about 6E24kg. The crust makes up about 1% of that, and silicon makes up about 28% of that. So about 1.68E22kg silicon is available on Earth. Assume we convert all of that to a giant computer, capable of operating at Bremermann's Limit[1]. That would give about 2.28E72 (quantum) operations/second. 2^255 / 2.28E72 ≈ 25400 seconds to count to 2^255. Figure a measly 100 operations to test each key, and you're looking at a month per key to brute-force. And that's ignoring light-speed communication delays between parts of the computer, which would dominate.
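The arithmetic above is easy to check in a few lines of Python. Bremermann's limit is c²/h ≈ 1.36E50 operations per second per kilogram; constants below are rounded:

```python
H = 6.626e-34            # Planck constant, J*s
C = 3.0e8                # speed of light, m/s
BREMERMANN = C**2 / H    # ~1.36e50 ops/s per kg of mass

silicon_kg = 6e24 * 0.01 * 0.28        # Earth mass * crust fraction * Si fraction
ops_per_sec = silicon_kg * BREMERMANN  # ~2.28e72 ops/s

count_seconds = 2**255 / ops_per_sec       # ~25,400 s just to count to 2^255
brute_days = 100 * count_seconds / 86400   # 100 ops per key test -> ~a month
print(f"{count_seconds:.3g} s to count, {brute_days:.1f} days to brute-force")
```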

If it looks like someone is going to build a quantum computer out of the entire mass of the silicon in Earth's crust, I suggest 512-bit keys. That'll keep your secrets safe for about 9E73 years. I'd also suggest finding a new planet to live on, the mining operation would likely be somewhat disruptive.
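The ~9E73-year figure for 512-bit keys appears to come from counting the expected 2^511 trials (half the keyspace, on average) at one operation each; a rough sketch under that assumption:

```python
# Same planet-sized silicon computer as above, ~2.28e72 ops/s.
ops_per_sec = 6e24 * 0.01 * 0.28 * (3.0e8)**2 / 6.626e-34
SECONDS_PER_YEAR = 3.15e7

# Expected number of trials before hitting the key is half the keyspace: 2^511.
years = 2**511 / ops_per_sec / SECONDS_PER_YEAR
print(f"{years:.1e} years")   # on the order of 9e73
```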

For a more realistic comparison, perhaps they've only got a computer with as much mass as the last thousand years of iron ore production at the recent annual world rate (2.5E9 tonnes/year × 1000 years = 2.5E15 kg). Then it'll take around 5000 years to run 2^255 operations.
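The iron-ore number works out the same way (again with rounded constants):

```python
BREMERMANN = (3.0e8)**2 / 6.626e-34   # ~1.36e50 ops/s per kg
iron_kg = 2.5e9 * 1000 * 1000         # 2.5e9 tonnes/yr * 1000 yr * 1000 kg/tonne
years = 2**255 / (iron_kg * BREMERMANN) / 3.15e7
print(f"{years:.0f} years")           # ~5400, i.e. "around 5000"
```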

[1] https://en.wikipedia.org/wiki/Bremermann%27s_limit



> And that's ignoring light-speed communication delays between parts of the computer, which would dominate.

Light speed delays are not relevant to a highly concurrent problem. They would be an issue for a general purpose computer that size running a sequential program.


True if it's not a quantum computer. For a quantum computer, the entanglement effects propagate at the speed of light, so you can't just treat it as independent trials. Of course if it's lots of quantum computers you'd get some advantages, but you can't run Grover's on an ensemble so I'm not quite sure what the resulting complexity would end up being (you can run it on each individual QC, but I don't know how the resulting complexity would be calculated).

Either way it's a bit beyond what's economically possible for any human organization right now. And I implicitly assumed the computation is fully reversible and therefore took negligible energy.


Generalized Quantum Search with Parallelism — https://arxiv.org/abs/quant-ph/9904049

“We generalize Grover's”…

“We extend the analysis to the case of a society of k quantum searches acting in parallel”.

Disclaimer: I know absolutely nothing about the topic, but the first link I googled seems to justify my intuition that this decryption could be partitioned so that many quantum computers could run in parallel (thus avoiding the limit on speed of information transfer you are hypothesising).

> Either way it's a bit beyond what's economically possible for any human organization right now. And I implicitly assumed the computation is fully reversible and therefore took negligible energy.

Agree - I’m just being that contrary Internet guy!


Unless they can figure out reversible computing to the point where their computer doesn't really need any power, they also have to contend with the Landauer limit, so counting to 2^255 (at roughly the current cosmic microwave background temperature) would need about 2^255 × k_B × 3 K × ln(2) / c^2 ≈ 9 million solar masses of fuel (assuming perfect efficiency).
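The Landauer estimate checks out: erasing 2^255 bits at 3 K costs k_B·T·ln(2) per bit, and converting that energy to fuel mass via E = mc² gives about 9 million solar masses.

```python
import math

K_B = 1.38e-23       # Boltzmann constant, J/K
T = 3.0              # background temperature used above, K (actual CMB is ~2.7 K)
C = 3.0e8            # speed of light, m/s
SOLAR_MASS = 2e30    # kg

energy_joules = 2**255 * K_B * T * math.log(2)  # Landauer cost of 2^255 erasures
fuel_solar_masses = energy_joules / C**2 / SOLAR_MASS
print(f"{fuel_solar_masses:.1e} solar masses")  # ~9 million
```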



