Thank you for this correction and additional perspective.
The Debian vulnerability was particularly bad. An AES key with 16 bits of entropy can be broken with the energy used by a single LED for a fraction of a nanosecond.
Reducing entropy covertly is probably the sole purpose of the so-called Intel Management Engine
I'm not sure the Debian vulnerability affected AES keys, but it definitely affected RSA keys.
A single LED is somewhere between 1 milliwatt and 1 watt, so in a tenth of a nanosecond it uses between 100 femtojoules and 100 picojoules. 2¹⁵ AES encryption operations currently require a lot more energy than that. I'm not sure how much, but it's a lot more.
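Those LED energy bounds are easy to check numerically. A minimal sketch, assuming (as above) an LED draws somewhere between 1 milliwatt and 1 watt:

```python
# Energy a single LED uses in a tenth of a nanosecond,
# for the 1 mW and 1 W bounds assumed in the text.
t = 1e-10            # 0.1 nanoseconds, in seconds
low = 1e-3 * t       # 1 mW LED -> 1e-13 J = 100 femtojoules
high = 1.0 * t       # 1 W LED  -> 1e-10 J = 100 picojoules
print(low, high)
```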
How much energy does an AES encryption operation take? https://calomel.org/aesni_ssl_performance.html suggests AES-256-GCM runs at 2957 megabytes per second on each core of an "Intel Gold 5412U", which https://www.intel.la/content/www/xl/es/products/sku/232374/i... tells me is a 24-core CPU launched in Q1 of 02023 with a TDP of 185 watts. https://en.wikipedia.org/wiki/Advanced_Encryption_Standard says the AES block size is 128 bits, so 2957MB/s is 185 million blocks per second per core. Dividing 185 watts by 24 cores, each doing 185 million blocks per second, gives 41.7 nanojoules per block. This is probably reasonably representative of energy requirements for current AES hardware implementations. It presumably doesn't include key setup time, and brute-force cracking will do more key setup than normal encryption, but it's probably in the ballpark, especially for dedicated chips ticking through closely related keys. In any case, key setup surely cannot take less than zero energy, so this represents a lower bound.
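The per-block figure follows directly from the quoted numbers. A sketch of the arithmetic, assuming the TDP splits evenly across cores:

```python
# Energy per AES block from the calomel.org numbers:
# 2957 MB/s per core, 185 W TDP, 24 cores, 16-byte blocks.
throughput = 2957e6                    # bytes per second, one core
block = 16                             # AES block size in bytes
blocks_per_s = throughput / block      # ~1.85e8 blocks/s per core
watts_per_core = 185 / 24              # ~7.7 W, assuming an even split
nj_per_block = watts_per_core / blocks_per_s * 1e9
print(nj_per_block)                    # ~41.7 nJ
```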
Running
openssl speed -elapsed -evp aes-256-gcm
on my own laptop (without -evp, I get "speed: Unknown algorithm aes-256-gcm"), I get 3900 megabytes per second for large block sizes, or 2300 megabytes per second running on battery power.
The 'numbers' are in 1000s of bytes per second processed.
type          16 bytes     64 bytes    256 bytes   1024 bytes   8192 bytes  16384 bytes
AES-256-GCM 353329.81k  1012347.01k  2190564.18k  3178319.19k  3791358.63k  3676427.61k
According to
cat /sys/class/power_supply/BAT0/power_now
I'm using about 12–16 watts to do this (power_now reports microwatts, so it reads 12–16 million), compared to about 6–8 watts when idle. So we can ballpark the AES energy consumption at around 7 watts. Dividing that by 2300 megabytes per second (about 144 million 16-byte blocks per second), it comes out to about 49 nanojoules per block. This is reassuringly similar to the calomel numbers.
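The same division for the laptop measurement, as a sketch using the ~7 W delta estimated above:

```python
# Laptop estimate: ~7 W attributable to AES at 2300 MB/s on battery.
watts = 7.0                            # measured power delta, approximate
blocks_per_s = 2300e6 / 16             # ~1.44e8 16-byte blocks per second
nj_per_block = watts / blocks_per_s * 1e9
print(nj_per_block)                    # ~49 nJ, close to the calomel figure
```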
The number for 16-byte blocks is much lower, like 240 megabytes per second on battery and 360 megabytes per second on AC power. This probably tells us key setup takes about an order of magnitude more energy than encrypting a block, but maybe that's just because AMD was optimizing encryption speed over key setup speed.
2¹⁵ times 40 nanojoules is 1.3 millijoules. This is between 13 million and 13 billion times more than the energy used by a single LED for a fraction of a nanosecond.
Also, 2²⁵⁵ times 40 nanojoules is 2.3 × 10⁶⁹ J, a couple billion times larger than your estimate upthread. It's pretty amazing that in 67 nanoseconds my CPU can encrypt something such that it would require, as far as we know, the resources of billions of galaxies to decrypt without knowing the key.
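The two brute-force totals above, and the LED comparison, can be sketched in a few lines, assuming ~40 nJ per block as estimated earlier:

```python
# Expected brute-force energy: half the keyspace, so 2**15 tries for a
# 16-bit key and 2**255 tries for a 256-bit key, at ~40 nJ per try.
e_block = 40e-9
e_16bit = 2**15 * e_block       # ~1.3e-3 J (1.3 millijoules)
e_256bit = 2**255 * e_block     # ~2.3e69 J
led_low, led_high = 1e-13, 1e-10  # LED energy bounds from above
print(e_16bit / led_high, e_16bit / led_low)  # ~1.3e7 to ~1.3e10 times more
```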
The IME is probably a backdoor, but I don't think we have enough information to say clearly what kind of backdoor.