Another way to look at this is just how efficient the human brain is at the same kind of computation.
On one hand, we have racks of servers (1920 CPUs and 280 GPUs) [1] using megawatts (gigawatts?) of power, and on the other hand we have a person eating food and using about 100W of power (when physically at rest), of which about 20W is used by the brain.
And don't forget, the human grows organically from a complex system of DNA that also codes for way more than playing Go, and is able to perform a lot of tasks very efficiently, including cooperating with others on open-ended activities.
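To put rough numbers on that comparison (the cluster figure is an assumption; even a conservative half a megawatt makes the point):

  # Back-of-envelope efficiency ratio; both figures are rough assumptions.
  cluster_power_w = 500_000   # assume ~500 kW for the AlphaGo cluster (it may be more)
  brain_power_w = 20          # ~20 W for the human brain
  print(cluster_power_w / brain_power_w)   # -> 25000.0, i.e. ~25,000x the brain's power draw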
We can estimate the power consumption: does 1920 CPUs mean cores, or physical packages? If the latter, they're ~100W each, so that's 192kW; if the former, it's a fraction of that, depending on how many cores per package. The 280 GPUs are likely physical packages (not cores, of which there would be far more), and they draw 300-400W each, so roughly another 100kW, bringing the total to ~300kW. Add a bit of overhead and I'd say 500kW (half a megawatt) is a good rough estimate.
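A quick sketch of that arithmetic, with the per-part wattages and overhead factor taken as assumptions rather than published figures:

  # Rough power estimate for the distributed AlphaGo setup (assumed wattages).
  cpu_count, cpu_watts = 1920, 100      # assuming "CPUs" means physical packages at ~100 W each
  gpu_count, gpu_watts = 280, 350       # assuming ~300-400 W per GPU card
  compute_kw = (cpu_count * cpu_watts + gpu_count * gpu_watts) / 1000
  print(compute_kw)                     # -> 290.0 kW of raw compute
  overhead = 1.7                        # assumed cooling/PSU/networking overhead
  print(compute_kw * overhead)          # -> ~493 kW, i.e. roughly half a megawatt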
The figures quoted for the cluster are about 25 times higher than those quoted for a single machine, so I would guess the cluster consists of 25 machines. At the ~500kW estimate above, that works out to 20kW per machine, or about 166 amps at 120 volts, which seems a bit high to me.
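For what it's worth, the per-machine numbers fall out like this (carrying over the assumed ~500 kW total and the guessed 25-machine split):

  # Per-machine power and current draw (assumptions carried over from above).
  total_kw, machines = 500, 25
  per_machine_kw = total_kw / machines
  print(per_machine_kw)                  # -> 20.0 kW per machine
  print(per_machine_kw * 1000 / 120)     # -> ~166.7 A at 120 V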
A single ARM Cortex-A15 core at around 1GHz has more GFLOPS than Deep Blue had across its whole system (11.38 GFLOPS).
1920 CPUs (a 4-core Haswell from 2013 is around 170 GFLOPS) and 280 GPUs (a previous-gen Nvidia K-series card peaks at around 5,200 GFLOPS). That's 1,782,400 GFLOPS, or around 150,000x more processing power than Deep Blue. If they were running latest-gen hardware, they would be closer to 200,000x faster.
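The arithmetic behind those multipliers, with the peak-FLOPS figures above taken as rough assumptions:

  # Peak-FLOPS comparison against Deep Blue (all peak figures are rough assumptions).
  deep_blue_gflops = 11.38
  cpu_gflops = 1920 * 170        # ~170 GFLOPS per 4-core Haswell package
  gpu_gflops = 280 * 5200        # ~5,200 GFLOPS per previous-gen Nvidia K-series GPU
  total_gflops = cpu_gflops + gpu_gflops
  print(total_gflops)                       # -> 1782400 GFLOPS
  print(total_gflops / deep_blue_gflops)    # -> ~156,600, i.e. roughly 150,000x Deep Blue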
Given that Moore's law is slowing down, and given the size of the system, we're a long way from being able to put that in a smartphone.
People are focusing way too much on the current hardware that AlphaGo happens to be running on.
AlphaGo is still a very new program (two years since inception). It will get significantly better with more training, or, equivalently, it will stay at the same strength while running on much less hardware.
Don't read too much into what one particular snapshot in its development cycle looks like. Evolution has had hundreds of millions of years to maximize the efficiency of the brain; AlphaGo has had two years. It's not a fair comparison, and more importantly, it's not instructive as to what the future potential of AI algorithms looks like.
[1] http://www.economist.com/news/science-and-technology/2169454...