Hacker News

Another way to look at this is just how efficient the human brain is for the same amount of computation.

On one hand, we have racks of servers (1920 CPUs and 280 GPUs) [1] using megawatts (gigawatts?) of power, and on the other hand we have a person eating food and using about 100W of power (when physically at rest), of which about 20W is used by the brain.

[1] http://www.economist.com/news/science-and-technology/2169454...



Definitely not anything close to gigawatts: ALL of Google (not DeepMind, but the entire company) uses only about 260 megawatts.

Probably on the order of one megawatt or so.

http://inhabitat.com/infographic-how-much-energy-does-google...


Still, wow! And a human uses about 100 W (~400 W max): https://sustainability.blogs.brynmawr.edu/2012/07/31/underst...


And don't forget, the human grows organically from a complex system of DNA that also codes for way more than playing Go, and is able to perform a lot of tasks very efficiently, including cooperating on open-ended activities.


And building an AI that can play better than itself :)


On the other hand, they've only been working on Alphago for two years.


We can estimate the power consumption: 1920 CPUs, as in cores or physical packages? If the latter, they're ~100 W each, so that's 192 kW; if the former, a fraction of that, depending on how many cores per package. The GPU count is likely physical packages (not cores, of which there are far more), and they draw 300-400 W each, for a combined total of ~300 kW. Add a bit of overhead and I'd say 500 kW (half a megawatt) is a good rough estimate.
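For what it's worth, that arithmetic works out like this (the per-package wattages are the ballpark assumptions from this comment, not published specs):

```python
# Rough power estimate for the AlphaGo cluster, using assumed
# per-package draws (not official figures).
N_CPUS, CPU_WATTS = 1920, 100   # assuming physical packages at ~100 W each
N_GPUS, GPU_WATTS = 280, 350    # midpoint of the 300-400 W range per GPU

cpu_kw = N_CPUS * CPU_WATTS / 1000   # 192 kW
gpu_kw = N_GPUS * GPU_WATTS / 1000   # 98 kW
total_kw = cpu_kw + gpu_kw           # ~290 kW before cooling/overhead

print(cpu_kw, gpu_kw, total_kw)
```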


The figures quoted for the cluster are about 25 times higher than those quoted for a single machine, so I would guess the cluster consists of 25 machines. 20 kW, or 166 amperes at 120 volts, per machine seems a bit high to me.
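Sanity-checking those per-machine numbers (using the ~500 kW cluster guess from upthread and US mains voltage):

```python
# Split a ~500 kW cluster (rough estimate from upthread) across
# 25 machines, since the cluster figures are ~25x the single-machine ones.
cluster_kw = 500
machines = 25

per_machine_w = cluster_kw * 1000 / machines   # 20,000 W per machine
amps_at_120v = per_machine_w / 120             # ~166 A at 120 V

print(per_machine_w, amps_at_120v)
```

Of course, datacenter gear would run on higher-voltage circuits, so the 120 V amperage is just for intuition.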


Sure, but AlphaGo will probably evolve much faster. In a few years it will run on much smaller devices, as happened to chess programs.


Exactly - the watts comparison is a bad one. Stockfish running on an iPhone (~5 watts?) can play world class chess.


A single A15 core at around 1 GHz has more Gflops than Deep Blue had across its whole system (11.38 Gflops).

1920 CPUs (a 4-core Haswell from 2013 is around 170 Gflops) plus 280 GPUs (a previous-gen Nvidia K-series card peaks at around 5200 Gflops). That's 1,782,400 Gflops, or around 150,000x more processing power. If they were running latest-gen hardware, they would be closer to 200,000x faster.
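Redoing that comparison explicitly (the per-part throughput numbers are this comment's ballpark figures, not exact specs):

```python
# Compare the AlphaGo cluster's rough peak Gflops to Deep Blue's.
DEEP_BLUE_GFLOPS = 11.38

cpu_gflops = 1920 * 170    # ~4-core Haswell-era CPUs, ~170 Gflops each
gpu_gflops = 280 * 5200    # previous-gen Nvidia K-series, ~5200 Gflops each

total_gflops = cpu_gflops + gpu_gflops     # 1,782,400 Gflops
ratio = total_gflops / DEEP_BLUE_GFLOPS    # roughly 150,000x Deep Blue

print(total_gflops, round(ratio))
```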

Given the size of the system and that Moore's law is slowing down, we're a long way from putting that in a smartphone.


People are focusing way too much on the current hardware that AlphaGo happens to be running on.

AlphaGo is still a very new program (two years since inception). It will get significantly better with more training, or, equivalently, it will stay at the same strength while running on much less hardware.

Don't read too much into what one particular snapshot in its development cycle looks like. Humanity has had hundreds of millions of years to maximize the efficiency of the brain. AlphaGo has had two years. It's not a fair comparison, and more importantly, it's not instructive as to what the future potential of AI algorithms looks like.


It should use less than 0.5 megawatts: 1920 × 150 W (high-end server CPUs) + 280 × 300 W (Nvidia Tesla cards) ≈ 372 kW.


We need to keep our brains running 24x7, though, so in the long run, DeepMind is more efficient.



