
To say there's no magic in the brain drastically understates the complexity of the brain.

Your brain is several orders of magnitude more complex than even the largest LLM.

GPT-4 has ~1 trillion parameters? Big deal. Your brain has roughly 1 quadrillion synapses, constantly shifting. Beyond that, synapses carry analog signals, not binary ones. Each synapse is roughly equivalent to 1,000 transistors, based on the granularity of the messages it can send and receive.
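
Rough arithmetic, taking the figures above at face value (the 1-trillion-parameter and 1,000-transistors-per-synapse numbers are estimates, not measured facts):

    # Back-of-the-envelope comparison using the figures quoted above.
    gpt4_parameters = 1e12           # assumed ~1 trillion parameters
    brain_synapses = 1e15            # ~1 quadrillion synapses
    transistors_per_synapse = 1_000  # rough granularity estimate

    # Raw count: ~1,000x more synapses than parameters.
    print(brain_synapses / gpt4_parameters)  # 1000.0

    # Weighting each synapse as ~1,000 transistors widens the gap to ~1,000,000x.
    print(brain_synapses * transistors_per_synapse / gpt4_parameters)  # 1000000.0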

It is temporally complex as well as structurally complex, well beyond anything we've ever made.

I'm strongly in favor of AGI, for what it's worth, but LLMs aren't even scratching the surface. They're nowhere close to a human. They're a mediocre pastiche, and it's just as likely that they're a dead end as that they'll ever become AGI.



That kind of explains why humans need to absorb so much less language to train. Even so, it takes about 25 years of focused study to become capable of pushing the frontier of knowledge a tiny bit.


Re: synapses being analog messages, isn't this sort of true of neural networks? In my understanding, the weights, biases, and values flowing through the network are floating-point numbers, so I'd argue they're closer to analog than binary.


You know, that's a good point that I didn't consider in my comment. LLMs typically use 16-bit floats, which gives a roughly comparable number of distinct values per weight.
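
For concreteness, you can count that granularity directly; this sketch just enumerates IEEE float16 bit patterns and isn't specific to any particular model:

    import numpy as np

    # Reinterpret every possible 16-bit pattern as an IEEE float16 value.
    all_bits = np.arange(2**16).astype(np.uint16).view(np.float16)

    # Drop inf/NaN, then count distinct values (+0.0 and -0.0 collapse to one).
    finite = all_bits[np.isfinite(all_bits)]
    print(np.unique(finite).size)  # 63487 distinct finite float16 values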

But then you get into the nuances of spiking vs. non-spiking neurons, etc., which to my knowledge aren't emulated. The brain is extremely complex. I don't believe LLMs do anything similar to inhibitory neuron function, for example.
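
To illustrate the spiking point, here's a minimal sketch contrasting the stateless weighted-sum unit standard networks use with a textbook leaky integrate-and-fire neuron; the threshold, leak, and input values are arbitrary illustrative choices, not anything from a real model:

    # Standard artificial "neuron": a stateless weighted sum plus a nonlinearity.
    def ann_unit(inputs, weights):
        return max(0.0, sum(i * w for i, w in zip(inputs, weights)))  # ReLU

    # Leaky integrate-and-fire: membrane potential accumulates over time,
    # decays ("leaks"), and emits a discrete spike when it crosses a threshold.
    def lif_neuron(input_current, steps=50, threshold=1.0, leak=0.9):
        potential, spikes = 0.0, []
        for t in range(steps):
            potential = potential * leak + input_current
            if potential >= threshold:
                spikes.append(t)   # emit a spike at this timestep
                potential = 0.0    # reset after firing
        return spikes

    print(ann_unit([0.5, 0.2], [1.0, -0.3]))  # one real-valued output, no state
    print(lif_neuron(0.15))                   # spike *timing* carries information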



