
Are there studies of how we lost the creative edge of ambiguity? Media is a wasteland of sanitized takes, and adult media is overexplained as if 10-year-olds were the target.


Research results are more than two things; they carry vastly more semantic load than these synthetic categories/labels.


Analog is next. Software first, then build the machines. No more models, reductions, loss. Direct perception through measurement and differences.


Analog was before, though. General computing was never realized using those architectures; granted, they were mechanical in nature, so that is a big ask, both figuratively and literally.

Maybe we could create continuous-valued electrical computers, but at least state, stability and error detection are going to be giant hurdles. Also, programming GUIs from Gaussian splats sounds like fun in the negative sense.


You've just described vacuum tube computers, as well as all the early transistorized computers. Digital computing is a relatively late concept.


The important difference is that all those early analog computers were either bespoke or suited for a very narrow subset of tasks, like fire control computers. They were very far from general purpose computers, and that is the reason the von Neumann architecture dominates today: we are free to change the operation of the computer without literally changing gears or re-wiring all logic paths. Before, the hardware was the software.


Of course there were analog 'computers', but vacuum tubes were also used to realize digital computers in the early days.

https://en.wikipedia.org/wiki/Vacuum-tube_computer


You have to withdraw from the binary in all senses to begin to imagine what an analog spatial differences measurement could function as.

Again, think software first. The brain is always a byproduct of the processes, though it is discerned as a materialist operation.

Think big: binary computers are toys in the grand scheme of things.


We'd need a real breakthrough in physics to have such a technology that works at a scale even remotely comparable to what a low end digital CPU can do today. The thing is, there's not even any real evidence (at least to my knowledge) that there are useful threads that researchers know to pull on that could yield such a technology. Emulating analog hardware with digital hardware in anticipation of some kind of breakthrough isn't going to have any material benefits in the short to medium term.


You're thinking small; think about software first. Analogs of differentials. Spatial. Specifics. If you get trapped in emulation, you're trapped in counting.


What does that even mean?


Binary computers are toy prisons of counting that model in a reality that has no correct models, not even neuromorphic will gain validity here, leave them behind. Start over with the forms we know are valid in neural syntax and apply them to software.


That's a great sentiment but without code that does what you're describing, that means nothing. Put some code out into the world that demonstrates the idea, because otherwise it's just inspirational words without substance.


It's not a sentiment, it's real coding. Rhythms of the Brain by Buzsaki. Cinema is already a prototype of differences, using space and correlating topological states. It's analog software that's been running for 130 years.

Binary was always a dead-end alley; we knew this going in. How do we escape it?


It's not real coding if you can't point to code and show people how to do it. Where's the GitHub link? Where are the tutorials? Where are the YouTube videos explaining why it's important and how to do it right? You're just saying words, show me something tangible that's not just a vibe.


Closed to public view.


Sure, of course.


You can start here: sports and cinema combined create the syntax of language as a code.

https://www.routledge.com/The-Constraints-Led-Approach-Princ...


How do you decentralize a network that relies on dictionary semantics, the chaos of arbitrary imagery, and the basics of grammatically sequenced signals?

It's oxymoronic. Our communication was developed in highly structured hierarchies for a reason: continual deception, deviance, anarchism, perversion, and subversion always operating in conflict with and contrary to hierarchies.

Language is not self-organizing, and signaling is neither self-learning nor self-regulating. The web opened the already existing Pandora's box of Shannon's admittedly non-psychologically relevant info theory and went bust at scale.


<<There's glory for you!>>


We should be searching for the last story.


Social media is simply an extension from cybernetics to the principles of cog-sci as a "protocol" network where status and control are the primary forces mediated. This is irrefutable: the web was built as an extension of the cog-sci parameters of information as control.

Social media can't be saved, it can only be revolutionary as a development arena for a new form of language.

"The subject of integration was socialization; the subject of coordination was communication. Both were part of the theme of control...Cybernetics dispensed with the need for biological organisms, it as the parent to cognitive science, where the social is theorized strictly in terms of the exchange of information. Receivers, senses of signs need to be known in terms of channels, capacities, error rates, frequencies and so forth." Haraway Primate Visions.

I don't understand how technologists and coders can be this naive about the ramifications of electronically externalizing signals which start as arbitrary in person, and then clearly spiral out of control once accelerated and cut off from the initial conditions.


This really reads to me like an example of pseudo-profound bullshit, and yet I'm sure you do mean something - could you explain what?


The technology of language is designed to fool the receiver. That's its primary goal. Read any substantial text on language post-Western functional linguistics, like Deacon's The Symbolic Species. In his view, "language is a virus or a parasite."

Once language became a strategy of cybernetic and then cog-sci regimes (which is what all computer science is modeled from), the basic principle of control-deception in language became electronic through its perceptions of socialization, which comp sci totally mistakes as information (see above), and then control, accelerated and now automated. The entire point of computer science operating socialization is completely off the rails, mindblowingly simple-minded, and damaging to us. Algorithms A/B tested to succeed are, in essence, suicidal to the survival of our species. We're not optimized to horizontalize communication of this type: arbitrary metaphors and symbols. Language wasn't built for speed, horizontalization or decentralization.

Now read the above again. To call Donna Haraway, the great theorist/historian of cyborg studies and of the development of science into cog-sci, pseudo in any way reveals that you have never grasped anything deep and resonant about human-computer interaction.


I'm afraid I'm not convinced. In particular, there's an obvious objection to your first claim: if language were primarily designed to fool people, then it would be useless, because other people would ignore it. As for the rest, it still isn't clear what you are saying. For example: "the basic principle of control-deception in language became electronic through its perceptions of socialization". Sorry, whose perceptions? The principle's perceptions? Language's perceptions?

It seems you can't explain your ideas clearly. Maybe they just aren't clear ideas.


You're wasting your time wasting my time if you pretend you can't find the "it" in that sentence, one my 14-year-old freshman son identified in 2 seconds. That means you're either a moron, or you play very stupid games.

That language is primarily deception can be factual even while 99% of its users remain unable to detect that deception, and it's not even fully contradictory. What kind of scientist can't hold near-contradictory processes in their working memory to reach correlational theoretic statements? Certainly none that I know.

If you don't know that the animal world of signals heavily discounts arbitrary forms from roles in survival, I don't know what to tell you. Go back to undergrad and start all over again.

The amount of work about language being too indirect to be a valid, stable signal, and thus deceptive, is rather vast, and you pretending it will vanish with that little narrative shuffle "people will ignore it" means you must either be doubly moronic, have no idea about the human capacity for self-deception in signals and mythological thought, or spend your days playing defensive games in debates you just can't win.

I count over 300 papers discussing the deceptive nature of language, beginning with Aristotle.

"...at some point a direct contact must occur between knowledge and reality. If we succeed in freeing ourselves from all these interpretations – if we above all succeed in removing the veil of words, which conceals the truth, the true essence of things, then at one stroke we shall find ourselves face to face with the original perceptions..." (Ernst Cassirer, The Philosophy of Symbolic Forms)


Social media relies on our dead, arbitrary signaling system, language, which once it's accelerated becomes a cybernetic/cog-sci control network, no matter how it's operated. Language is about control, status and bias before it's an attempt to communicate information. It's doomed as an external system in arbitrary symbols.


There are no world models in biology. The idea that Johnson-Laird is being promoted in AI as a solution is sado-masochistic. The brain doesn't compress info about our world; it ecologically relates to it. It doesn't compress, it never has to. How these folk-science ideas of the brain entered engineering from cog-sci mistaken complexes, and how they remain in power, is pretty suspect.


Well, it's true Charles Murray is less than his brain, so maybe others are more than theirs.

Sorry, the WSJ boosting op-eds from discredited, racist theorists like Murray suggests we're verging on a eugenics experiment rivaling apartheid South Africa and Rhodesia.


The drift is probably oscillatory in nature. It's process affecting material. Astrocytes and nanotubules don't directly affect the drift; they are affected by it, simply as the memory is shifted by a material/process interaction.


I haven't heard that they've identified period(s) of, or resonance from, oscillations as a cause of representational drift.

Is this fair to say: astrocyte activations are stable for a longer period of time than other neural activations?


It's a theory of drift postulated by neurobiologists.

Unsure if they're more stable.

