It seems likely that anything that can be expressed in "any programming language" is isomorphic to something that exists in math and can be expressed in the language of mathematics (automata theory, type theory, category theory, pick your poison).
I am implying that Mathematical semantics are broken/incomplete.
Mathematics uses denotational semantics. It implies that pass-by-value and pass-by-reference are equivalent. They are in theory; they aren't in practice. That is why Haskell's lazy evaluation leaks time.
Engineers/physicists intuitively rely at least on operational semantics. It's a higher-order logic, and since Linear Logic, time is localized.
So as a brief summary: laziness leaks time. Whether that's a good or a bad thing depends on what you want/need. It may be a bad thing if you are doing cryptography because time leaks allow side-channel attacks ( https://en.wikipedia.org/wiki/Timing_attack ).
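To make that concrete, here's a rough sketch (my own illustration, with made-up names like leakyEq/constantTimeEq, not taken from any of the linked posts) of the kind of data-dependent timing the Wikipedia article is about: a comparison that returns as soon as it finds a mismatch takes longer the more of the secret the attacker has guessed correctly, and that is exactly the signal a timing attack measures.

    import Data.Bits (xor, (.|.))
    import Data.Word (Word8)

    -- Leaky: the ordinary (==) on lists stops at the first differing byte,
    -- so its running time depends on how much of the secret matched.
    leakyEq :: [Word8] -> [Word8] -> Bool
    leakyEq secret guess = secret == guess

    -- Better: always processes every byte and folds the differences together,
    -- so timing no longer depends on where the first mismatch occurs.
    constantTimeEq :: [Word8] -> [Word8] -> Bool
    constantTimeEq secret guess =
      length secret == length guess
        && foldl (.|.) 0 (zipWith xor secret guess) == 0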
As with all philosophical squabbles - there are arguments for each side. It's up to you to make up your mind based on your particular use-case.
And so, in the context of crypto + Haskell, have a look at these two posts:
I was more wondering how time leaks and laziness relate to denotational vs operational semantics? I couldn't find anything about either in the links (they seem to be general descriptions of Haskell and laziness...?).
Trivially. For any given system 'safety' is an operational concept, not a denotational one.
You can't formalize the notion of 'safety', let alone prove (in the Mathematical sense) that your code has it, without examining its runtime behaviour.
In the words of Donald Knuth: "Beware of bugs in the above code; I have only proved it correct, not tried it."
In one context lazy evaluation may be 'safe' - in another it may be 'dangerous' - the context in which this assertion is made is always about human needs and expectations, not mathematics.
A particular example being that lazy evaluation allows for side-channel attacks in cryptographic systems. That's undesirable - hence operationally 'unsafe'.
>The observer effect in physics can be trivially expressed and evaluated as a mutating getter in any programming language.
>It cannot be expressed in any Mathematical grammar.
What on earth are you trying to say...? There is a whole mathematical treatment of quantum mechanics, measurement effect included (several treatments). More to the point, any algorithm can be formalised in e.g. a Turing machine, or any other equivalent universal model of computation. So there's literally nothing in "computer science" that you cannot express in mathematics.
This reeks of Dunning-Kruger: you don't even know enough about what you're talking about to realise how little sense you're making. I strongly suggest you familiarise yourself with a topic before opining about it. I'd be hard-pressed to believe you ever opened a physics textbook in your life after high school. Why then do you feel the need to talk blindly about it? I never opened a book about aeronautics in my life, therefore I would never presume to lecture anybody about plane-building... That would just be arrogance.
it's like... kinda true that the observer effect can be expressed as a mutating getter... but that isn't useful. The idea that it can't be expressed is really ignorant.
>This reeks of Dunning-Kruger: you don't even know enough about what you're talking about to realise how little sense you're making. I strongly suggest you familiarise yourself with a topic before opining about it. I'd be hard-pressed to believe you ever opened a physics textbook in your life after high school. Why then do you feel the need to talk blindly about it?
You can just address their ideas, this extra stuff isn't helpful.
>>The lesson is for those “experts” who “know” that all reasonable models of computation are equivalent to Turing machines. This is true if one looks just at functions from N to N. However, at higher types, such as the type of our function m, questions of representation become important, and it does matter which model of computation is used.
Mathematics only proves things up to isomorphism. Two things that are theoretically equivalent are not necessarily empirically equivalent.
Or, just the good ol' adage: in theory there is no difference between theory and practice, but in practice there is.
>So there's literally nothing in "computer science" that you cannot express in mathematics.
In addition to the examples in the blog post linked above, you cannot express a mutating getter in Mathematics.
The link you posted has its fallacy in the premise. It's defining "function" as "computable function", then complaining that many results that follow do not agree with the former definition! Either I'm really missing something or this is some poor attempt at trolling / bait-and-switch. Of course you have problems with that "computable function" definition when dealing with functions from Real -> Real, since many (in fact, almost all) real numbers are not computable!
Re mutating getter:
Monads for example are a standard way of formalising state in a pure-function universe. You seem to have some fundamental misconception about what "mathematics" is, since you keep repeating something about purity. Mathematics is merely the rigorous study of formal systems, of which your mystical "mutating getter" is one.
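For the record, here's roughly what that formalisation looks like - a hedged sketch using the standard State monad from mtl/transformers (getAndCount, twoReads and demo are just my illustrative names): a "mutating getter" is a pure function from a state to a (value, new state) pair, threaded along from call to call.

    import Control.Monad.State (State, state, runState)

    -- A "mutating getter": reading the value also bumps the hidden counter.
    -- Denotationally it is still an ordinary function  s -> (a, s).
    getAndCount :: State Int Int
    getAndCount = state (\n -> (n, n + 1))

    demo :: ((Int, Int), Int)
    demo = runState twoReads 0
      where
        twoReads = do
          a <- getAndCount
          b <- getAndCount
          return (a, b)

    -- demo == ((0, 1), 2): the two reads observe different values, yet
    -- everything above is an ordinary pure function.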
You have some deeply flawed (or entirely absent) philosophical upbringing. There are no "foundations" to anything - all definitions are arbitrary.
To assert that a premise is fallacious mandates that you have some prior notion of "fallaciousness".
You don't have an objective criterion for asserting whether one definition is better than another because you don't have a notion of "betterness" - it's all conventional.
It's precisely because I am an over-zealous formalist that I see Mathematics for what it is - grammar, syntax and semantics.
Mutating getter? Sounds like a generator function that yields. Or a pseudo-random number generator implemented with a monad (WorldState, Value). Or any tail-recursive non-void function that carries state without loss?
I'm definitely not an expert, but aren't irrational numbers the best examples to show that these types of constructs already exist in math? You can't calculate them in a finite amount of time. You use an analytical formula or series expansions to get the nth term instead.
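Roughly that idea in code (a hedged sketch; leibnizTerms and piApprox are just illustrative names): the partial sums of a series for an irrational number like pi form a conceptually infinite, lazily evaluated list, and you only ever force the finite prefix you ask for.

    -- Leibniz series: pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...)
    leibnizTerms :: [Double]
    leibnizTerms = [ (-1) ^^ k * 4 / fromIntegral (2 * k + 1) | k <- [(0 :: Integer) ..] ]

    -- Force only the first n terms of the infinite expansion.
    piApprox :: Int -> Double
    piApprox n = sum (take n leibnizTerms)
    -- piApprox 1000000 is roughly 3.14159...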
I just explained three different models to represent side effects without leaving purity. I'm not sure if you are trolling, throwing around random big words, or simply have a really misguided impression about how computer science connects with math.
What is special about a mutating getter? It operates on the state of the object S and outputs a new state S' and some value x. Some programming languages allow you to write 'functions' without explicitly passing all input state into the function, but does that imply anything fundamental or interesting about mathematics or computation? Is that anything more than a type of syntactic sugar?
What is "special" is that the system has a peculiar property: f(x) != f(x).
If we claim to subscribe to denotational (Mathematical) semantics, then the above contradicts the identity axiom.
And it's not any deep, world-changing insight either - it's obvious to anybody who sees that the LHS and RHS are only evaluated at runtime - they don't have any inherent (denotational) meaning, which is why I keep harping on this: programmers use different semantics from mathematicians.
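To make the f(x) != f(x) point concrete, here's a minimal sketch of a mutating getter (mkGetter is just an illustrative name); in Haskell this only typechecks inside IO, but in an impure language the same getter would look like an ordinary function.

    import Data.IORef (newIORef, readIORef, writeIORef)

    -- A mutating getter: every read also advances the hidden state, so two
    -- calls with the same argument yield different results.
    mkGetter :: IO (() -> IO Int)
    mkGetter = do
      ref <- newIORef (0 :: Int)
      return $ \() -> do
        n <- readIORef ref
        writeIORef ref (n + 1)
        return n

    main :: IO ()
    main = do
      f <- mkGetter
      a <- f ()        -- 0
      b <- f ()        -- 1
      print (a == b)   -- False: operationally, f () /= f ()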
We care about things like interaction and control flow structures as first-class citizens - those are precisely the things that have no mathematical equivalents: timeouts, exceptions, retry loops.
It's not "syntactic sugar" - it's necessity for reifying control flow. In the words of the (late) Ed Nelson: The dwelling place of meaning is syntax; semantics is the home of illusion.
Of course it isn't. OP seems to have some kind of misconception about what "mathematics" means, since they keep fixating on a vague concept of purity of notation.
Ceteris paribus there is one significant distinction between Mathematics and Computer Science: state.
The Mathematical Universe is immutable/stateless/timeless. The Computational Universe is mutable/stateful/time-bound.
The observer effect in physics can be trivially expressed and evaluated as a mutating getter in any programming language.
It cannot be expressed in any Mathematical grammar.