Ask HN: Why doesn't math look like a programming language?
12 points by kulikalov on May 20, 2022 | hide | past | favorite | 30 comments
Why do we still use symbols like ∑ instead of just writing it inline as sum()? Or P() instead of probability()? I mean, it's page one in any "Good programmer" book: use descriptive names.
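For instance, here's a rough sketch of what I have in mind (the names sum_over, lower, upper are made up for illustration, not any real standard):

```python
# One way ∑_{k=1}^{10} k^2 might look with descriptive inline names
# (a hypothetical style, just to show the idea):
def sum_over(lower, upper, term):
    return sum(term(k) for k in range(lower, upper + 1))

result = sum_over(1, 10, lambda k: k * k)
print(result)  # 385
```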

I realize that all these funny Greek letters and multiline spelling quirks have been accumulating in the math "vocabulary" for ages because people used to write equations by hand on a confined piece of paper or clay or whatever they were using.

But isn't it possible to modernize the math syntax to use more descriptive names, variables and functions?



I am convinced whatever math people are doing is a completely different way of thinking, and math people seem to like their notation, so I defer to them.

Perhaps there is some super notation that could somehow make math accessible to more people, but I doubt it would look like code, since they are two different tasks.

Mathematicians often deal with systems of multiple equations and multiple variables. Every part connects to everything else. It's like those trippy analog and mechanical things where one part does 5 things at once.

Programming is much more hierarchical. You can deal with one layer of abstraction at a time, thinking in terms of parts that become black boxes when you aren't actively dealing with them.

Sometimes the only challenge in programming is just the size of a task, and it's more tedious than challenging.

Math seems to demand a new way of thinking for basically every task.

Math as code would be amazing for people like me, who basically never at any point have a need to actually do any math beyond copy and paste. But my opinion doesn't matter much, since I'm not actually doing anything interesting, and current notation isn't a significant part of my life.

If I need to solve something, xcas or Maxima understand descriptive names just fine.

If I needed to understand the things in a proper math paper... I would probably be doing the stuff the math people do, and I'd think the notation was great, just like they seem to.

I don't know how they keep track of the meaning of all those symbols. Maybe they don't, and the abstractness is useful?


I like what you wrote and here is my perspective as a mathematician.

A lot of the notation is due to legacy and common practice. For instance, we write y=f(x) and think of x as the independent variable. If I switched it around and wrote x=f(y), that would confuse a lot of people (especially students and non-experts). In one sense it doesn't matter what letters I use, but in terms of how things are normally communicated, it does. If I break with tradition, this places a cognitive burden on the reader, and that detracts from the purpose, which is to communicate ideas. Also, the people who come up with a new idea tend to be the ones who decide on the notation.

After using it for a while one acquires an intuition about the notation. When I see lim I think an infinite process is being described. That could be the limit from calculus or the direct limit in algebra. The result of how things have developed is that we end up with some things that are nonsensical if taken literally but do make sense as they are used.

For instance it is nonsensical to say that x=1 is the solution to x+4=5. What is really meant is that both equations have the same solution set and it’s easy to see from the first equation that this solution set is {1}.

Shakespearean language is weird to modern speakers of English even though it is correct English. In a similar way I think mathematical notation/usage is like a spoken language. One gets a feel for it after learning it and using it for a while. There are unwritten rules that people abide by to facilitate communicating ideas. I imagine a similar thing is true for programming languages.


Why don't you try and show us the result? After one page of doing probability theory you will understand why P(X<Y) is better than probability(X<Y) (or should it be P(random_variable_defined_on_page_7 le random_variable_defined_on_page_23)?). Do you accept this funny char "<"? What convention for variable names should I use? Math people prefer to do math rather than argue about CamelCase or snake_case or whatever.

We have this discussion on HN every few months. Math is not notation, if someone can't get and remember what \Sigma means writing it as "sum" won't help. Yes, sometimes people use cryptic symbols or notation, as many people write unreadable code.


I'm reading HN regularly and didn't see similar discussions. You are right, I should have searched for it before submitting. There's no need to be toxic though, I was genuinely curious.


Sorry, I did not mean to be toxic. But I really believe that anyone complaining about math notation should try to propose something and show us a page or two to judge. I would also appreciate some examples showing possible improvements in notation. I don't understand what's wrong with Sigma; I find this notation beautiful, as well as the integration sign. Math is still taught at the blackboard and the notation works fine there. See for example https://www.youtube.com/c/FredericSchuller - he is very careful to explain all the notation he uses and he writes down more text than the usual professor does. In my opinion it shows that math notation is fine and the problem is that the ideas behind it are complex and hard to grasp.


Math notation is often 2-dimensional and nested. People often don't just write ∑, they write ∑ and then write stuff above and below that ∑ in smaller letters. Then they will nest statements within that statement, and so on. Something like the statement below (admittedly random 'gibberish') is almost impossible to read (or write) when written out linearly, but would be quite easy to visually write and parse if written out in standard math notation.

sum(div(integ(0,inf(),pow(n,-x*pi()),x),sum(p,n,q),0,n))
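For comparison, here is one plausible 2D reading of that linear string, assuming sum(body, lo, hi) means a sum from lo to hi and integ(lo, hi, body, var) an integral (the string is deliberately gibberish, so this is only a guess at the intended grouping):

```latex
\[
  \sum_{0}^{n} \frac{\int_{0}^{\infty} n^{-x\pi}\,dx}{\sum_{n}^{q} p}
\]
```

Even under an arbitrary reading, the fraction bar and the sub/superscript placement do the grouping work that the nested parentheses struggle with in the linear form.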


I mean it's page one in any "Good programmer" book: use descriptive names.

Even programming has lots of this sort of shorthand going on. You probably use several of them every day without really thinking about it. We write "def" instead of "define", "struct" instead of "structure", "var" instead of "variable", "mut" instead of "mutable", "puts" instead of "put_string", "std::vector" instead of "standard_library::vector", and you can probably think of 50 more if you put your mind to it. I'm sure even you have named a loop index variable "i" or "idx" at some point. Are you also advocating we change all of these to make them more descriptive?


Demands for accessibility always try to smuggle in the idea that X is impenetrable to newcomers.

Who cares? I don't care about newcomers. I don't care what they think is hard. If they are any good they will get over it and find out what is actually difficult with programming.

People with no sway or impact don't get to scream confusion and force the competent to alter anything.


Off the top of my head:

- More international: I can read an equation from a French mathematician without issue

- Historical: you point this out to an extent, but part of it is that to read papers you need to know the notation anyway.

- When reading a paper in mathematics I will often extract some equation from the text and play with it on paper to better understand the idea; again, this is easier with a more terse notation.

- Most importantly, mathematics is a different process from software engineering. Mathematicians still write more than they type, particularly on chalkboards. This is not a point to be hand-waved away. Mathematics usually involves a lot of writing and playing with these symbols in notepads and on blackboards, either alone or in collaboration with other mathematicians. The speed and conciseness of the symbols helps a great deal in this process. Switching to more descriptive forms in text would just end up confusing as people try to translate between symbols and text.


Re-stating your last point: Computer code is read more than it is written, but math may be written more than it is read. That leads to a different sweet spot on the terse-verbose axis.


It's not lack of paper that's the limit here. It's how much you are able to see at once. Personally I feel like the notation is not the problem. However what is a problem is that mathematicians have a tendency to use notation without defining it (and mathematically notation is not as standardised as you might think).


Standard notation like sigma for sum isn't really a descriptiveness issue, everyone knows what that means. Not dissimilar to programming languages using `+` instead of `plus()`. Non-standard variables are where more descriptive names would be useful, though the tendency to use single letter (greek or otherwise) names is reinforced partly by writing `ab` for `a*b`.

Also much of math notation is effectively 2d. Programming langs use linear text. Latex syntax is sometimes used for a linear-text representation (even if it's not intended to be rendered), but it reduces readability.

And people still write equations by hand a lot, because the notation is more efficient than typing it out.


LaTeX has at least solved the concatenation problem: ab means a*b, whereas \text{ab} is a word.
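To illustrate (a minimal snippet; \text comes from the amsmath package):

```latex
% In math mode, adjacent letters are read as a product of variables:
$ab$          % italic a next to italic b, i.e. a times b
% \text{} (amsmath) switches back to upright running text:
$\text{ab}$   % the two-letter word "ab"
```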


I literally just mentioned that. LaTeX is AFAIK a language created to describe mathematical notation


> But isn't it possible to modernize the math syntax to use more descriptive names, variables and functions?

That's exactly what programming is. The issue is, programming languages do a lot of other things that are irrelevant to mathematics, and the syntax is so much more verbose that it doesn't make sense for pure math. You don't want control structures interrupting your train of thought every other line when trying to understand an equation, analogous to spaghetti code filled with gotos.


I would rather prefer the other way around and see APL more used for math-oriented programming tasks …


Why? Isn't it harder to learn and write?


APL is often said to be easier for non-programmers, because it's a mathematical notation and doesn't deal with control flow. Dataflow languages in general seem to be more popular among people who don't already know conventional programming languages.


Probably yes, if you learned programming starting with procedural languages. There is a steep learning curve in the beginning, but even if you never use APL for "normal" projects, the way an array programming language makes you think about data transformations is worth the effort. It is also quite practical for small to medium tasks involving mainly mathematical operations, e.g. data analytics. Not so for mainstream / web / systems etc.

After having a grip on the notation, I would say the time/effort to put an algorithm or idea into code is probably comparable to other languages (if not less).
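As a rough illustration in Python (not APL, but the same contrast between looping over elements and composing whole-array operations; the function names are made up):

```python
# Element-at-a-time, procedural style:
def normalize_loop(xs):
    total = 0
    for x in xs:
        total += x
    out = []
    for x in xs:
        out.append(x / total)
    return out

# Whole-array style, closer to how an array language composes operations:
def normalize_array(xs):
    total = sum(xs)
    return [x / total for x in xs]

print(normalize_array([1, 1, 2]))  # [0.25, 0.25, 0.5]
```

The second version reads as a single data transformation rather than a sequence of mutations, which is the habit array languages tend to instill.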


Err, the math symbols are language-independent. There's no guarantee that the words for sum and probability start with s and p. Why switch one arbitrary set of symbols for another?


But it is very much its own language, with the same flaws as any other: you need to know the mathematical context for the symbols to make sense. There is a meaning behind them, but they often mean different things depending on the field and sometimes even contradict each other.


Hmm... I would guess there are more people in the world who know what "sum" means than "∑".


So, since when is math not for mathematicians. Who cares about median understanding.


Who cares what random people know? People using math know what ∑ means.


I think it's an interesting question.

Why stick with 2D math? If you're senior and already trained your brain to work in 2D math then it's a no brainer to continue to exploit what you know. If you're young, 2D maths is orthogonal to programming languages and I think very much worth the investment of practice: work on problems from a different perspective; very dynamic, domain and scale free; work offline; utilize large amounts of learning material; read early literature.

Why does 2D math break the naming conventions we have in programming languages? You don't get autocomplete with pen and paper; you have to write out intermediate steps; and there's no backspace key. So offline bytes are as expensive today as they were 100 years ago, whereas the cost of online bytes in programming languages has dropped exponentially.

Why doesn't 2D math stick to 1D linear forms? Notation inventors design their languages to take advantage of what the tools can do. Languages looked different before we had syntax highlighting. The tools for 2D math (pen + paper) lack many features of the IDE, but they do have two dimensions with variable cell sizes. 2D math notations exploit those to make writing (and doing intermediate computations) easier.

Why not build programming languages for every construct seen in mathematics? People do! You've got everything from symbolic languages like Mathematica to matrix languages like APL. If you've seen it written in 2D math, someone has probably turned it into a language or library.

What will the future of 2D math look like? I expect much like the present. Maybe someone will discover some brilliant new trick, but I don't think that likely. Perhaps with better hardware and AI we'll have advanced reMarkable-like tablets that blend the best of 2D handwritten math with the best features of IDEs. I expect 2D math has probably plateaued and will remain around as a useful orthogonal tool until AI takes over most jobs from humans. (In the end, both 2D math and programming languages might not matter; it might just be binary + AIs running the show.)

Recommended reading:

- A History of Mathematical Notations by Florian Cajori (1929)

- Syntax-Directed Recognition of Hand-Printed Two-Dimensional Mathematics https://dl.acm.org/doi/abs/10.1145/2402536.2402585 (1967)

- A Review of Two-Dimensional Programming Languages (1972) https://dl.acm.org/doi/10.1145/942576.807009


I suspect it's because mathematics has a longer history, and although specific notation has evolved over time, going against the grain in a single paper is going to be hard.

Using computers more and more to prove things might change this though?


Whilst I sympathise, putting everything inline would have its problems. For example, division symbols become much less clear and require parentheses. Exponentiation is easier to appreciate done in the math style. And so on.


Sympy is your friend. It has all kinds of squiggles. https://www.sympy.org


Erh. I thought you wanted more squiggles, not less. My bad.


I think what you’re looking for is precisely LaTeX



