> that traditional mathematical convention requires single glyph variable names due to the unfortunate decision to save paper by denoting the product by juxtaposition.
It's not to save paper or because of the product. You don't know the solution to the problem you are working on from the beginning, and most of the time is spent writing and writing and writing in a scratchpad trying to work out what you need. Anything longer than a single glyph for variables would be too tedious, so everyone evolved to use single letters, and then papers are written with the same convention since it's natural. You do have variable names, though, via subscripts, with the added benefit that they can be (and are) used to elegantly group related variables together, giving you a sort of abstraction
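As a small sketch of the subscript convention described above (the symbols here are illustrative, not from the comment itself), related quantities can share one base name:

```latex
% A point's two coordinates grouped under the single symbol p:
\[
  p = (p_x,\, p_y), \qquad \lVert p \rVert = \sqrt{p_x^2 + p_y^2}
\]
```

One glyph, `p`, names the whole object, while the subscripts index its parts, much like a struct field in code.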
I once wrote a comment about this here on HN: the language of maths is not a programming language used to tell a computer how to go from A to B, but a natural language used to talk about maths between peers. Every natural language has idioms, inconsistencies and other quirks. Polish will not change to make it easier for you to learn; it will change in ways that let Polish people communicate better with each other, which also includes a lot of historical and cultural happenstance. Same with maths
There are attempts like Esperanto and other artificial languages, and I think any attempt at 'codifying' maths into some programming language has the same chance of wide adoption
> There are attempts like Esperanto and other artificial languages, and I think any attempt at 'codifying' maths into some programming language has the same chance of wide adoption
Aren't existing programming languages already a kind of codified artificial maths dialect that has seen wide adoption?
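To make the point concrete (a toy example of my own, not from the thread): the classic summation notation "sum over i from 1 to n of i squared" is already routinely written as executable code:

```python
# The mathematical expression  sum_{i=1}^{n} i^2  as ordinary code.
def sum_of_squares(n):
    # range(1, n + 1) mirrors the bounds i = 1 .. n of the sigma notation
    return sum(i * i for i in range(1, n + 1))

print(sum_of_squares(3))  # 1 + 4 + 9 = 14
```

The notation is unambiguous enough that a machine evaluates it, yet readable enough that humans use it daily.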
Programming languages are more for humans than for computers. Otherwise we’d be writing our programs in 1s and 0s, and extending our editors in Emacs Binary and VSCode BinaryScript.
> language in maths is not a programming language used to tell a computer how to go from A to B, but a natural language
Right, we're on the same page, I just think this is a bad thing and you evidently think it's a good thing. I'm well aware many mathematicians don't, because it's how they were trained and unlearning is the hardest kind of learning. The ambiguity[1] of natural language is observably ill-suited for formal reasoning, and the experience of computing science has shown this conclusively.
Do bear in mind that the pioneers in our field were virtually all trained mathematicians. They were well aware of the historic faults of the field because having to make programs actually work forced them to be.
The legacy fuzzy pencil-and-paper approach of traditional mathematics will end up standing to proper formal mathematics as what's now called philosophy stands to formal logic.
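As a toy illustration of what machine-checked ("proper formal") mathematics looks like in practice, here is a one-line proof sketch in Lean 4 (using the core `Nat.add_comm` lemma):

```lean
-- A statement and its machine-verified proof: addition on the
-- natural numbers is commutative.
theorem my_add_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

The proof either type-checks or it doesn't; there is no room for the ambiguity of natural-language argument.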