In that case, the GCC extension is part of the input language (in a broader sense).
Still, your example clearly shows the difficulties involved in practice.
The set of all possible inputs for a compiler is infinite, too. Does that mean that compilers are all hamstrung by the halting problem as well? Nope. Having an infinite number of possible inputs (an infinite input set) does not prevent you from showing that a program halts. I think most compilers are written in a way that can be shown to halt, as well.
The Halting Problem does apply in the general case, but if you carve up your programs and reason about them, you can still show that you can have a halter. The Halting Problem just states that there does not exist a method that will take an arbitrary program and show it to be a halter.
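The "carve up your programs and reason about them" idea can be made concrete with a toy termination checker. This is a minimal sketch (the AST shape and `provedToHalt` are invented for illustration): it certifies halting only for a restricted fragment (constant-bounded loops) and answers "don't know" for everything else, i.e. it is sound but incomplete, which is exactly what the Halting Problem permits.

```javascript
// Toy AST: "seq" (sequence), "loop" (constant-bounded loop),
// "while" (unbounded loop); anything else is a straight-line statement.
// provedToHalt is sound (never wrongly says "halts") but incomplete.
function provedToHalt(node) {
  switch (node.type) {
    case "seq":
      return node.body.every(provedToHalt);
    case "loop":
      return Number.isFinite(node.bound) && node.body.every(provedToHalt);
    case "while":
      return false; // "don't know", not "doesn't halt"
    default:
      return true;  // straight-line statements always halt
  }
}

const bounded = {
  type: "seq",
  body: [{ type: "loop", bound: 10, body: [{ type: "stmt" }] }],
};
const unbounded = { type: "seq", body: [{ type: "while", body: [] }] };

console.log(provedToHalt(bounded));   // true
console.log(provedToHalt(unbounded)); // false
```

Every program the checker accepts is a halter, and no universal method is claimed for the rest.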
The point about the macros being Turing-complete is trivial in this instance, because if you wrote a macro that never terminated, you'd never compile anything to decompile ;-).
No, you are incorrect about the point being trivial. The macros used in the original program will of course terminate, but if you simply "try every input to the compiler" as is being suggested, you will be attempting to compile any number of non-terminating macros, and (per the Halting Problem) you will of course not know which terminate and which do not.
The point is that "compilers terminate for every input" is trivially false.
Non-termination of some compiler inputs is a red herring. The only modification required to the original algorithm is that the search proceed in parallel (such that every input eventually makes arbitrary progress).
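A dovetailed search along those lines can be sketched as follows. This is a toy, not a real compiler: `compileSteps` is a stand-in for a compiler that fails to terminate on some inputs (here, multiples of 7), and each round gives every candidate input one more step, so no single non-halting input can block the search.

```javascript
// Toy stand-in for a compiler: yields intermediate steps; on multiples
// of 7 it "never terminates" (yields forever), otherwise it outputs 2*n.
function* compileSteps(n) {
  if (n % 7 === 0) {
    while (true) yield null; // simulated non-termination
  }
  yield 2 * n; // final compiled output
}

// Dovetailing: in round k, add candidate input k and then give every
// candidate one more step. Non-halting inputs burn one step per round
// but every input still makes arbitrary progress.
function dovetailSearch(target) {
  const machines = [];
  for (let round = 0; ; round++) {
    machines.push({ input: round, gen: compileSteps(round) });
    for (const m of machines) {
      const { value, done } = m.gen.next();
      if (!done && value === target) return m.input;
    }
  }
}

console.log(dovetailSearch(24)); // 12, found despite 0, 7, 14, … never halting
```

The same schedule works for any enumerable input set, which is why non-termination of individual inputs does not break the search.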
I agree with your point in a sister thread that decidable doesn't imply practical, but the claim of the original article was undecidable, which has a technical meaning.
I would qualify that statement by saying "'compilers with Turing-complete macros or type systems terminate for every input' is trivially false". The point may not be trivial (which I'm not ready to concede), but it still doesn't matter. The author's approach implicitly relies on the assumption that programs aren't written in this (rather strange) way. If it's safe to assume you can eliminate programs written in a way that will never finish compiling, then the Halting Problem as you've stated it doesn't apply.
I think you may be right about the point you're trying to make, but my point is that yours covers a negligible fraction of the programs the author would hope to analyze, so it doesn't matter.
My point is not about checking if the given program will halt or not.
The parent comment's suggestion was: (1) for each possible input program, (1.a) compile it, and (1.b) check if the result equals the given compiled code.
Agreed, steps (1.a) and (1.b) terminate deterministically (for a given compiler).
However, the search space for this search procedure is infinite.
Similarly, it would be impossible, in general, to exhaustively test every possible input of a compiler.
If you know the compiler (and every other tool involved in the process), then yes, there exists at least one solution (the original input), and the search will eventually find it.
Assuming all possible inputs to the compiler cause it to halt. Not true for many high-profile languages.
And while not strictly related to "is it decidable", in the real world we have to keep in mind the complexity of the solution. The existence of a naive solution for a particular language doesn't say much about how well we can actually do it. Is the problem actually decidable within the confines of the physical observable universe for real-world inputs?
"Assuming all possible inputs to the compiler cause it to halt."
If the compiler did not halt, then we don't have anything to disassemble in the first place.
Formally, it is obvious that a diagonalization-based search will eventually find a correct input, regardless of the halting status of any given input. In practice, none of this matters very much.
I agree, but I would like to quote the author: "This isn’t an article where I try to convince you to write your code like I write mine."
He does not really try to force his style.
I respect his approach, because even if the article seems to be about aesthetic preferences, I think he is concerned about the lack of curiosity and education shown by some programmers. What the author really says is: RTFM. None of the comments here focus on the conclusion of the article, which is the most important part:
"Cozy up with some hot chocolate and the ECMAScript spec some Saturday afternoon. Practice a little. Play around with some test programs. It’s a good time.
Or don’t do that, if you don’t feel like it. It’s your life. You almost certainly have better things to do with it.
Just please stop making authoritative claims like “terminate all lines with semicolons to be safe.” It’s not any safer, or more reliable."
Some programmers like to know everything about a language they are using, knowing all the edge cases, keywords no-one else uses, the exact execution order of tokens and weird quirks. They are masters of complete understanding, and very occasionally this mastery allows them to come up with a revolution in the way things are done. Let's call these the Scholars.
A lot of others don't, though: the joy of programming for them is not mastery of the grammar but writing the story. They would rather use a combination of run-it-and-see and diving into the documentation only when totally necessary. Do not mistake this for an amateur cargo-culting their code; these programmers know exactly what their code does, they just learn the bits they need as they need them. Let's call these the Mavericks.
I struggled to find neutral words for both schools of thought, don't read too much into the names I picked.
Both types can be excellent hackers. And of course we have a spectrum in between Scholars and Mavericks.
I am a Maverick. I find the Scholars, with their memorization of all the rules, to be both awe inspiring and entirely tiresome at the same time. I don't enjoy sitting down with a cocoa and a spec because I'd prefer to be programming. There is a vast swathe of every spec and every library that is entirely useless for the task I have at hand, quite often for any task I would ever do. I see anything which forces me to learn that useless knowledge as a waste of time.
"Use semicolons for every statement" is a simple and easily remembered rule, one which takes no effort compared to reading a spec that delivers little or no benefit to me. I will also tend to forget these rules because my mind just doesn't see them as particularly useful information; I don't use them enough. I don't have any motivation to force that knowledge into my brain through deliberate study because, to a Maverick, it's boring.
There is no lack of curiosity or education in the latter type of programmer. I read this type of article, which Scholars so often write, with exasperation. Mavericks just don't care about learning the intricacies of language grammar compared to doing something else in programming, and that something else is just as intellectually demanding.
Please remember that the next time you come across a Maverick, we're just of different schools, not of different ability.
If you are a Maverick, then that's fine. Write code however you like. Seriously, I care less than anyone.
Just stop talking trash about how I do things. If the story is more interesting than the grammar, then shut up about grammar and go back to writing stories.
My problem is with telling people that there is a problem when there is no problem, and at the same time calling yourself a scholar.
If you don’t understand how statements in JavaScript are terminated, then you just don’t know JavaScript very well, and shouldn’t write JavaScript programs professionally without supervision.
You obviously do care to such an extent that you're saying that we shouldn't even be programming without you looking over our shoulder.
I appreciate your reply. I am not sure I am really on the Scholars' side of the spectrum myself, but I surely love knowing how to use my tools appropriately (Googling instead of memorizing everything is efficient). There was an article recently on HN (http://news.ycombinator.com/item?id=1906070) about "bad" habits in Perl (what about using the safer, clearer three-parameter form of open?): as you and others expressed more clearly than I did, it is the cargo-cult side of programming that bothers me. I was also disappointed that most of the comments here fight over stylistic considerations, whereas the author's point was "just please stop making authoritative claims", i.e. uninformed claims. But don't get me wrong: I clearly noticed that people here know their subject quite well, even those who chose the always-use-semicolons side of this war.
I have a problem with the way the author puts forward his position: "I am sorry that, instead of educating you, the leaders in this language community have given you lies and fear. That was shameful."
As folks have said here, there is a valid alternative position. He's arguing as if there's no valid alternative ... which is problematic.
Annoying? Poor usability? It looks like a button, reacts to mouseovers, and would totally fit on some game's website. Besides, it is not the point here.
My thoughts too. While that might be a silly button to have for many sites, it looks like something I've seen in more than one video game menu. So, it's not without merit.
The original argument is not about the minifier, but about semicolons.
By the way, why would your better minifier introduce a semicolon here? It is obviously a useless character in this example.
Also, I don't understand how adding a semicolon could help against the "stupid error" you cite. Could anyone please explain?
I don't actually remember what closure advanced mode does. It's quite possible it does remove ";" in some instances. But really, you either need a newline or a ";" to mark the end of a statement, so the question is fairly moot - it's gonna be 1 byte either way.
If you always put semicolons to mark the end of a statement, then seeing a statement without one would be like seeing a Mega Bloks brick in your Lego box: glaringly obvious and extremely offensive.
IMHO that bracketing style sucks and shouldn't be used, at least in JS.
If you write one statement per line, the code is very clear, and there is no need for semicolons. It is after all possible, according to the syntax, and a minifier should be able to understand it, as well as your browser.
I am certainly not against coding practices or readability. With (Q)BASIC you could also add semicolons and have multiple statements per line, but who did? It seems to be the same with JavaScript: semicolons are separators between statements, as are newlines. Why write both? Just to be sure the statements are well separated?
It may be 'clear' to you, but as has been pointed out, it can be ambiguous as to what the code will do. A line starting with an open bracket is one such case.
Why force yourself to remember edge cases, when you can just remove that whole class of bugs by being explicit about what you mean...
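Both the open-bracket case and the classic `return` pitfall can be demonstrated directly. This is a minimal sketch; the function names are invented for illustration:

```javascript
// Pitfall 1: a newline right after `return` triggers automatic
// semicolon insertion, so the object literal is never returned.
function broken() {
  return
  { value: 42 }; // unreachable: parsed as a block after `return;`
}

function fixed() {
  return { value: 42 };
}

// Pitfall 2: a line starting with `(` continues the previous statement,
// so this parses as `const b = a(function () {})()`.
function parenPitfall() {
  const a = 1
  const b = a
  (function () {})() // TypeError at runtime: a is not a function
  return b
}

console.log(broken());      // undefined
console.log(fixed().value); // 42
let threw = false;
try { parenPitfall(); } catch (e) { threw = true; }
console.log(threw);         // true
```

A leading semicolon on the paren line (or an explicit one on the line before) removes the ambiguity, which is exactly the class of bugs the rule is meant to eliminate.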
That theory is too simplistic. It is based on a very selfish and static vision of a relationship: "the other must think like me", "people never change". Arguing and debating with people about your respective points of view does not prevent you from liking/loving them.
If you always hang out with the same boring people, who are always on your side, how will you learn new and interesting things outside of your usual points of interest?
".. while this approach [randomness] has actually been proven to be a disappointment .. "
I did not see any reference linking randomness and divorce rates, even as a mere correlation. Where is the "proof"?
I would not rely on that theory in practice, because it abstracts away too much of the complexity between people and does not solve the real problems in relationships.
As a filtering algorithm for dating sites, it may give many false positives (you do not like that person, who nonetheless shares many of your interests), and may prevent you from meeting a "perfect" girlfriend only because she is/seems/looks radically different from you.