
The relative priorities of && vs ||, or & vs |, match the traditional precedence in logical expressions: "and" binds more tightly than "or", just as * binds more tightly than + ("and" is equivalent to * for one-bit values, and "or" plays the role of +; strictly speaking it is "xor" that is addition modulo 2). So I think that they got this correct.

However, the precedence of & vs &&, or & vs ||, etc is a source of problems.



& vs && doesn't feel like a problem to me.

& vs == is the real problem: `(val & MASK) == CONSTANT` needs parentheses due to this mistake.

`a & (b == c)` almost never makes sense (== gives you a boolean, and when dealing with booleans you can use && instead), yet that is what you get by default if you omit the parentheses.


The example given in the post is that & used to be the only conjunction operator, before && was added. Therefore, it was normal to write “if (a==b & c==d)”. While that style is definitely not used anymore, the historical context is useful for explaining why the precedence of & is so low.


Off the top of my head I can’t think of anything that should bind more weakly than equivalence.


Really? Do you think equivalence should be weaker than && and || so that you would write

  if ((a==0) && (b==0))
instead of

  if (a==0 && b==0)


I think equivalence and equality should be different operators. 2==x should be a syntax error, because equivalence compares Boolean expressions (and possibly their extensions, depending on the language). Equality should be checked with the customary sign, and assignment should be some visually asymmetric operator like :=. As you say, equality should bind more strongly than the Boolean-typed operations, including conjunction, disjunction, and equivalence.

Tangentially, I wonder if

  if (a+b == 0)
generates more efficient code than `if (a==0 && b==0)` in presently popular languages with that syntax.


Sorry, I can't follow your reasoning. Probably I'm missing some basic vocabulary, because I don't appreciate the difference between equivalence and equality. Are we still talking about comparison operators?


I think GP is calling for == to only compare boolean values ("equivalence"), = to be necessary for comparing any other values ("equality"), and := to be used for assignment. Though I don't see the purpose in that, given that two boolean values are equal if and only if they are equivalent.

Unless "equivalence" is supposed to be useful for comparing boolean expressions with unbound variables? But evaluating that would require a built-in SAT solver, to be remotely performant.

Also, just because two integers sum to 0 doesn't mean they're both equal to 0, so replacing (x == 0 && y == 0) with (x + y == 0) wouldn't be valid. Regardless, it wouldn't make for more performant code: compilers already translate (x == 0 && y == 0) into the assembly equivalent of ((x | y) == 0) automatically, without the programmer having to mess with their code.


> Also, just because two integers sum to 0 doesn't mean they're both equal to 0, so replacing (x == 0 && y == 0) with (x + y == 0) wouldn't be valid

Indeed, I forgot to specify unsigned.


That still wouldn't be valid: either unsigned integers wrap around on addition (the default behavior of most languages), in which case nonzero values can still sum to 0; or unsigned integer overflow raises an error, in which case the transformation is dangerous unless both integers are strictly bounded in magnitude.

Unless the integers were unsigned big-integers, in which case performing the long addition with carries would take Θ(n) time, as opposed to the simple Θ(1) operation of just checking both their bit-lengths.


Yes, the latter issue comes up when the user meant to type && but typed & instead.


> However, the precedence of & vs &&, or & vs ||, etc is a source of problems.

Do you have any examples of this? In my experience you almost always want the bitwise operators to have a higher precedence than the logical operators, as using the result of logical operators as a bitmask makes little sense. Consider e.g. `A & B && C & D`, which currently is equivalent to `(A & B) && (C & D)`, but with reversed precedence would be equivalent to the almost nonsensical `A & (B && C) & D`.

Now, the precedence of == with respect to & and | is actually problematic (as Ritchie admits as well). Having `A == B | C` interpreted as `(A == B) | C` is almost never desirable. For extra annoyance, the shift operators do have higher precedence than comparison, so `A == B << C` does what you usually want (`A == (B << C)`).


It usually comes up when the user intended to write && but writes & at one point in an expression; the different precedence can produce an unexpected result. But gcc and clang have warnings for that.


The only operators with precedence between & and && are ^ and |, though. In e.g. the expression from the sibling comment, `a & b == c`, writing & or && doesn't make any difference (aside from short-circuiting). I guess it's an issue if you write & instead of && in an expression that also involves a bitwise-OR (or | instead of || in an expression that also involves logical-AND), but that seems quite rare. I expect that reversing the precedence of the bitwise and logical operators would create problems a lot more often.


> match the traditional precedence in logical expressions

arithmetic expressions?



