
Convert them to arithmetic. If you ignore the casting,

  a && b   is just   a * b
  a || b   is just   a + b
Now you can remember the precedence between them (except in broken languages, of which the only notable one is shell).
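
For instance, in C (toy values; the point is just the grouping):

    #include <stdio.h>

    int main(void) {
        int a = 1, b = 0, c = 1, d = 1;
        /* && binds tighter than ||, just as * binds tighter than + */
        int logic = a && b || c && d;    /* parsed as (a && b) || (c && d) */
        int arith = a * b + c * d;       /* parsed as (a * b) + (c * d)    */
        printf("%d %d\n", logic, arith); /* prints "1 1" for these values  */
        return 0;
    }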


> except in broken languages, of which the only notable one is shell

All binary messages in Smalltalk (messages with selectors consisting of punctuation, like !@+-, including punctuation sequences like "+-&|", if you want) have the same precedence. The keyword variants and: and or: (which short-circuit, using block arguments) also share a single precedence, one level lower than the binary/punctuation messages, but because they take block arguments they are usually disambiguated anyway:

    a | (b & c) "parens needed to ensure b & c is evaluated first"
        ifTrue: [self doSomething]
        ifFalse: [
                "because the and: is sent in a block arg to or:, it won't be evaluated unless or:'s receiver is false"
                (self hasPendingTask or: [self updateTaskQueue and: [self hasPendingTask]])
                        ifTrue: [self processTask]]
Smalltalk is extremely elegant, powerful, and simple. 6 reserved words, 3 levels of operator precedence, and not much syntax to learn. It's what r5rs Scheme should have been.


> Smalltalk [...] what r5rs Scheme should have been.

I'm a fan of both languages, but R5RS Scheme was to be an algorithmic language, and Smalltalk is a particular flavor of OO language (class-instance, single dispatch).

Would you say that doing conditionals and Boolean expressions with Smalltalk's object semantics and `ifTrue:ifFalse:` and mix of `and:` and `&` etc. is cleaner than Scheme's `if`, `and`, etc. syntax?

> 3 levels of operator precedence,

Scheme doesn't see what's wrong with fewer:

    (if (or a (and b c))
        (do-something)
        (if (or (has-pending-task)
                (and (update-task-queue)
                     (has-pending-task)))
            (process-task)))


By "should have been" I meant striking the balance of simplicity, power, and ease of learning that Smalltalk does. R5RS is simple and powerful (even more powerful due to macros), but not really usable, since it doesn't ship out of the box with something like Smalltalk's object model to streamline creating ADTs and code reuse. Instead they give you the rudiments (functions, closures) and leave everything else up to you.


Yeah, the lack of a basic record/struct/object was felt quickly in R5RS. (Sure, you can whip up something atop vectors, or alists or other pair structures, or some arrangement of closures.)

SRFI-9 soon introduced a record type, and various Scheme implementations introduced much more.

Racket (née MzScheme, or PLT Scheme) was one of the implementations that introduced an object system, which was neat in some ways (e.g., mixins) but rougher in others, and thankfully it was limited to pedagogic and GUI use. There was also at least one CLOS-like. Later, Racket got a `struct` concept with some interesting hooks (e.g., inheritance/subtyping), and some simpler version of that might've been a good candidate for R5RS.

I don't know where RnRS has been going recently, but I could imagine a fundamental record/struct type, or interfaces more like the current Rust thinking.


Some interesting parallels between the original Scheme and message-passing, from "The First Report on Scheme Revisited" by Sussman and Steele:

> We were very pleased with this toy actor implementation and named it “Schemer” because we thought it might become another AI language in the tradition of Planner and Conniver. However, the ITS operating system had a 6-character limitation on file names and so the name was truncated to simply SCHEME and that name stuck. (Yes, the names “Planner” and “Conniver” also have more than six characters. Under ITS, their names were abbreviated to PLNR and CNVR. We can no longer remember why we chose SCHEME rather than SCHMR—maybe it just looked nicer.)

> then came a crucial discovery. Once we got the interpreter working correctly and had played with it for a while, writing small actors programs, we were astonished to discover that the program fragments in apply that implemented function application and actor invocation were identical! Further inspection of other parts of the interpreter, such as the code for creating functions and actors, confirmed this insight: the fact that functions were intended to return values and actors were not made no difference anywhere in their implementation. The difference lay purely in the primitives used in their bodies. If the underlying primitives all returned values, then the user could (and must) write functions that return values; if all primitives expected continuations, then the user could (and must) write actors. Our interpreter provided both kinds of primitives, so it was possible to mix the two styles, which was our original objective.

> But the lambda and alpha mechanisms were themselves absolutely identical. We concluded that actors and closures were effectively the same concept.


My problem here is that I distinguish between multiplication and addition by seeing which one distributes over the other.

  a * (x + y) = (a * x) + (a * y)
However, the reverse doesn’t work…

  a + (x * y) =? (a + x) * (a + y)
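
For ordinary numbers a concrete counterexample:

  1 + (2 * 3) = 7,  but  (1 + 2) * (1 + 3) = 12
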
Why is this a problem?

  a ∧ (x ∨ y) = (a ∧ x) ∨ (a ∧ y)
  a ∨ (x ∧ y) = (a ∨ x) ∧ (a ∨ y)
Both are true. So, who are we to say that one corresponds to multiplication, and the other corresponds to addition? The two operations are too similar to each other.


Actually, if you think about Boolean logic as mod 2 arithmetic, xor is addition rather than or.


Because 1 ∧ 0 = 0, the same as 1 * 0 = 0, and 1 ∨ 0 = 1, the same as 1 + 0 = 1.


Choosing false = 0 and true = 1 is putting the cart before the horse.

It is equally true that 1*0=0 is the same as false|true=true, and 0+1=1 is the same as true&false=false.

But it is also not true that 1+1=1, so it is probably wrong to equate 'or' with '+'. The operation has the wrong properties.

As someone who sometimes dabbles in electronics, 0 = true makes a lot of intuitive sense to me. You have your pin with an open collector, your pull-up resistor, and “true” (as in, it is true that the transistor is conducting) pulls the voltage to ground, which is 0.

As someone who uses a Unix shell, 0 = true makes a lot of intuitive sense to me.

  $ true; echo $?
  0
  $ false; echo $?
  1


You can interpret 0 and 1 as probabilities. 1 + 1 = 1 in this case makes sense because P(A or B) = P(A) + P(B) - P(A and B). You can interpret "A or B" as a set union and "A and B" as a set intersection. Of course it's easy to draw a three-way correspondence between Boolean arithmetic, the events represented by the empty set and the whole space, and sets within some universe because all the objects are so simple, but these correspondences also generalize well to systems with more than two possible values. The ease of generalizing makes me think it's not just a matter of coincidence or convention that we have 0 <=> false.
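
Worked out for the 1 + 1 case: if P(A) = P(B) = 1, then P(A and B) = 1 as well, so

  P(A or B) = 1 + 1 - 1 = 1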


You've just moved the point where we make the arbitrary choice to here:

> You can interpret "A or B" as a set union and "A and B" as a set intersection.

{True, False, Or, And} and {False, True, And, Or} are two different naming conventions for the exact same structure: the unique boolean algebra on two elements.


A union B is defined as the set of things that are in A or in B; A intersect B is defined as the set of things that are in A and in B. So I don't really see it as an arbitrary choice.


Also if you forget which is multiplication and which is addition, remember false is 0 and true is 1, then work it out from there: a * b is only nonzero if both of them are nonzero, so it's AND.


Isn't it x ^ y which is like x + y?


Yeah, | is technically x + y + x*y over GF(2) (this is the Boolean ring to Boolean algebra relation [1]). The GP is still a good way to remember the precedence though.
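
A quick sanity check of that identity on single bits (a sketch, with ^ playing GF(2) addition and & playing multiplication):

    #include <assert.h>

    int main(void) {
        /* over GF(2): + is XOR (^), * is AND (&) */
        for (int x = 0; x <= 1; x++)
            for (int y = 0; y <= 1; y++)
                assert((x | y) == (x ^ y ^ (x & y)));   /* x|y = x + y + x*y */
        return 0;
    }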

[1]: https://en.wikipedia.org/wiki/Boolean_ring#Relation_to_Boole...


^ is like + when operating over bits, since (bit)2 == 0

| is like + when operating over bools, since (bool)2 == 1


Yeah. To sum two bits x and y, the immediate sum is x XOR y, and the carry is x AND y. That's how a simple half-adder circuit can be made.
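
A minimal C sketch of such a half-adder (names made up):

    #include <stdio.h>

    /* half-adder on single bits: sum = x XOR y, carry = x AND y */
    static void half_add(int x, int y, int *sum, int *carry) {
        *sum   = x ^ y;
        *carry = x & y;
    }

    int main(void) {
        for (int x = 0; x <= 1; x++)
            for (int y = 0; y <= 1; y++) {
                int s, c;
                half_add(x, y, &s, &c);
                /* carry then sum, read as a 2-bit binary number, is x + y */
                printf("%d + %d = %d%d\n", x, y, c, s);
            }
        return 0;
    }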


xor is more like a subtraction...


Is there an intuition for this correspondence? Otherwise I don't think it's very helpful.


There is:

The neutral element for + is 0 (x + 0 = x for any x).

The neutral element for * is 1 (x * 1 = x for any x).

Furthermore, you have arithmetic properties like x * 0 = 0 for any x (annulation) or (x + y) * z = (x * z) + (y * z) for any x, y, z (distributivity).

Similarly:

The neutral element for OR is false (x OR false = x for any x).

The neutral element for AND is true (x AND true = x for any x).

Furthermore, x AND false = false for any x, and (x OR y) AND z = (x AND z) OR (y AND z) for any x, y, z.

So OR works very much like + algebraically, and AND works very much like *.

When using 0 and 1 for false and true, AND is exactly the same as multiplication, and OR is like addition with saturation arithmetic (i.e. 1 + 1 = 1).

The common precedence rules stem from those parallels.
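
A tiny C check of that last correspondence (a sketch; sat_add is just a made-up helper for the 1 + 1 = 1 saturation):

    #include <assert.h>

    /* saturating addition on {0, 1}: anything above 1 clamps to 1 */
    static int sat_add(int x, int y) {
        int s = x + y;
        return s > 1 ? 1 : s;
    }

    int main(void) {
        for (int x = 0; x <= 1; x++)
            for (int y = 0; y <= 1; y++) {
                assert((x && y) == x * y);           /* AND is multiplication     */
                assert((x || y) == sat_add(x, y));   /* OR is saturating addition */
            }
        return 0;
    }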


There's also a connection to probability: if you want the probability of A and B and C happening and they're independent, it's P(A) * P(B) * P(C). If you want the probability of A or B or C happening and they're mutually exclusive, it's P(A) + P(B) + P(C).

Similar analogue for set theory, as another commenter pointed out.


I've never heard of saturation arithmetic and now it all makes sense. Otherwise I thought it was more common to think of XOR as boolean addition, and OR would be represented as xy + x + y.


Yes, true and false with AND and XOR form a mathematical ring [0]. Still, OR is also an additive operation. IMO one could give OR and XOR the same precedence.

On the other hand, there is no strict need to have a dedicated boolean XOR operator, as for booleans it works the same as != (not equals).

[0] https://en.wikipedia.org/wiki/Ring_(mathematics)


More than just a ring, it is the simplest finite field.


Saturation is actually the wrong way to think about bools when other operations get involved:

saturate(0 - 1) = 0

bool(0 - 1) = 1


Fun (but not particularly useful) fact: you can also represent x OR y as 1 - (1 - x)(1 - y). This shouldn't be too weird, since x OR y is equal to NOT ((NOT x) AND (NOT y)).


I always remember it in terms of set operations. && corresponds to set intersection, while || corresponds to set union. The union operation is similar to "adding" two sets together. You also have the distributive property: a && (b || c) == (a && b) || (a && c).

The analogy isn't perfect, because || is also distributive over &&, but addition isn't distributive over multiplication. I think this is actually one of the essential properties that distinguishes a Boolean algebra from a ring. Someone with more knowledge of abstract algebra could probably provide more insight here, though.


Conjunction used to be called logical product, and disjunction logical sum, for the reason the other commenters pointed out.


a * b * c is only nonzero if all of a, b, c are nonzero. That's AND, and should be pretty intuitive.

a + b + c is nonzero if any of them are nonzero. (Remember each value is either 0 or 1.) So that's the intuition for OR.


a+b+c is nonzero if any of a or b or c are nonzero.


Yeah, I just added that. I was hesitant initially because you also have to note that they're nonnegative, and that you're treating them like real numbers rather than working mod 2.


Natural numbers are the natural numbers for me! (ISO 80000-2, of course)


IMO the intuition is to not use any intuition at all: there are no built-in booleans in C; true is a #define for 1 and false is a #define for 0. For C conditionals, 0 = false and nonzero = true. So

a+b != 0 <=> a!=0 or b!=0

a*b != 0 <=> a!=0 and b!=0

Of course this intuition also reveals the pitfall behind this correspondence! You'd better make sure those are unsigned ints or #defined booleans, so you're not using general C expressions. 1 || -1 is true but 1 + (-1) is false.

Edit: forgot to mention: INT_MAX || 1 is true, but what about INT_MAX + 1 :)
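
A small sketch of that signed-int pitfall:

    #include <stdio.h>

    int main(void) {
        int a = 1, b = -1;
        /* as a condition a || b is true, but a + b is 0, i.e. false */
        printf("a || b = %d, a + b = %d\n", a || b, a + b);
        /* and INT_MAX + 1 is signed overflow -- undefined behaviour, so the
           "+ means OR" trick can't even be evaluated safely there */
        return 0;
    }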


Then what is _Bool, if not a built-in boolean type?


I genuinely forgot stdbool.h existed, I don't actually write that much C myself.

My point was more around the conditionals being weakly typed around unsigned ints rather than a specific lack of built-ins. A lot of commenters were going into arithmetic mod 2 or philosophical issues, neither of which actually apply here.


_Bool isn't part of stdbool. It's a language keyword. C23 introduces full support for bool with true and false as keywords.


"true is a #define for 1" is bad idea. Because when `x` is, say, `2` then `if x` is not the same as `if x==true`.


That's just the way C is. If you want to check "truthiness" (as much as you can in C), just do 'if x'.


Something nice in C is that "if (x)" is always equivalent to "if (x != 0)". NULL is a macro for 0, and so is false. Boolean expressions evaluate to 0 if they don't hold. This isn't true in C++, though.


Kudos to the people that responded to explain why the correspondence is helpful.


Yes, logical AND is sometimes called logical multiplication and logical OR logical summation.

It seems clear to me, because I remember learning De Morgan's laws in electronics class and from one specific level of the game Turing Complete.


This is the notation used in the chapters about Boolean algebra in my digital design course. I think it's pretty neat. I honestly never looked into operator precedence in any language closely enough to notice the relation.


No. Then I have to think about it.

I have (gently) jumped on peoples' cases for not using parentheses in expressions involving these operators, and will continue to do so, thanks.


Depends on the shell. Murex, for example, follows the usual order of precedence correctly.


  0x8000 && 2       !=  0x8000 * 2
  -1 || 1           !=  -1 + 1
  0x8000 || 0x8000  !=  0x8000 + 0x8000

(with a 16-bit integer; adjust accordingly for larger word sizes)
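
Roughly, in runnable form (a sketch using uint16_t to stand in for the 16-bit word):

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint16_t a = 0x8000, b = 2;
        uint16_t product = a * b;   /* wraps around to 0 in 16 bits    */
        uint16_t sum     = a + a;   /* 0x8000 + 0x8000 also wraps to 0 */
        /* the logical operators only care about zero vs. nonzero */
        printf("a * b = %d, a && b = %d\n", product, a && b);   /* 0 vs 1 */
        printf("a + a = %d, a || a = %d\n", sum, a || a);       /* 0 vs 1 */
        return 0;
    }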


> except in broken languages, of which the only notable one is shell

The array language people (APL, J, K) are going to come in and protest, but you aren't going to be able to understand them.

/s


Hey, even Dijkstra agrees with us on this one:

> I remember how much more pleasant the predicate calculus became to work with after we had decided to give con- and disjunction the same binding power and thus to consider p ∧ q ∨ r an ill-formed formula.

https://www.cs.utexas.edu/users/EWD/ewd13xx/EWD1300.PDF (page 4-5)


1+1=1? My maths education was a long time ago, but I triple checked with my calculator, and I'm uncertain this is quite right.

My own mnemonic: SAXO. Shift, And, Xor, Or. (Like a real Saxo, it's fun to go a bit faster like this, but you do have to trust everybody involved, because any accident is probably going to end up badly for you.)

And now you know the bitwise precedence as well! And this ordering actually works out tidily for common operations: "x=a<<sa|b<<sb|c<<sc"; "x=p>>n&m"; "x=x&~ma|a<<sa"; and so on. You do need brackets sometimes, but fewer than you'd think, and it helps the unusual cases stand out.

(Main annoying thing: a lot of compilers, even popular ones such as clang and gcc, don't actually seem to know what the precedence rules actually are, and generate warnings asking you to clarify. Presumably the authors didn't realise that C has an ISO standard, that can be consulted to answer this question? Very surprising.)
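
For illustration, a C sketch of those idioms (field layout and names made up), leaning only on shift binding tighter than &, which binds tighter than |:

    #include <stdio.h>

    int main(void) {
        unsigned a = 0x3, b = 0x5, c = 0x9;
        unsigned sa = 8, sb = 4, sc = 0;

        /* pack three fields: << binds tighter than |, so no brackets needed */
        unsigned x = a << sa | b << sb | c << sc;

        /* unpack the middle field: >> binds tighter than & */
        unsigned mid = x >> sb & 0xF;

        /* replace the low field: & and << both bind tighter than | */
        x = x & ~0xFu | 0x7 << sc;

        printf("x = 0x%x, mid = 0x%x\n", x, mid);   /* x = 0x357, mid = 0x5 */
        return 0;
    }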


> (Main annoying thing: a lot of compilers, even popular ones such as clang and gcc, don't actually seem to know what the precedence rules actually are, and generate warnings asking you to clarify. Presumably the authors didn't realise that C has an ISO standard, that can be consulted to answer this question? Very surprising.)

The compilers do know what the precedence rules are, but they know that programmers don't routinely consult the ISO standard, so they emit those warnings to reduce the chance of error. Compilers are tools that are designed to help programmers avoid bugs. If they don't help programmers, they aren't doing their job.


Alternatively, they're encouraging programmers to be stupid and ignorant, causing a dumbing-down feedback loop which is ultimately damaging in the long term.

It ain't no surprise if you see the crap that passes for software these days and the nosedive in quality, but that's a rant for some other time...


> 1+1=1?

True + True = True, because "true" means "not zero" and "false" means "zero". In almost all languages that permit int->bool casting, if(2) will evaluate to "true".

The warnings clang and GCC generate are style warnings, because the expression is unclear on a casual reading. Even readers who know the precedence rules will typically want to insert the parentheses manually. If the meaning were undefined, the compiler would give an error, not a warning.


It’s true as long as you don’t chain more than a couple billion Boolean checks.

More importantly, it only holds if you don't rely on short-circuiting logic.


The analogy between logical AND and multiplication, and logical OR and addition is called Boolean Algebra, and it's very well known.

The analogy is that when you set 1 to true and 0 to false, 1 becomes the identity for AND, and 0 becomes the identity for OR. Just as 1 is the identity for multiplication and 0 is the identity for addition.

  X * 0 = 0   |   X ∧ F = F
  X * 1 = X   |   X ∧ T = X
  X + 0 = X   |   X ∨ F = X
  X + 1 = ?   |   X ∨ T = T   <-- This one breaks the analogy

With this, you can turn any logical expression into something that looks and feels like normal algebra, with the only weird exception being that both operators distribute over each other:

  A * (B + C) = A*B + A*C
  A + (B * C) = (A + B) * (A + C)
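
A quick brute-force check that both of those really do hold over 0/1 values (a sketch, with OR modelled as saturating addition):

    #include <assert.h>

    static int bor(int x, int y)  { return (x + y) > 0; }   /* OR as saturating + */
    static int band(int x, int y) { return x * y; }         /* AND as *           */

    int main(void) {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                for (int c = 0; c <= 1; c++) {
                    /* A * (B + C) = A*B + A*C */
                    assert(band(a, bor(b, c)) == bor(band(a, b), band(a, c)));
                    /* A + (B * C) = (A + B) * (A + C), which has no
                       counterpart in ordinary arithmetic */
                    assert(bor(a, band(b, c)) == band(bor(a, b), bor(a, c)));
                }
        return 0;
    }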


I suspect that most people here won’t get the joke. Renault pulled out of the US market 40 years ago, and was never more than a tiny player.


Better than that: the Saxo was from Citroen, who apparently haven't served North America since 1974 ;) The Renault equivalent would probably be the Clio.

(Not a big hot hatch connoisseur though, I must admit - I just remember the Saxo in particular as having a reputation of hitting a bad spot on the tradeoff graph for flimsiness/power/good sense of average member of target demographic.)


1+1 isn't zero, so it's one.



