int-to-bool conversion issues


I ran into a really annoying issue; I'm not sure whether it is a Clang bug or a C++ language issue.

But basically, the following code:

bool foo(bool a) {
    return ~a;
}

collapses into:

bool foo(bool a) {
    return true;
}

The explanation is simple:

  • a is implicitly converted to int using zext
  • int operator~(int) has the unexpected behavior described below
  • then icmp ne 0 is used to convert to bool and is always true
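The collapse can be reproduced directly in source (a minimal sketch, not the compiler's output): `a` is promoted to int, `operator~` flips all the bits of that int, and the result is nonzero for both inputs, so the conversion back to bool always yields true.

```cpp
// Minimal reproduction of the steps above (a sketch, not the thread's
// library code). The intermediate int is never zero, so foo is constant.
constexpr bool foo(bool a) {
    return ~a;  // ~0 == -1, ~1 == -2: never zero
}

// The intermediate int values, spelled out:
static_assert(~static_cast<int>(false) == -1, "all bits set");
static_assert(~static_cast<int>(true)  == -2, "all bits set except bit 0");

static_assert(foo(false) == true, "-1 converts to true");
static_assert(foo(true)  == true, "-2 converts to true");
```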

Here is the truth table of bool-to-int + operator~(int) + int-to-bool:
false → 0b0000 → 0b1111 → true
true  → 0b0001 → 0b1110 → true

Note that we could solve the issue by using sext for bool-to-int conversions.
Should we? Is this a known problem?

The C and C++ language standards fully specify this behavior. There’s no flexibility here to change it.

– Steve

But: you can use ! instead of ~ to avoid this issue.
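For completeness, the `!` version (same sketch style as above) never leaves {false, true}, so the round-trip through int is harmless:

```cpp
// Logical negation produces exactly 0 or 1 before any promotion happens,
// so converting the result back to bool preserves the intended value.
constexpr bool bar(bool a) {
    return !a;
}

static_assert(bar(true)  == false, "logical not behaves as expected");
static_assert(bar(false) == true,  "logical not behaves as expected");
```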

– Steve

Yes, you are right. I just double-checked; the standard doesn't allow any other choice.

The issue is that we maintain an arbitrary-precision library. We provide a method that returns a range of bits and a method that returns a single bit. When used to pick a single bit, the range method behaved much better than the single-bit method. The reason is that the single-bit method returns a bool, while the range method stays in the realm of the library (i.e., it returns an arbitrary-precision integer range).
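A hypothetical sketch of the two accessors may make this concrete. All names here (Bits1, BigInt, bit, bits) are invented for illustration and are not the library's actual API:

```cpp
#include <cstdint>

struct Bits1 {                  // stand-in for a 1-bit arbitrary-precision value
    std::uint8_t v;             // only bit 0 is meaningful
    Bits1 operator~() const {   // complement stays inside the 1-bit domain
        return Bits1{static_cast<std::uint8_t>(~v & 1u)};
    }
    explicit operator bool() const { return (v & 1u) != 0; }
};

struct BigInt {                 // stand-in for the arbitrary-precision integer
    std::uint64_t word;
    // Single-bit getter: returns a native bool, so ~ promotes it to int.
    bool bit(unsigned i) const { return (word >> i) & 1u; }
    // Range getter (here a 1-bit range): stays in the library's own type,
    // so ~ uses Bits1::operator~ and never touches int.
    Bits1 bits(unsigned lo) const {
        return Bits1{static_cast<std::uint8_t>((word >> lo) & 1u)};
    }
};
```

With `x.word == 1`, `~x.bits(0)` correctly yields a false value, while `~x.bit(0)` promotes to int, becomes -2, and converts back to true.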

It’s not the first time we’ve seen poor behavior from C++ native types; we should have expected it… (In general, we should move away from integral types smaller than int to avoid those pesky integer conversions/promotions.)
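The same trap exists for every integral type narrower than int, not just bool. A generic illustration (not library code):

```cpp
#include <cstdint>

// 0x0F promotes to the int 0x0000000F before operator~ runs, so the
// "8-bit complement" you might expect (0xF0) never appears directly.
constexpr std::uint8_t b = 0x0F;
static_assert(~b == -16, "~int(0x0F) == 0xFFFFFFF0 == -16");
static_assert(static_cast<std::uint8_t>(~b) == 0xF0,
              "casting back recovers the 8-bit complement");
```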

FWIW, can you file a bug on this? I think at the very least Clang should have a warning for it.

Clang trunk doesn’t seem to flag it, but GCC 9.2 does.