Hi Juneyoung,
First of all, great job on your talk!
This is a question I guess you'd be the best person to answer, but the rest of the LLVM community might want to weigh in.
I was thinking about a UB-related example that has been discussed by multiple people
(including you), basically all of them authors of this paper (https://www.cs.utah.edu/~regehr/papers/undef-pldi17.pdf):
– Before opt:

    for (int i = 0; i < n; ++i) {
      a[i] = x + 1;
    }

– After opt (LICM):

    int tmp = x + 1;
    for (int i = 0; i < n; ++i) {
      a[i] = tmp;
    }
    // Assume tmp is never used again.
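In LLVM IR terms, I believe the hoisted computation would look roughly like this (just a sketch; the value names are made up and the loop is elided):

    %tmp = add nsw i32 %x, 1   ; poison if %x == INT_MAX, but not immediate UB
    ; ... loop that stores %tmp, executed zero times when %n == 0 ...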
The reasoning here is: let's make signed wrapping deferred UB (i.e., poison), which
only occurs if the value is later used in one of X ways (e.g., as a denominator). Under that
model, if n == 0 and x == INT_MAX, UB never occurs because the value is never used.
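For example, as far as I understand the semantics, the deferral looks like this (a sketch):

    %p = add nsw i32 %x, 1   ; %p is poison when %x == INT_MAX; no UB yet
    %q = sdiv i32 1, %p      ; UB: the poison value is used as a denominator

If %p had never been used (e.g., because the loop runs zero times), no UB would occur.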
But, by doing that, the first point is this: once we translate this into machine code,
the signed wrapping will actually happen, regardless of whether the value is used.
Now, imagine that on some platform P, signed wrapping explodes the computer.
The computer will explode (should explode? more on that later) even if n == 0,
something that would not happen with the original code.
So, to justify this transformation as correct, poison has implicitly
added definedness to signed wrapping: specifically, that the
computer won't explode if SW happens. AFAIU, that is ok as far as C++ semantics
are concerned: since signed wrapping was UB, giving it any more defined behavior
(producing poison rather than trapping) is a legal refinement.
But that definedness has now created a burden for whoever is writing a back-end
from LLVM IR to P (the SW-exploding platform).
That is, if they see an add nsw, they can't lower it to a plain native signed add,
because if they do and x == INT_MAX, the computer will explode, and that violates
the semantics of LLVM IR (since we mandated that SW doesn't explode the machine).
Instead, they have to lower it as something like:

    if (x == INT_MAX)
      tmp = 0;      // any value is fine; the result is poison anyway
    else
      tmp = x + 1;  // the native add can no longer wrap
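Or, at the IR level, the back-end must behave as if the add had been rewritten into something that can never wrap, e.g. (a sketch; substituting the operand is legal precisely because the original result would have been poison):

    %is.max = icmp eq i32 %x, 2147483647        ; would x + 1 wrap?
    %safe.x = select i1 %is.max, i32 0, i32 %x  ; substitute a harmless operand
    %tmp    = add i32 %safe.x, 1                ; native add can no longer wrap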
Is this whole line of thinking correct? UB, undef and poison are all very subtle, so I'm trying
to wrap my head around them.
Thanks,
Stefanos Baziotis