Today, InstSimplify calls computeKnownBits to try to fold to a constant ONLY if it can’t find any other simplifications for an instruction.
I think this means if we are able to find an earlier simplification we are LESS aggressive about creating a constant because we won’t even look at known bits. This seems surprising to me.
Should we be looking at known bits always, or should we remove this to save compile time? How often does this allow us to simplify to a constant?
~Craig
> Today, InstSimplify calls computeKnownBits to try to fold to a constant
> ONLY if it can't find any other simplifications for an instruction.
> I think this means if we are able to find an earlier simplification we are
> LESS aggressive about creating a constant because we won't even look at
> known bits.
Not necessarily.
It may create more or fewer constants right now, not just strictly fewer.
InstSimplify is not complete in its transformations, so in some cases the
symbolic form is going to result in more simplified operations than the
non-symbolic form.
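To make the point concrete, here is a toy, hypothetical model (not LLVM's actual KnownBits or InstSimplify code) of why a symbolic rule can beat a bit-level one: known bits are computed per operand and do not track that two operands are the same SSA value, so `xor %x, %x` is opaque to known bits even though it is trivially zero symbolically.

```python
def known_xor(a_zeros, a_ones, b_zeros, b_ones):
    """Known bits of (a ^ b): a bit is known zero when both inputs are
    known to agree on it, known one when they are known to differ."""
    zeros = (a_zeros & b_zeros) | (a_ones & b_ones)
    ones = (a_zeros & b_ones) | (a_ones & b_zeros)
    return zeros, ones

# %x is fully unknown: no bits known zero, none known one.
x_zeros, x_ones = 0, 0

# Bit-level view of `xor %x, %x`: still nothing known, so no fold.
print(known_xor(x_zeros, x_ones, x_zeros, x_ones))   # -> (0, 0)

# Symbolic view: the identity "x ^ x == 0" fires without knowing any bit.
def simplify_xor(lhs, rhs):
    return 0 if lhs is rhs else None

x = object()   # stand-in for the SSA value %x
print(simplify_xor(x, x))                            # -> 0
```

The same asymmetry runs the other way too, which is why neither approach subsumes the other while InstSimplify remains incomplete.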
Now, how often does this happen?
Again, hard to say; I would guess infrequently.
But it's definitely the case that because it is not complete, it is not a
100% improvement all the time to try to make as many constants as possible.
If it *were* complete, it would be.
> This seems surprising to me.
> Should we be looking at known bits always, or should we remove this to
> save compile time? How often does this allow us to simplify to a constant?
It may be cheaper, compile-time-wise, to compute known bits non-lazily and
consult it everywhere than to try to invoke it lazily everywhere.
As for how often it allows us to simplify to a constant, dunno, we should
just track it.
Add a statistic, see how it does.
I wasn't trying to say anything about the number of constants by "less", just about the effort we put in to try to make constants. For example, if you give InstSimplify "add %x, 0" it will return %x and not do anything with computing known bits, and not create a constant. If you give it "add %x, 1", and assuming we can't do some distributive simplification or reassociation etc., it will fall back to computeKnownBits and might create a constant.
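For a concrete picture of that fallback, here is a minimal, hypothetical model of known-bits propagation (invented names, not LLVM's actual KnownBits API): each value tracks which bits are known zero and which are known one, and an instruction can fold to a constant only once every bit is known.

```python
class KnownBits:
    """Track which bits of a value are known zero / known one."""
    def __init__(self, zeros=0, ones=0, width=8):
        self.zeros, self.ones, self.width = zeros, ones, width

    def constant(self):
        """Return the constant value if every bit is known, else None."""
        mask = (1 << self.width) - 1
        if (self.zeros | self.ones) == mask:
            return self.ones
        return None

def from_const(c, width=8):
    # A constant has every bit known: ones are set, the rest known zero.
    mask = (1 << width) - 1
    return KnownBits(~c & mask, c & mask, width)

def known_or(a, b):
    # A bit of (a | b) is one if one in either input, zero if zero in both.
    return KnownBits(a.zeros & b.zeros, a.ones | b.ones, a.width)

def known_and(a, b):
    # A bit of (a & b) is zero if zero in either input, one if one in both.
    return KnownBits(a.zeros | b.zeros, a.ones & b.ones, a.width)

x = KnownBits(width=8)                   # %x: nothing known
t = known_or(x, from_const(1))           # %t = or %x, 1   -> low bit known one
r = known_and(t, from_const(1))          # %r = and %t, 1  -> every bit known
print(t.constant())                      # -> None (only one bit is known)
print(r.constant())                      # -> 1
```

In this sketch `or %x, 1` pins only the low bit, but masking with `and %t, 1` leaves every bit known, which is the kind of case where a computeKnownBits fallback can produce a constant even when no purely symbolic rule fired first.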
> I wasn't trying to say anything about the number of constants by
> "less".
Right, but the point still stands that it's not really an always-better
game.
We could make it one, but it's definitely not one right now.
> Just about the effort we put in to try to make constants. For example, if
> you give InstSimplify "add %x, 0" it will return %x and not do anything
> with computing known bits, and not create a constant. If you give it "add
> %x, 1", and assuming we can't do some distributive simplification or
> reassociation etc., it will fall back to computeKnownBits and might create
> a constant.
True.
But it also has depth limits/etc. in those things, and we end up better off
if everything looks the same.
So if it causes more things to look "not the same" by doing this, we lose.
I still think it's worth exploring, of course; it's just not necessarily an
"always-win" right now.