Undefined behaviour. Can anyone explain this code to me?

#include <stdio.h>

int main(void)
{
    int a = 012; /* leading 0 makes this an octal literal: 012 is 10 in decimal */
    int b = 05;  /* octal 5, which is also 5 in decimal */
    printf("%d Welcome %d", a, b); /* prints "10 Welcome 5" */
    return 0;
}

You are reading and modifying the same variable between two sequence points. C and C++ do not define whether (a < a++) behaves like "a < a, then a++", like "a++, then a < a", or like some other variation on that theme. (As it happens, rewriting it as (a < b++ || b < a++) is actually defined, because || introduces a sequence point between its operands; but any expression that both reads and modifies the same variable with no sequence point in between, such as a < a++, remains undefined.)

It is only OK if you do them separately, with a sequence point in between (if "sequence point" is a mystery to you, Google it, as my explanation will not be as good as what you will find). The only thing that IS defined is that a is one greater AFTER the sequence point.

This is just one example of such undefined behaviour. Both C and C++ are languages with a lot of "undefined behaviour", where it is up to the implementor of the compiler and the hardware exactly what happens (including completely unexpected behaviour, such as the compiler starting a game, turning off the power, rebooting, or the generated code crashing in places you would not expect a crash). "Undefined behaviour" includes "it works as you expect", but it also covers everything else the computer is capable of possibly doing. [Of course, compiler writers are typically kind (or at least not evil) and make their best effort to keep undefined behaviour as "nice" and as close to what you would expect as possible, as long as that does not harm the performance of defined behaviour.]