but instead of returning 1 it returns 16777216. I have tested the program with
different values, and it seems to be a matter of the internal binary
representation of integer values.
Here is the trace of calls to LLVM that my compiler makes to generate the
code above. bloques.back().last is of type Value* and bloques.back().bl is a
BasicBlock* (the current block).
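For context, the block-tracking container presumably looks roughly like this (my reconstruction from the names used below, not the actual definition; header paths vary between older and newer LLVM releases):

#include <vector>
#include "llvm/IR/BasicBlock.h"   // "llvm/BasicBlock.h" in pre-3.3 releases
#include "llvm/IR/Value.h"        // "llvm/Value.h" in pre-3.3 releases

struct BlockInfo {
  llvm::BasicBlock *bl;   // block that instructions are currently appended to
  llvm::Value *last;      // most recently generated value in that block
};

std::vector<BlockInfo> bloques;   // bloques.back() is the current (innermost) block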
// $1 is an int coming from the parser; build the i32 constant for it
bloques.back().last =
    dyn_cast<Value>(ConstantInt::get(Type::getInt32Ty(getGlobalContext()),
                                     $1, true));

// Stack slot for the variable named by 'izq'
AllocaInst *alloc =
    new AllocaInst(Type::getInt32Ty(getGlobalContext()), izq.c_str(),
                   bloques.back().bl);

// Store the constant into the slot
Value *derecha = bloques.back().last;
StoreInst *s = new StoreInst(derecha, alloc, false, bloques.back().bl);
bloques.back().last = alloc;

// Load the value back
LoadInst *v1 =
    new LoadInst(bloques.back().last, "r", false, bloques.back().bl);
bloques.back().last = v1;

// Branch to a new "Return" block and return the loaded value there
BasicBlock *blockReturn =
    BasicBlock::Create(getGlobalContext(), "Return", Main);
Value *last = bloques.back().last;
BranchInst::Create(blockReturn, bloques.back().bl);
ReturnInst::Create(getGlobalContext(), last, blockReturn);
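For reference, the IR that this sequence of calls emits should look roughly like the following (a sketch reconstructed from the trace; the value names and the old typed-pointer load syntax are illustrative):

  %izq = alloca i32
  store i32 1, i32* %izq
  %r = load i32* %izq
  br label %Return

Return:
  ret i32 %r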
The source code this corresponds to is simply an assignment of the constant 1
to a variable, followed by a return of that variable.
When the ExecutionEngine finishes the JIT execution it says:
Result: 16777216
There is nothing wrong with the IR code you initially posted. If you are
getting that result, it must be caused by some other factor. My guess is
that you are messing up the stack by calling the function the wrong way,
or that you are reading the result back from the JIT engine incorrectly.
Please show a complete, minimal test case that demonstrates your problem.
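For comparison, the usual way to run a nullary i32 function with the old (pre-MCJIT) JIT and read the result back is through ExecutionEngine::runFunction and a GenericValue. This is only a sketch: 'module' and 'Main' stand for the Module* and Function* built by your generator, and header paths match LLVM versions that still had getGlobalContext():

#include "llvm/ExecutionEngine/ExecutionEngine.h"
#include "llvm/ExecutionEngine/GenericValue.h"
#include "llvm/ExecutionEngine/JIT.h"        // links in the legacy JIT
#include "llvm/IR/Module.h"                  // "llvm/Module.h" in pre-3.3 releases
#include "llvm/Support/TargetSelect.h"
#include "llvm/Support/raw_ostream.h"
#include <vector>

void runMain(llvm::Module *module, llvm::Function *Main) {
  llvm::InitializeNativeTarget();
  llvm::ExecutionEngine *EE = llvm::EngineBuilder(module).create();

  std::vector<llvm::GenericValue> noArgs;
  llvm::GenericValue gv = EE->runFunction(Main, noArgs);

  // For an i32 return type the result is in gv.IntVal (an APInt).
  // Reading the result through a mis-cast native function pointer, or as a
  // wider or narrower integer than i32, is a common way to get garbage back.
  llvm::outs() << "Result: " << gv.IntVal.getZExtValue() << "\n";
}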
I have found that the problem is the endianness of the values. For example, a 1
is stored as 00000001 00000000 00000000 00000000 instead of 00000000 00000000
00000000 00000001. So how can I change how the values are stored and/or loaded?
(I suppose the problem happens when I store the value to memory or load it back.)
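For what it is worth, 16777216 is exactly the 32-bit value 1 with its four bytes read in the opposite order (0x01000000 = 1 << 24). A minimal standalone check, independent of LLVM:

#include <cstdint>
#include <cstdio>
#include <cstring>

int main() {
  uint32_t one = 1;   // bytes in memory on a little-endian machine: 01 00 00 00
  unsigned char b[4];
  std::memcpy(b, &one, sizeof b);

  // Reassemble the same four bytes in the opposite order.
  uint32_t swapped = (uint32_t)b[0] << 24 | (uint32_t)b[1] << 16 |
                     (uint32_t)b[2] << 8  | (uint32_t)b[3];

  std::printf("%u\n", swapped);   // prints 16777216 on little-endian hardware
  return 0;
}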