I've got the following questions about the DBG_VALUE instruction:
- When I dump this instruction I see a line of the form:
DBG_VALUE %EDI, 0, !"a"; line no:47
for two completely different cases:
- when the value of "a" is actually stored in the register %edi (this should be encoded as "DW_OP_reg5" in DWARF);
- when the value of "a" is stored in memory at the address held in %edi ("DW_OP_breg5, 0" in DWARF).
Currently LLVM handles this in favor of the first case, and produces wrong debug info for the second one. Can these cases
actually be distinguished, and if so, where should I take a look?
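For concreteness, here is a sketch of how I believe the two cases look at the IR level before instruction selection (the exact intrinsic signatures and metadata numbering vary across LLVM versions, so treat this as illustrative only):

```llvm
; Case 1: "a" is itself an SSA value -- a direct (register) location.
; After ISel this can legitimately become DBG_VALUE %EDI ... meaning
; DW_OP_reg5.
call void @llvm.dbg.value(metadata i32 %a, i64 0, metadata !10)

; Case 2: "a" lives in stack memory; the intrinsic describes the
; *address* of the variable. If that address ends up in %edi, the
; dump looks identical, but the correct DWARF is DW_OP_breg5, 0.
%a.addr = alloca i32
call void @llvm.dbg.declare(metadata i32* %a.addr, metadata !10)
```

If the direct/indirect distinction from these two intrinsics survives into the DBG_VALUE operands somewhere, that would be exactly the place I'm looking for.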
- This one is more general. I'm trying to make "clang -g" work well with AddressSanitizer instrumentation enabled
(the debug info currently generated for variables is quite inconsistent), and I need to work out a way to investigate this.
ASan works on LLVM IR, so at that point there are no machine instructions, just llvm.dbg.declare / llvm.dbg.value intrinsics,
which ASan ignores. How can IR transforms (inserting function calls, basic blocks, etc.) hurt the lowering of llvm.dbg
intrinsics into a set of DBG_VALUE instructions? And what is the best way to actually see what's going on when machine instructions are generated from IR?
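To make my concern concrete, here is a hypothetical sketch of the kind of breakage I have in mind (the frame layout and offset are invented for illustration; this is not ASan's actual output):

```llvm
; Before instrumentation: dbg.declare points at the variable's alloca,
; so the backend can emit a correct memory location for "a".
%a.addr = alloca i32
call void @llvm.dbg.declare(metadata i32* %a.addr, metadata !10)

; After an ASan-style rewrite, locals are packed into one large frame
; object with redzones around them:
%frame = alloca [64 x i8]
; "a" now lives at some offset inside %frame, but if the dbg.declare
; still references the dead alloca (or was dropped), the DBG_VALUE
; lowering has nothing correct to work from.
```

For watching the lowering itself, I assume something like `llc -print-after-all` (or `clang -mllvm -print-after-all`) to dump the function after each codegen pass is the right starting point, but I'd appreciate pointers to a better workflow.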