I am writing a frontend using lldb+python and am completely blocked on this.
I suspect any Linux user/developer on trunk would be blocked as well.
Could somebody please look into fixing this issue?
Is there a way to rework the lldb interpreter to read variables after a crash? Currently, lldb injects a variable to store the result of expression evaluation. One alternative is to use ptrace to handle a pure read on Linux...
> Is there a way to rework the lldb interpreter to read variables after a crash?
Is this a problem because the expression parser can't allocate memory after you have crashed?
LLDB does like to place a copy of the result in the program memory, but it doesn't have to. We could change that.
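To illustrate the point about not needing a copy of the result in program memory: the SB API can already do a pure read of a local variable without going through expression evaluation at all. This is a minimal sketch meant to run inside an lldb Python session with a stopped target (`read_local` is a hypothetical helper name, not part of lldb):

```python
# Run inside lldb's script interpreter, with the process stopped.
import lldb

def read_local(frame, name):
    # SBFrame.FindVariable resolves the variable from its debug-info
    # location and reads it directly; no JIT'ed code runs and no result
    # variable is materialized in the inferior.
    val = frame.FindVariable(name)
    if not val.IsValid():
        return None
    return val.GetValue()  # string form of the value
```

Usage, assuming an interactive session where `lldb.frame` is the selected frame: `(lldb) script print(read_local(lldb.frame, "argc"))`. This path should keep working even when the process has crashed, since it never allocates in or executes code inside the target.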
> Currently, lldb injects a variable to store the result of expression evaluation.
Do you mean injects memory into the inferior?
> One alternative is to use ptrace to handle a pure read on Linux...
The IR interpreter will not run JIT'ed code for anything it can handle itself (like memory and register reads). It might be that we are missing a common IR opcode that Linux uses, which forces JIT'ed code to run more often than it should?
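The "pure read" idea from the quoted line can be demonstrated without lldb at all: on Linux a debugger-style read can go through `ptrace(PTRACE_PEEKDATA, ...)` or, equivalently for reads, `/proc/<pid>/mem`, executing nothing in the target. A minimal sketch, reading our own process for simplicity (attaching to another process would additionally need ptrace permissions):

```python
# Hedged sketch: a "pure read" of process memory on Linux via /proc/<pid>/mem.
# No code is executed and no memory is allocated in the target process.
import ctypes

buf = ctypes.create_string_buffer(b"hello from the inferior")
addr = ctypes.addressof(buf)

with open("/proc/self/mem", "rb") as mem:
    mem.seek(addr)                 # seek to the variable's address
    data = mem.read(len(buf.value))  # read the bytes back out

# data == b'hello from the inferior'
print(data)
```

This is the same capability a crashed-process read needs: address plus size in, bytes out, with no dependency on the target being able to run code.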
To verify: is your question regarding the fact that we can't allocate memory when crashed? Can't run code when crashed? What is the real issue?
IIRC, gdb can call functions and allocate memory (which it needs to pass strings to functions among other things) and the like on a crashed program on Linux. Been a while since I looked at how gdb works on Linux, but if gdb can do that, lldb should be able to as well.
> To verify: is your question regarding the fact that we can't allocate memory when crashed? Can't run code when crashed? What is the real issue?
The behavior may have changed in the last couple of weeks. With trunk, during PrepareToExecuteJITExpression, EntityVariable::Materialize calls AddressOf on the ValueObject for the variable being examined. This fails, and I'm currently trying to understand the failure by comparing the operation with the same expression evaluation just prior to the crash. Thanks for the helpful perspective on this thread. I'll try to have a closer look early next week.
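The failing step can be probed from the script side too. This is a sketch for an lldb Python session with a stopped target; `check_address_of` is a hypothetical helper, and SBValue.AddressOf here is only the scripting analogue of the call that EntityVariable::Materialize makes internally, not the same code path:

```python
# Run inside lldb's script interpreter, with the process stopped.
import lldb

def check_address_of(frame, name):
    # Mirror the failing step: take the address of the variable's value.
    val = frame.FindVariable(name)
    if not val.IsValid():
        return False
    addr_val = val.AddressOf()
    # A valid SBValue with a nonzero address suggests materialization
    # should have an address to work with; an invalid one reproduces
    # the failure seen in EntityVariable::Materialize.
    return addr_val.IsValid() and addr_val.GetValueAsUnsigned(0) != 0
```

Comparing the result of this check before and after the crash might narrow down whether the ValueObject loses its load address when the process stops due to the crash.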