Well, I am much happier now that I understand about dsymutil, and can actually step through my program in gdb. However, there are still some issues that are puzzling me.
- First off, the debugger appears to stop at odd points. The IR for my main function looks correct to me:
define i32 @"main(tart.core.Array[tart.core.String])->int"(%"tart.core.Array[tart.core.String]"* %args) {
entry:
call void @llvm.dbg.func.start({ }* bitcast (%llvm.dbg.subprogram.type* @llvm.dbg.subprogram to { }*))
call void @llvm.dbg.stoppoint(i32 6, i32 22, { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*))
%testModuleReflection = call { } @testModuleReflection() ; <{ }> [#uses=0]
call void @llvm.dbg.stoppoint(i32 7, i32 19, { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*))
%testModuleMethods = call { } @testModuleMethods() ; <{ }> [#uses=0]
call void @llvm.dbg.stoppoint(i32 8, i32 16, { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*))
%testFindMethod = call { } @testFindMethod() ; <{ }> [#uses=0]
call void @llvm.dbg.stoppoint(i32 9, i32 10, { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*))
call void @llvm.dbg.region.end({ }* bitcast (%llvm.dbg.subprogram.type* @llvm.dbg.subprogram to { }*))
ret i32 0
}
However, when I single-step into the function, it stops at the second stop point. In fact, it appears to do this fairly consistently with all functions - the very first statement is always skipped.
Here’s the original source (with line numbers) for reference:
1 import tart.reflect.Module;
2 import tart.reflect.Method;
3
4 @EntryPoint
5 def main(args:String) -> int {
6 testModuleReflection();
7 testModuleMethods();
8 testFindMethod();
9 return 0;
10 }
-
Another weird thing is that I can’t seem to declare function variables that are not lvalues. The DIFactory::InsertDeclare method seems to require that the Storage parameter be the result of an alloca. However, what about function arguments that are passed by value, such as ints?
-
The same issue holds for immutable local variables. My language supports the concept of an “assign-once” variable (like ‘final’ in Java), for which I use SSA values directly rather than storage created via alloca(). Does this mean that there is no way to debug such variables?
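The only workaround I can think of - and this is a hand-written sketch, not actual output from my compiler, assuming I'm reading the llvm.dbg.declare intrinsic right - is to spill each such value into a shadow alloca that exists purely for the debugger's benefit:

entry:
; %x is the SSA value for the assign-once variable (or by-value argument)
%x.addr = alloca i32 ; shadow slot - only the debugger ever reads it
store i32 %x, i32* %x.addr
%x.dbg = bitcast i32* %x.addr to { }*
call void @llvm.dbg.declare({ }* %x.dbg, { }* bitcast (%llvm.dbg.variable.type* @llvm.dbg.variable to { }*))

But that defeats the point of using SSA values directly, and I'd expect the dead store to be deleted under optimization anyway, taking the variable with it.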
-
There seems to be something weird going on with DW_TAG_inheritance: when I print out the type in the debugger, the base classes show up as empty <> symbols:
$2 = {
<> = {
<> = {
__tib = 0x0
},
members of tart.reflect.Member:
_name = 0x0,
_fullName = 0x0,
_kind = 0,… etc …
I’d like to see an example of DW_TAG_inheritance in the doc.
- I can’t seem to get the structure offsets to work. Here’s what my generated IR for a struct member looks like (reformatted somewhat):
@llvm.dbg.derivedtype104 = internal constant %llvm.dbg.derivedtype.type {
i32 458765,
{ }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }),
i8* getelementptr inbounds ([10 x i8]* @.str103, i32 0, i32 0),
{ }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }),
i32 27,
i64 mul (i64 ptrtoint (%1** getelementptr (%1** null, i32 1) to i64), i64 8),
i32 mul (i32 ptrtoint (%1** getelementptr (%38 null, i32 0, i32 1) to i32), i32 8),
i64 mul (i64 ptrtoint (%1** getelementptr (%tart.reflect.Member* null, i64 0, i32 2) to i64), i64 8),
i32 0,
{ }* bitcast (%llvm.dbg.derivedtype.type* @llvm.dbg.derivedtype102 to { }) },
section "llvm.metadata" ; <%llvm.dbg.derivedtype.type> [#uses=1]
You can see the offset calculation on line 9. However, here’s what dwarfdump reports for the entry:
0x00001dcd: member [10]
name( "_fullName" )
type( {0x00001bbb} ( tart.core.String* ) )
decl file( "/Users/talin/Projects/tart/trunk/stdlib/tart/reflect/Module.tart" )
decl line( 27 )
data member location( +0 ) ← huh?
-
It might be good to mention in the source-level debugging docs that line and column numbers are 1-based, not 0-based. I know that might seem obvious but it threw me off for a bit.
-
Another thing that confused me for a while is that the debugging APIs all want size, offset, and alignment values in bits, not bytes - but ConstantExpr::getSizeOf() et al return a size in bytes (which makes sense, since it behaves like C sizeof). More confusingly, the comments for getSizeOf() don’t say what units its result is in - I simply assumed that getSizeOf() and DIFactory were compatible. Probably the simplest thing would be to add a “getSizeOfInBits” method (and the same for align and offset) which could be used directly with DIFactory.
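In other words, what I ended up having to write everywhere is something like this (a sketch, with %foo standing in for an arbitrary struct type):

; ConstantExpr::getSizeOf(%foo) effectively produces this - a size in BYTES:
;   ptrtoint (%foo* getelementptr (%foo* null, i32 1) to i64)
; DIFactory wants BITS, so every size has to be wrapped in a mul by 8:
i64 mul (i64 ptrtoint (%foo* getelementptr (%foo* null, i32 1) to i64), i64 8)

A getSizeOfInBits() would fold that multiply in and make the intent obvious at the call site.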
Note: All of the above results were produced with the current 2.6 branch head.
– Talin