The LLVM optimizer gets stuck in an infinite loop when optimizing my module (I invoked the optimizer following the instructions here). I'm using LLVM 15.
So far I have investigated the following:
- `validateLLVMModule()` reports no errors on my module.
- If I serialize the module, then load it back and run the optimizer on the reloaded module, the optimizer runs without issue. So it seems that some in-memory state has been silently corrupted.
- If I dump the module before and after serialization and diff the dumps, I see two differences. Before serialization, the module contains these two intrinsic declarations:
```
; Function Attrs: argmemonly mustprogress nocallback nofree nosync nounwind willreturn
declare void @llvm.lifetime.start.p0(i64 immarg %0, ptr nocapture %1) #8

; Function Attrs: argmemonly mustprogress nocallback nofree nosync nounwind willreturn
declare void @llvm.lifetime.end.p0(i64 immarg %0, ptr nocapture %1) #8
```
but after serialization, the `mustprogress` attribute is missing from both declarations.
- The infinite loop happens in `static bool lowerExpectIntrinsic(Function &F)`. I don't have an LLVM debug build, so I haven't yet pinpointed where exactly it loops. Note that the module does not use `llvm.expect` at all.
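For concreteness, the dump-and-diff step looked roughly like this. This is a minimal sketch: the file names `before.ll`/`after.ll` and the `.ll` contents below are stand-ins for my real module dumps, trimmed to just the declarations that differ.

```shell
# Stand-in for the module dump taken before serialization.
cat > before.ll <<'EOF'
; Function Attrs: argmemonly mustprogress nocallback nofree nosync nounwind willreturn
declare void @llvm.lifetime.start.p0(i64 immarg, ptr nocapture) #8
EOF

# Stand-in for the dump taken after the serialize/reload round trip;
# note that mustprogress is gone from the attribute list.
cat > after.ll <<'EOF'
; Function Attrs: argmemonly nocallback nofree nosync nounwind willreturn
declare void @llvm.lifetime.start.p0(i64 immarg, ptr nocapture) #8
EOF

# diff exits non-zero when the files differ, so mask that for scripting.
diff before.ll after.ll || true
```

On my actual dumps, the only lines diff reports are the `; Function Attrs:` comments for the two lifetime intrinsics.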
What might have caused this problem? Can someone give me some tips on how to debug this issue?