define intrinsic function with pointer-typed parameter


If I define an intrinsic function with a pointer-typed parameter, for example,
def llvm_foo_ptr_ty : LLVMPointerType<llvm_i16_ty>;

def int_foo_get : Intrinsic<[llvm_foo_ptr_ty], [llvm_foo_ptr_ty, llvm_i32_ty], [IntrReadArgMem]>;

how do I lower it for the backend? I'm not sure what kind of register (i16, i32, or i64) is needed in this case. If the parameter is LLVMPointerType<llvm_i32_ty> instead of LLVMPointerType<llvm_i16_ty>, will this make a difference for the backend? Suppose my backend has three register types: i16, i32, and i64.

When I check the debug output, I can see that LLVM tries to use i32 to lower the parameter when building the SDAG. But why does it choose i32 instead of i64?

Any input is appreciated.



It sounds like you're hitting the default case in SelectionDAGBuilder::visitIntrinsicCall and exercising the visitTargetIntrinsic code path. I haven't dug into the default handling there, but that's probably where you need to look. Depending on your intrinsic, you might also consider adding custom handling in visitIntrinsicCall itself.
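
For concreteness, the custom-handling route usually means adding a case for your intrinsic to the big switch in SelectionDAGBuilder::visitIntrinsicCall. The sketch below is untested and hedged: Intrinsic::foo_get matches the int_foo_get def from your .td file, and FOOISD::GET is a hypothetical target-specific node, so adapt the wiring to your intrinsic's actual semantics.

```cpp
// In llvm/lib/CodeGen/SelectionDAG/SelectionDAGBuilder.cpp, inside the
// switch in SelectionDAGBuilder::visitIntrinsicCall. This is a sketch,
// not tested code; FOOISD::GET is a hypothetical target node.
case Intrinsic::foo_get: {
  SDValue Ptr = getValue(I.getArgOperand(0));   // the llvm_foo_ptr_ty argument
  SDValue Idx = getValue(I.getArgOperand(1));   // the llvm_i32_ty argument
  // The result type comes from how the target lowers the IR type,
  // which for a pointer is driven by the DataLayout pointer size.
  EVT ResVT = TLI.getValueType(DAG.getDataLayout(), I.getType());
  setValue(&I, DAG.getNode(FOOISD::GET, getCurSDLoc(), ResVT, Ptr, Idx));
  return;
}
```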

This sounds like your target isn't telling the type legalization logic that i16 is a valid register type, or that the cost of an i32 register is much lower than that of an i16. However, I'm no expert on the backend side, so take my comment with a grain of salt.
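
If i16 really is supposed to be a legal type for your target, that is normally declared in the target's TargetLowering constructor via addRegisterClass; types without a register class get promoted to the next legal type. A minimal sketch, where the Foo target name and the GR16/GR32/GR64 register class names are hypothetical placeholders for your own target's classes:

```cpp
// In FooISelLowering.cpp (target and register class names are hypothetical).
FooTargetLowering::FooTargetLowering(const TargetMachine &TM,
                                     const FooSubtarget &STI)
    : TargetLowering(TM) {
  // Tell the type legalizer which MVTs have register classes. Without an
  // i16 entry here, i16 values are promoted to the next legal type (i32),
  // which would explain the behavior you're seeing in the SDAG debug output.
  addRegisterClass(MVT::i16, &Foo::GR16RegClass);
  addRegisterClass(MVT::i32, &Foo::GR32RegClass);
  addRegisterClass(MVT::i64, &Foo::GR64RegClass);

  // Must run after all register classes have been added.
  computeRegisterProperties(STI.getRegisterInfo());
}
```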