Recommended workaround for distinguishing signed vs. unsigned integers

Since LLVM doesn't distinguish signed and unsigned integers anymore, what is the recommended way to represent a language that does distinguish them? Is the recommended approach to introduce wrapper types, e.g.:
%SignedI32 = type { i32 }
%UnsignedI32 = type { i32 }

?

> Since LLVM doesn't distinguish signed and unsigned integers anymore, what
> is the recommended way to represent a language that does distinguish them?

Distinguishes them how? The IR doesn't have to distinguish everything your
source language does.

If, for example, your language supports overloading based on type (and
unsigned and signed types are distinct types), much as Clang does for
C++, you would mangle those source-level types into the function name to
create distinct functions, even when the underlying LLVM argument types
are identical.
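For instance, here is a minimal sketch using the Itanium-style mangled names Clang produces for foo(int) and foo(unsigned int); the function bodies are invented purely for illustration. Both overloads take a plain i32 at the IR level, yet they remain distinct symbols:

; foo(int) and foo(unsigned int) become distinct functions even
; though both take the same i32 at the IR level.
define i32 @_Z3fooi(i32 %x) {
  ; signed overload: arithmetic shift preserves the sign bit
  %r = ashr i32 %x, 1
  ret i32 %r
}

define i32 @_Z3fooj(i32 %x) {
  ; unsigned overload: logical shift fills with zeros
  %r = lshr i32 %x, 1
  ret i32 %r
}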

Where else do you need to distinguish them?

That might be useful if your CPU distinguished them, which no current CPU does: there are signed and unsigned operations, not signed and unsigned values.

The distinction is irrelevant below the level of your programming language. You do your type checking in your compiler and emit LLVM IR with plain i32 values, just as you'd emit machine code operating on 32-bit values that are neither signed nor unsigned; the signedness lives in the operations you choose, not in the values.
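Concretely (a minimal sketch; the function names here are invented for illustration), the operands are the same i32 everywhere, and only the instruction your frontend selects encodes signedness:

; Both pairs take identical types; only the instructions differ.
define i32 @div_signed(i32 %a, i32 %b) {
  %q = sdiv i32 %a, %b        ; signed division
  ret i32 %q
}

define i32 @div_unsigned(i32 %a, i32 %b) {
  %q = udiv i32 %a, %b        ; unsigned division
  ret i32 %q
}

define i64 @widen_signed(i32 %x) {
  %w = sext i32 %x to i64     ; sign-extend for a signed source type
  ret i64 %w
}

define i64 @widen_unsigned(i32 %x) {
  %w = zext i32 %x to i64     ; zero-extend for an unsigned source type
  ret i64 %w
}

So a frontend for a language with distinct signed and unsigned types simply consults its own type information when lowering each operation, rather than encoding signedness in the IR types.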