Emulated TLS on x86_64 (Linux) with the JIT engine

Hi,

I'm trying to JIT-compile a part of an application at runtime using LLVM 3.9.1. The piece of code I want to JIT-compile accesses TLS variables and has been compiled offline into a bitcode file via clang 3.9.1. At runtime, that bitcode file gets loaded into a module. The module then gets passed to the following function:

#include "llvm/ExecutionEngine/ExecutionEngine.h"
#include "llvm/ExecutionEngine/MCJIT.h"                // ensures the MCJIT engine is linked in
#include "llvm/ExecutionEngine/SectionMemoryManager.h"
#include "llvm/Target/TargetOptions.h"
#include <memory>
#include <string>

using namespace llvm;

void jit(llvm::Module *module) {
  TargetOptions opts;
  opts.EmulatedTLS = true;  // lower TLS accesses via the emulated-TLS scheme
  std::string err_str;
  ExecutionEngine *engine = EngineBuilder(std::unique_ptr<llvm::Module>(module))
    .setTargetOptions(opts)
    .setErrorStr(&err_str)
    .setEngineKind(EngineKind::JIT)
    .setMCJITMemoryManager(std::unique_ptr<SectionMemoryManager>(new SectionMemoryManager()))
    .setOptLevel(CodeGenOpt::Aggressive)
    .create();
  engine->finalizeObject();  // relocates the code and resolves external symbols
  ...
}
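
For reference, the loading side is roughly the following sketch (the bitcode file name is a placeholder and error handling is trimmed); parseIRFile reads the offline-compiled bitcode into a Module that jit() then takes ownership of:

#include "llvm/IR/LLVMContext.h"
#include "llvm/IR/Module.h"
#include "llvm/IRReader/IRReader.h"
#include "llvm/Support/SourceMgr.h"
#include "llvm/Support/TargetSelect.h"
#include "llvm/Support/raw_ostream.h"
#include <memory>

void loadAndJit() {
  // MCJIT needs the native target, asm printer, and asm parser registered.
  llvm::InitializeNativeTarget();
  llvm::InitializeNativeTargetAsmPrinter();
  llvm::InitializeNativeTargetAsmParser();

  llvm::LLVMContext context;
  llvm::SMDiagnostic diag;
  std::unique_ptr<llvm::Module> module =
      llvm::parseIRFile("kernel.bc", diag, context);  // "kernel.bc" is a placeholder name
  if (!module) {
    diag.print("jit-driver", llvm::errs());
    return;
  }
  jit(module.release());  // jit() takes ownership via EngineBuilder
}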

I already figured out that native TLS is not supported by MCJIT on x86_64 and aborts with a corresponding LLVM error, so I switched to emulated TLS. However, this results in a new error: "LLVM ERROR: Program used external function '__emutls_v.xyz_' which could not be resolved!", where xyz_ is one of the TLS variables. If I understand correctly, every access to such a TLS variable should be lowered to a call to __emutls_get_address() that takes the address of the corresponding __emutls_v control variable.
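
For what it's worth, my understanding is that the control object __emutls_v.<name> is only emitted into the JIT'd code if the TLS variable is defined in the module; for variables that are merely declared, __emutls_v.<name> stays external and MCJIT has to resolve it, by default through the process symbol table. The workaround I am considering is to pre-register those symbols before finalizeObject(), along these lines. The control-object layout is copied from libgcc's emutls.c, __emutls_get_address is assumed to be available from libgcc/compiler-rt builtins, and the variable name and type are placeholders, so please treat this strictly as an unverified sketch:

#include "llvm/Support/DynamicLibrary.h"
#include <cstdint>

extern "C" void *__emutls_get_address(void *control);  // from libgcc/compiler-rt (assumed available)

// Per-variable control object as laid out in libgcc's emutls.c:
// { size, alignment, slot index (0 = not yet allocated), pointer to initial value or null }.
struct EmuTlsControl {
  std::uintptr_t size;
  std::uintptr_t align;
  std::uintptr_t index;
  void *templ;
};

// Hand-built control object standing in for the external __emutls_v.xyz_ (placeholder).
static EmuTlsControl emutls_v_xyz = {sizeof(int), alignof(int), 0, nullptr};

void registerEmuTlsSymbols() {
  // Make both symbols visible to the default SectionMemoryManager lookup,
  // which falls back to the process symbol table.
  llvm::sys::DynamicLibrary::AddSymbol(
      "__emutls_get_address", reinterpret_cast<void *>(&__emutls_get_address));
  llvm::sys::DynamicLibrary::AddSymbol("__emutls_v.xyz_", &emutls_v_xyz);
}

If there is a more supported way to get MCJIT to resolve these symbols, I would of course prefer that.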

Am I missing something? Is emulated TLS even supposed to work with the JIT engine on x86_64?

Thanks in advance,
Bastian

I am in a similar situation with regard to Windows, but TLS on Linux and OS X seems to work for me.

Do you have a specific example of, or a reason why, TLS does not work on Linux?