I'm trying to run inference with ONNX Runtime on my server's GPU, but I'm getting this error:
2024-08-10 23:53:29.404983674 [E:onnxruntime:Default, provider_bridge_ort.cc:1745 TryGetProviderInfo_CUDA] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1426 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcublasLt.so.11: cannot open shared object file: No such file or directory
After looking through the CUDA files on my server, I noticed that I have libcublasLt.so.12 instead of .11. I have the latest versions of onnxruntime and onnxruntime-gpu installed.
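In case it helps, here is a quick way to check which cuBLAS sonames the dynamic loader can actually resolve (a diagnostic sketch using only the Python standard library; the result depends on LD_LIBRARY_PATH and the ldconfig cache, and no onnxruntime import is needed):

```python
import ctypes.util

# List the cuBLAS sonames the dynamic loader can currently resolve.
# "cublasLt" and "cublas" are the standard CUDA library names; which
# paths get searched depends on LD_LIBRARY_PATH and the ldconfig cache.
for name in ("cublasLt", "cublas"):
    path = ctypes.util.find_library(name)
    print(f"lib{name}: {path or 'not found'}")
```

On a CUDA 12 machine this typically resolves to the `.so.12` files, which matches the mismatch in the error above.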
How do I fix this? I don't have admin permissions on the server, so I can't downgrade CUDA or anything like that.