3 questions
5 votes · 0 answers · 63 views
Make PyTorch reserve a certain amount of GPU memory upfront
My code has two workflows that run sequentially. The first workflow requires only about 8GB of GPU memory, but the second takes about 22GB. Is it possible to reserve 22GB upfront (even ...
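A minimal sketch of one way this is commonly approached (not taken from the question itself): allocate a large dummy tensor and then free it, so PyTorch's caching allocator keeps the block in its reserved pool for the later, larger workflow. The 22GB figure comes from the question; the `reserve_gpu_memory` helper name is an illustrative assumption.

```python
import torch

def reserve_gpu_memory(num_bytes, device="cuda"):
    # Allocate one large uint8 tensor, then free it. The caching
    # allocator keeps the freed block in its reserved pool, so later
    # allocations (e.g. the second workflow) reuse it instead of
    # competing with other processes for fresh device memory.
    buf = torch.empty(num_bytes, dtype=torch.uint8, device=device)
    del buf
    # Note: torch.cuda.empty_cache() would release the reservation
    # again, so avoid calling it between the two workflows.
    return torch.cuda.memory_reserved(device)

if __name__ == "__main__":
    reserved = reserve_gpu_memory(22 * 1024**3)  # ~22 GiB, per the question
    print(f"reserved: {reserved / 1024**3:.1f} GiB")
```

Whether this is acceptable depends on the setup: the memory is held for the whole run, so other processes on the same GPU cannot use it in the meantime.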
1 vote · 0 answers · 491 views
How to control GPU memory in libtorch/CUDA on Windows
My problem is not related to the code; it is related to the "GPU memory" figure listed in the Windows Task Manager.
Briefly, the problem: I have an RTX 4090 video card with 24GB of video memory....
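The excerpt is cut off, but for the general question of inspecting and capping how much GPU memory the caching allocator uses, here is a small Python-side sketch; the question itself uses libtorch/C++, so treat this as an illustration of the same allocator knobs rather than the asker's setup, and the `0.5` fraction is an arbitrary example value.

```python
import torch

# Cap this process's PyTorch allocations at a fraction of total device
# memory (example value; pick what fits alongside other Windows apps).
torch.cuda.set_per_process_memory_fraction(0.5, device=0)

# Compare what the driver reports (free, total) with what PyTorch has
# actually allocated/reserved; the Task Manager "GPU memory" figure is
# reported by Windows and may not match the per-process numbers below.
free_bytes, total_bytes = torch.cuda.mem_get_info(0)
print(f"driver:  {free_bytes / 1024**3:.1f} GiB free of "
      f"{total_bytes / 1024**3:.1f} GiB")
print(f"pytorch: {torch.cuda.memory_allocated(0) / 1024**3:.1f} GiB allocated, "
      f"{torch.cuda.memory_reserved(0) / 1024**3:.1f} GiB reserved")
```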