How to clear GPU memory on Ubuntu

To clear GPU memory on Ubuntu with an NVIDIA GPU (like your RTX 4090), the method depends on whether you can safely stop the processes using the GPU. Below are the recommended ways, from safest to most forceful.


1. Check what is using GPU memory

You have already run this:

nvidia-smi

GPU processes consuming memory:

425352  python
442205  python
445172  python
451372  python
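
If you only want the compute processes and how much memory each one is holding (rather than the full table), nvidia-smi can also be queried directly; on recent drivers something like this works:

nvidia-smi --query-compute-apps=pid,process_name,used_memory --format=csv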

2. Safely kill the processes using GPU memory

If these processes are safe to stop (your cell2loc_env3.9 Python jobs):

Kill a single GPU process

sudo kill -9 <PID>

Example:

sudo kill -9 451372
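
Before killing a PID, it is worth confirming what it actually is; ps shows the owner, runtime and command line (451372 is just the example PID from above):

ps -o pid,user,etime,cmd -p 451372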

Kill all Python GPU processes

Be careful: this kills every python process on the machine, not just the ones using the GPU. A more targeted alternative is shown after the command.

sudo pkill -9 python
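
If you would rather kill only the processes that are actually holding GPU memory, one option is to feed the PIDs reported by nvidia-smi straight into kill. This is a sketch and assumes every listed compute process is safe to stop:

nvidia-smi --query-compute-apps=pid --format=csv,noheader | xargs -r sudo kill -9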

After killing processes, GPU memory should drop:

nvidia-smi

3. Kill all of your own Python jobs

The -u $USER flag restricts killall to processes owned by your user, so other users' jobs are untouched:

killall -u $USER python
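
To preview which of your processes this would hit before running it:

pgrep -u $USER -a python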

4. Restart the GPU Persistence Daemon (safe)

This is harmless and will not touch running jobs, but note that it does not free memory held by processes that are still alive:

sudo systemctl restart nvidia-persistenced
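
To confirm the daemon came back up cleanly:

systemctl status nvidia-persistenced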

5. Reset the GPU (forceful, interrupts ALL GPU work!)

This resets the GPU without rebooting the OS.

sudo nvidia-smi --gpu-reset -i 0

If it says “cannot reset while processes exist”, kill them first.
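
If the reset is refused, you can list every process that still has the NVIDIA device files open, including the display server:

sudo fuser -v /dev/nvidia*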


⚠️ 6. Full GPU driver reset requires stopping Xorg

If your GPU is also used for your display (as shown in your nvidia-smi: Xorg & gnome-shell), GPU reset may fail.

To reset GPU without reboot:

Switch to a TTY by pressing:

Ctrl + Alt + F3

Stop the display manager:

For Ubuntu GNOME:

sudo systemctl stop gdm

Then reset GPU:

sudo nvidia-smi --gpu-reset -i 0

Then restart GNOME:

sudo systemctl start gdm

🧹 7. Last resort: Reboot

sudo reboot

This always clears GPU memory.


✔️ Recommended for your case

Your GPU memory shows 20+ GB used by cell2loc_env3.9 Python processes:

425352 
442205
445172
451372

Run:

sudo kill -9 451372 445172 442205 425352

Then verify:

nvidia-smi
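
To watch the memory figure drop as the processes exit:

watch -n 1 nvidia-smi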

If you want, tell me which processes should stay and I can give exact commands.
