CUDA out of memory. Tried to allocate 128.00 MiB. GPU 0 has a total capacty of 21.99 GiB of which 127.06 MiB is free.

Process 4089 has 492.00 MiB memory in use. Process 4086 has 492.00 MiB memory in use. Process 4110 has 492.00 MiB memory in use. Process 4096 has 492.00 MiB memory in use. Process 4111 has 492.00 MiB memory in use. Process 4108 has 492.00 MiB memory in use. Process 4122 has 3.12 GiB memory in use. Including non-PyTorch memory, this process has 2.83 GiB memory in use. Process 7394 has 3.06 GiB memory in use. Process 8312 has 2.87 GiB memory in use. Process 12745 has 988.00 MiB memory in use. Process 23086 has 2.09 GiB memory in use. Process 24680 has 2.40 GiB memory in use. Process 2598 has 988.00 MiB memory in use. Process 11560 has 616.00 MiB memory in use.

Of the allocated memory 2.36 GiB is allocated by PyTorch, and 158.33 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
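The last sentence of the message points at the caching allocator's configuration. Below is a minimal sketch of how one might act on that hint: set `PYTORCH_CUDA_ALLOC_CONF` before CUDA is initialized, then compare how much memory PyTorch has allocated versus reserved to judge whether fragmentation is actually the issue. The `max_split_size_mb:128` value is an illustrative choice, not taken from the report above, and the helper function name is made up for this example.

```python
import os

# Configure the CUDA caching allocator before torch initializes CUDA.
# max_split_size_mb limits how large a cached block the allocator will split,
# which can help when "reserved but unallocated" memory is large.
# The value 128 is only an example; tune it for the workload.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:128")

import torch


def report_gpu_memory(device: int = 0) -> None:
    """Print free/total device memory and PyTorch's own usage."""
    free, total = torch.cuda.mem_get_info(device)
    allocated = torch.cuda.memory_allocated(device)
    reserved = torch.cuda.memory_reserved(device)
    print(f"GPU {device}: {free / 2**20:.2f} MiB free of {total / 2**30:.2f} GiB")
    print(f"allocated by PyTorch: {allocated / 2**20:.2f} MiB")
    print(f"reserved but unallocated: {(reserved - allocated) / 2**20:.2f} MiB")


if __name__ == "__main__":
    if torch.cuda.is_available():
        report_gpu_memory(0)
```

The same setting can also be passed from the shell without editing the script, e.g. `PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128 python train.py`. Note that in the report above only 158.33 MiB is reserved but unallocated, so much of the pressure comes from the other processes sharing GPU 0 rather than from fragmentation alone.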