CUDA out of memory. Tried to allocate 128.00 MiB. GPU 0 has a total capacity of 21.99 GiB of which 55.06 MiB is free. Process 4089 has 492.00 MiB memory in use. Process 4086 has 492.00 MiB memory in use. Process 4110 has 492.00 MiB memory in use. Process 4096 has 492.00 MiB memory in use. Process 4111 has 492.00 MiB memory in use. Process 4108 has 492.00 MiB memory in use. Process 4103 has 988.00 MiB memory in use. Process 4122 has 1.96 GiB memory in use. Process 4092 has 990.00 MiB memory in use. Process 4084 has 990.00 MiB memory in use. Process 4090 has 3.67 GiB memory in use. Process 4125 has 2.75 GiB memory in use. Including non-PyTorch memory, this process has 2.75 GiB memory in use. Process 4100 has 2.37 GiB memory in use. Process 8950 has 2.57 GiB memory in use. Of the allocated memory 2.34 GiB is allocated by PyTorch, and 104.33 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large, try setting max_split_size_mb to avoid fragmentation. See the documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF.
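The last two sentences point at allocator fragmentation, which is configured through the PYTORCH_CUDA_ALLOC_CONF environment variable. Below is a minimal sketch of setting max_split_size_mb before torch initializes CUDA; the value 128 and the report_gpu_memory helper are illustrative assumptions, not part of the original report.

```python
import os

# Assumption: the script can set this itself. It must happen before the first
# CUDA allocation, otherwise the caching allocator ignores the setting.
# "max_split_size_mb:128" is an illustrative value, not a recommendation from the log.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:128")

import torch


def report_gpu_memory(device: int = 0) -> None:
    """Print PyTorch's allocated vs. reserved memory on the given GPU, in MiB."""
    allocated = torch.cuda.memory_allocated(device) / 2**20
    reserved = torch.cuda.memory_reserved(device) / 2**20
    print(f"GPU {device}: allocated {allocated:.1f} MiB, reserved {reserved:.1f} MiB")


if __name__ == "__main__":
    if torch.cuda.is_available():
        report_gpu_memory(0)
```

The same option can also be set in the shell without touching the code, e.g. PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128 python train.py (train.py stands in for the actual entry point). Note that this only addresses the "reserved by PyTorch but unallocated" portion of the message (104.33 MiB here), not the memory held by the other processes listed on GPU 0.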