CUDA out of memory. Tried to allocate 24.00 MiB. GPU 0 has a total capacity of 21.99 GiB, of which 7.06 MiB is free. Process 4089 has 492.00 MiB memory in use. Process 4086 has 492.00 MiB memory in use. Process 4110 has 492.00 MiB memory in use. Process 4096 has 492.00 MiB memory in use. Process 4111 has 492.00 MiB memory in use. Process 4108 has 492.00 MiB memory in use. Process 4122 has 3.12 GiB memory in use. Process 4084 has 990.00 MiB memory in use. Process 5240 has 994.00 MiB memory in use. Process 15620 has 2.56 GiB memory in use. Process 7394 has 2.87 GiB memory in use. Process 8312 has 2.87 GiB memory in use. Process 12745 has 988.00 MiB memory in use. Including non-PyTorch memory, this process has 1.04 GiB memory in use. Of the allocated memory, 735.54 MiB is allocated by PyTorch, and 26.46 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large, try setting max_split_size_mb to avoid fragmentation. See the documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF.
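The message points to the PYTORCH_CUDA_ALLOC_CONF environment variable as the knob for max_split_size_mb. Below is a minimal sketch of setting it from Python before CUDA is initialized; the 128 MiB value is an illustrative assumption, not a value taken from the message, and should be tuned for the workload.

import os

# PYTORCH_CUDA_ALLOC_CONF must be set before torch initializes CUDA,
# ideally before importing torch at all. 128 is an assumed starting
# value, not a recommendation from the error message.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch

if torch.cuda.is_available():
    # Compare allocated vs. reserved memory on GPU 0 to check whether
    # reserved-but-unallocated (fragmented) memory is actually large.
    print(torch.cuda.memory_summary(device=0, abbreviated=True))

The same setting can also be exported in the shell (export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128) so it takes effect without code changes.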