Thanks for your fascinating work!
I'm trying to run the provided training code on a 4060 Ti with 16 GB of VRAM, but it fails with "CUDA out of memory". How much GPU memory is needed to run train.py? I noticed that #13 mentioned it only took a little GPU memory. Alternatively, is there some trick that can reduce the GPU memory usage?
Thank you!
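In case it helps while waiting for an answer: a common generic way to cut peak memory in PyTorch training loops (not specific to this repo's train.py, whose internals I don't know) is gradient accumulation, i.e. splitting one large batch into several micro-batches and calling `optimizer.step()` only after accumulating their gradients. A minimal sketch, using a hypothetical tiny model and random data as stand-ins:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the real model and dataloader,
# just to illustrate the accumulation pattern.
model = nn.Linear(8, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

accum_steps = 4   # 4 micro-batches of 8 behave like one batch of 32
micro_batch = 8

opt.zero_grad()
for _ in range(accum_steps):
    x = torch.randn(micro_batch, 8)
    y = torch.randn(micro_batch, 1)
    # Scale the loss so the accumulated gradient equals the full-batch average.
    loss = loss_fn(model(x), y) / accum_steps
    loss.backward()   # gradients accumulate in param.grad across micro-batches
opt.step()            # one optimizer update for the whole effective batch
opt.zero_grad()
final_loss = loss.item()
```

Only the activations of one micro-batch are alive at a time, so peak memory drops roughly in proportion to `accum_steps` (model weights and optimizer state are unchanged). Mixed precision (`torch.cuda.amp.autocast`) and `torch.utils.checkpoint` are the other usual levers.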