CUDA Memory Error #39

Open
atulnagane45 opened this issue Mar 11, 2021 · 1 comment
Comments

@atulnagane45

I am getting this error:
RuntimeError: CUDA out of memory. Tried to allocate 648.00 MiB (GPU 0; 3.82 GiB total capacity; 1.57 GiB already allocated; 665.06 MiB free; 1.98 GiB reserved in total by PyTorch)

My hardware is a GeForce GTX 1650 with 4 GB of dedicated memory.
I also tried this on Google Colab.

I have already tried:
Reducing the batch size from 128 to 2
Clearing the GPU cache

I think this needs to be fixed in the code; some variables seem to be taking up memory unnecessarily. Please help me.
Thanks in advance.
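For reference, a minimal sketch of the usual PyTorch memory-saving steps mentioned above (smaller batch size, cleared allocator cache, optional mixed precision), assuming a standard training loop. `model`, `criterion`, `optimizer`, and `dataset` are hypothetical placeholders, not this repository's code:

```python
import torch
from torch.utils.data import DataLoader

# Release cached blocks the CUDA allocator is still holding on to.
torch.cuda.empty_cache()

# Smaller batches mean smaller activation tensors during forward/backward.
loader = DataLoader(dataset, batch_size=2, shuffle=True)

# Optional: mixed precision roughly halves activation memory on supported GPUs.
scaler = torch.cuda.amp.GradScaler()

for inputs, targets in loader:
    inputs, targets = inputs.cuda(), targets.cuda()
    # set_to_none=True frees gradient tensors instead of filling them with zeros.
    optimizer.zero_grad(set_to_none=True)
    with torch.cuda.amp.autocast():
        loss = criterion(model(inputs), targets)
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```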

@atulnagane45
Author

Solved on Google Colab with batch size 2. What is the minimum hardware requirement?
