
memory usage #27

Open · s3xton opened this issue Feb 15, 2017 · 3 comments

@s3xton commented Feb 15, 2017

Is this model particularly memory intensive? The model hangs while building the loss model for seq_length 10, having already used 16 GB of RAM.

This might be caused by the fact that I am using tensorflow 1.0.0-rc1 (because of #26: legacy_seq2seq is present in 1.0.0-rc1 but not in 0.12.1). I've modified your code to run on this version, and it trains correctly with shorter sequences, but it still eats memory.

Is this normal, or is it some sort of memory leak caused by the newer version of tensorflow?

@henghuiz-zz commented

I think the issue is that this code builds a separate model for every possible sequence length, so it takes a huge amount of memory. I was trying to improve it, but I soon found that I didn't need the NTM and gave up.
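To make that concrete, here is a minimal sketch of the two graph-building strategies, assuming TF 1.0-era APIs. Neither function is the repo's actual code and the names are made up: the first statically unrolls one sub-graph per sequence length, the second builds a single graph with tf.nn.dynamic_rnn that unrolls at run time.

```python
import tensorflow as tf

HIDDEN, INPUT_DIM = 100, 8

# Pattern 1 (roughly what this repo does): one statically unrolled
# sub-graph per sequence length. Graph size, and therefore memory,
# grows with the number of lengths supported.
def build_static_models(max_length):
    models = {}
    cell = tf.contrib.rnn.BasicLSTMCell(HIDDEN)
    for length in range(1, max_length + 1):
        with tf.variable_scope("ntm", reuse=(length > 1)):
            inputs = tf.placeholder(tf.float32, [None, length, INPUT_DIM])
            # static_rnn adds `length` copies of the cell's ops to the graph
            outputs, _ = tf.contrib.rnn.static_rnn(
                cell, tf.unstack(inputs, axis=1), dtype=tf.float32)
            models[length] = (inputs, outputs)
    return models

# Pattern 2: a single graph whose time loop runs inside the session.
# dynamic_rnn wraps tf.while_loop, so one set of ops serves every
# sequence length up to the placeholder's (unknown) time dimension.
def build_dynamic_model():
    cell = tf.contrib.rnn.BasicLSTMCell(HIDDEN)
    inputs = tf.placeholder(tf.float32, [None, None, INPUT_DIM])
    lengths = tf.placeholder(tf.int32, [None])
    outputs, _ = tf.nn.dynamic_rnn(
        cell, inputs, sequence_length=lengths, dtype=tf.float32)
    return inputs, lengths, outputs
```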

@s3xton (Author) commented Feb 15, 2017

Ah, I see, thank you. Do you have any suggestions on how to approach solving this?

@camigord commented

Hi @s3xton, check this implementation I have recently developed following Mostafa-Samir's approach.
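For reference, a sketch of the general idea behind fixes of this kind: build the time recurrence once in-graph with tf.while_loop, so graph size stays constant regardless of sequence length. tf.nn.dynamic_rnn does this for plain RNNCells; a hand-written loop gives the same effect when the step also touches external memory, as an NTM step does. The toy step function below is illustrative and not taken from either repo.

```python
import tensorflow as tf

INPUT_DIM, STATE = 8, 100

inputs = tf.placeholder(tf.float32, [None, None, INPUT_DIM])  # time dim unknown
batch = tf.shape(inputs)[0]
steps = tf.shape(inputs)[1]

# Weights are created once, outside the loop, and only read inside it.
W = tf.get_variable("w", [INPUT_DIM + STATE, STATE])

def step(x_t, state):
    # Stand-in for a full NTM step (controller + memory read/write).
    new_state = tf.tanh(tf.matmul(tf.concat([x_t, state], 1), W))
    return new_state, new_state

def body(t, state, outputs):
    out, state = step(inputs[:, t, :], state)
    return t + 1, state, outputs.write(t, out)

# One copy of the step ops in the graph; the loop runs at session time.
outputs_ta = tf.TensorArray(tf.float32, size=steps)
_, final_state, outputs_ta = tf.while_loop(
    lambda t, *_: t < steps, body,
    [tf.constant(0), tf.zeros([batch, STATE]), outputs_ta])
outputs = tf.transpose(outputs_ta.stack(), [1, 0, 2])  # [batch, time, STATE]
```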
