Is this model particularly memory-intensive? Graph construction hangs while building the loss for seq_length 10, having already used 16 GB of RAM.
This might be caused by the fact that I am using TensorFlow 1.0.0-rc1 (because of #26: legacy_seq2seq is present in 1.0.0-rc1 but not in 0.12.1). I've modified your code to run on this version, and it trains correctly with shorter sequences, but still eats memory.
Is this normal, or is it a memory leak caused by the newer version of TensorFlow?
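For reference, the version difference can be papered over with an import shim along these lines (a sketch; it just tries the TF 1.0 module path first and falls back to the old one):

```python
import tensorflow as tf

# TF 1.0 moved the old seq2seq helpers (including sequence_loss) from
# tf.nn.seq2seq to tf.contrib.legacy_seq2seq; fall back for 0.12.x.
try:
    from tensorflow.contrib import legacy_seq2seq as seq2seq  # TF >= 1.0
except ImportError:
    seq2seq = tf.nn.seq2seq  # TF 0.12.x
```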
I think the issue is that this code builds a separate model for every possible sequence length, so it takes a huge amount of memory. I was trying to improve it, but I soon found I didn't need the NTM anymore and gave up.
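For anyone else hitting this: the usual remedy for plain RNNs is to build a single graph with `tf.nn.dynamic_rnn` and feed the actual lengths at run time, instead of unrolling one static graph per length. A minimal sketch (shapes and names are hypothetical, and an NTM cell would have to be wrapped as an `RNNCell` for this to apply):

```python
import tensorflow as tf

batch_size, max_len, input_dim, hidden = 32, 20, 8, 100

inputs = tf.placeholder(tf.float32, [batch_size, max_len, input_dim])
lengths = tf.placeholder(tf.int32, [batch_size])  # true length per example

# In 1.0.0-rc1 the cells live in tf.contrib.rnn (not tf.nn.rnn_cell).
cell = tf.contrib.rnn.BasicLSTMCell(hidden)

# dynamic_rnn unrolls via a while_loop, so graph size stays constant in
# max_len, and steps past `lengths` are skipped rather than computed.
outputs, state = tf.nn.dynamic_rnn(
    cell, inputs, sequence_length=lengths, dtype=tf.float32)
```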