Issue Type
Bug
Source
source
Keras Version
keras 2.15
Custom Code
Yes
OS Platform and Distribution
macOS
Python version
3.10
GPU model and memory
No response
Current Behavior?
In https://github.com/keras-team/keras-io/blob/master/examples/nlp/neural_machine_translation_with_transformer.py, line 362 passes `attention_mask=padding_mask` to the decoder's cross-attention layer. If the sequence lengths of `encoder_inputs` and `decoder_inputs` differ, the code raises a shape-mismatch error, because `padding_mask` is built from `decoder_inputs` while the cross-attention keys/values come from the encoder outputs. I think `attention_mask` should instead be derived from the padding mask of `encoder_inputs`, not the padding mask of `decoder_inputs`; see the sketch below.
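One way the suggested change could look is sketched here. This is a minimal sketch, not the tutorial's actual code: it assumes the decoder's `call()` is also given the encoder-side padding mask, and the names `encoder_mask` and `build_cross_attention_mask` are hypothetical, introduced only for illustration.

```python
import tensorflow as tf

# Minimal sketch, assuming `encoder_mask` has shape (batch, source_len) and marks
# non-padding encoder tokens with 1. Cross-attention masks the keys/values, i.e.
# the encoder tokens, so the mask's last axis must be the encoder length.
def build_cross_attention_mask(encoder_mask, target_len):
    mask = tf.cast(encoder_mask[:, tf.newaxis, :], dtype="int32")  # (batch, 1, source_len)
    return tf.tile(mask, [1, target_len, 1])                       # (batch, target_len, source_len)

# Inside TransformerDecoder.call(), the cross-attention call would then become:
# attention_output_2 = self.attention_2(
#     query=out_1,
#     value=encoder_outputs,
#     key=encoder_outputs,
#     attention_mask=build_cross_attention_mask(encoder_mask, tf.shape(out_1)[1]),
# )
```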
Standalone code to reproduce the issue or tutorial link
https://keras.io/examples/nlp/neural_machine_translation_with_transformer/
https://github.com/keras-team/keras-io/blob/master/examples/nlp/neural_machine_translation_with_transformer.py
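In addition to the links above, here is a hypothetical minimal snippet (not taken from the tutorial) that reproduces the underlying shape mismatch: a cross-attention call whose mask is built from the decoder length while the keys/values use a different encoder length.

```python
import tensorflow as tf

batch, target_len, source_len, dim = 2, 5, 8, 16  # decoder and encoder lengths differ
query = tf.random.normal((batch, target_len, dim))    # decoder-side states (like out_1)
enc_out = tf.random.normal((batch, source_len, dim))  # encoder outputs

# Mask built from the decoder side, as in the example: (batch, target_len, target_len)
decoder_padding_mask = tf.ones((batch, target_len, target_len), dtype="int32")

mha = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=dim)

# This should fail with a shape/broadcast error, because the mask's last axis
# (target_len=5) does not match the key/value sequence length (source_len=8).
mha(query=query, value=enc_out, key=enc_out, attention_mask=decoder_padding_mask)
```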
Relevant log output
No response