
Added Support for Returning Attention Scores in TransformerEncoder call #1879

Open
wants to merge 2 commits into master
Conversation

anirudhr20
Summary: This pull request adds support for optionally returning attention scores from the TransformerEncoder class. The behavior is controlled by a return_attention_scores flag: when set to True, the call method returns both the encoder output and the attention scores from the attention mechanism.

Changes Introduced:

  • Updated the call method in the TransformerEncoder to handle the return_attention_scores flag.
  • Refactored the code to ensure that the attention scores are computed and returned when required.
  • Updated the documentation in the call method to reflect the changes.
  • Added unit tests to ensure the correctness of this feature using pytest.
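The flag described above can be illustrated with a minimal, self-contained NumPy sketch of scaled dot-product attention. This is not the actual KerasNLP implementation; the function name, shapes, and single-head simplification are assumptions made for illustration only.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, return_attention_scores=False):
    """Single-head scaled dot-product attention (illustrative sketch).

    q, k, v: arrays of shape (batch, seq_len, d_model).
    When return_attention_scores=True, also returns the softmax
    attention weights of shape (batch, seq_len, seq_len).
    """
    d_k = q.shape[-1]
    # Similarity logits between queries and keys, scaled by sqrt(d_k).
    logits = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    # Numerically stable softmax over the key axis.
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    output = weights @ v
    if return_attention_scores:
        return output, weights
    return output

x = np.random.rand(2, 4, 8)  # (batch=2, seq_len=4, d_model=8)
out, scores = scaled_dot_product_attention(x, x, x, return_attention_scores=True)
# out has shape (2, 4, 8); scores has shape (2, 4, 4),
# and each row of scores sums to 1.
```

A caller that does not pass the flag gets only the output, so existing code keeps working unchanged; this mirrors the backward-compatible design of the change in this PR.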

Testing:
Ran the unit tests to verify that the return_attention_scores flag works as expected, including the return_attention_scores=True case (verifies that attention scores are returned and that their shapes are correct).
Related Issue: #1644


google-cla bot commented Sep 25, 2024

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.
