
Fix shape mismatch in Keras Attention layer during masking #21595


Open
sonali-kumari1 wants to merge 5 commits into master

Conversation

sonali-kumari1 (Contributor)

This PR fixes the RuntimeError encountered in the Attention layer when using masks. The error arises from a shape mismatch between the mask and attention scores, resulting in a broadcasting issue during training.
Fixes: #21483
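
For context, here is a minimal sketch of the failing setup, assuming shapes along the lines of those reported in #21483; the concrete values are illustrative only.

import numpy as np
from keras import layers

# Query/value tensors plus 2D boolean masks over the time dimensions.
batch_size, Tq, Tv, dim = 2, 3, 4, 5
query = np.random.random((batch_size, Tq, dim)).astype("float32")
value = np.random.random((batch_size, Tv, dim)).astype("float32")
query_mask = np.ones((batch_size, Tq), dtype="bool")
value_mask = np.ones((batch_size, Tv), dtype="bool")

layer = layers.Attention()
# Before this fix, the 2D value mask could fail to broadcast against the
# (batch_size, Tq, Tv) attention scores, raising the RuntimeError.
output = layer([query, value], mask=[query_mask, value_mask])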

@gemini-code-assist (bot) left a comment

Summary of Changes

Hello @sonali-kumari1, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request resolves a critical RuntimeError within the Keras Attention layer that occurred when using masks. The issue stemmed from an incompatibility in shape between the attention scores and the provided masks, which prevented proper broadcasting. The fix ensures that the mask's dimensions are correctly aligned with the attention scores, allowing the masking operation to proceed without errors and improving the robustness of the Attention layer.

Highlights

  • Keras Attention Layer Bug Fix: Addresses a RuntimeError encountered in the Keras Attention layer. This error was caused by a shape mismatch between the attention scores and the mask, leading to broadcasting issues during training.
  • Mask Dimension Handling: Introduces a conditional check within the _apply_scores method to ensure the padding_mask has the correct dimensions. If the mask is 2D, it is expanded to 3D by adding a new dimension at axis=-2, resolving the broadcasting conflict (see the sketch below).
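
For illustration, here is a minimal sketch of that conditional as a standalone helper. The helper name _make_mask_broadcastable is hypothetical; in the patch itself, the check lives inline in _apply_scores.

from keras import ops

def _make_mask_broadcastable(padding_mask):
    # Attention scores have shape (batch_size, Tq, Tv); a mask derived
    # from the value mask may arrive as 2D with shape (batch_size, Tv).
    if padding_mask is not None and len(padding_mask.shape) == 2:
        # Expand to (batch_size, 1, Tv) so the mask broadcasts across
        # the Tq axis of the scores tensor.
        padding_mask = ops.expand_dims(padding_mask, axis=-2)
    return padding_mask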

@gemini-code-assist (bot) left a comment

Code Review

This pull request addresses a RuntimeError in the Keras Attention layer that occurs when using a 2D mask. The error is due to a shape mismatch between the 2D mask and the 3D attention scores, which prevents proper broadcasting. The proposed fix correctly identifies when the padding mask is 2D and expands its dimensions to make it compatible for broadcasting with the scores tensor. The change is concise, well-targeted, and effectively resolves the bug. The implementation is correct.

@codecov-commenter commented Aug 18, 2025

Codecov Report

❌ Patch coverage is 50.00000% with 1 line in your changes missing coverage. Please review.
✅ Project coverage is 82.71%. Comparing base (df481e9) to head (613838f).
⚠️ Report is 58 commits behind head on master.

Files with missing lines: keras/src/layers/attention/attention.py · Patch: 50.00% · Lines: 0 missing, 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master   #21595      +/-   ##
==========================================
- Coverage   82.84%   82.71%   -0.13%     
==========================================
  Files         565      568       +3     
  Lines       55656    56897    +1241     
  Branches     8685     8890     +205     
==========================================
+ Hits        46108    47063     +955     
- Misses       7433     7640     +207     
- Partials     2115     2194      +79     
Flag Coverage Δ
keras 82.52% <50.00%> (-0.14%) ⬇️
keras-jax 63.65% <50.00%> (+0.25%) ⬆️
keras-numpy 58.25% <50.00%> (-0.34%) ⬇️
keras-openvino 34.55% <0.00%> (+0.60%) ⬆️
keras-tensorflow 64.21% <50.00%> (+0.34%) ⬆️
keras-torch 63.81% <50.00%> (+0.34%) ⬆️

Flags with carried forward coverage won't be shown.


@gbaned requested a review from hertschuh August 18, 2025 07:23
@gbaned added this to PR Queue Aug 18, 2025
@github-project-automation (bot) moved this to Assigned Reviewer in PR Queue Aug 18, 2025
@hertschuh (Collaborator) left a comment

Thank you for the PR!

Can you add a unit test that exercises the case that was failing before this change?

@fchollet (Collaborator) left a comment

Thanks for the PR! Please add a unit test.

@sonali-kumari1 (Contributor, Author)

Hi @fchollet @hertschuh -
I have added a unit test that verifies the 2D mask shape mismatch case. Thanks!

@@ -88,11 +88,13 @@ def test_attention_with_mask(self):

     def test_attention_2D_mask_shape_mismatch(self):
         layer = layers.Attention()
-        batch_size, Tq, Tv, dim = 2, 3, 3, 4
+        batch_size, Tq, Tv, dim = 2, 3, 4, 4
@hertschuh (Collaborator)

Sorry to bother you again about this, but can you make all values unique? Like use 5 for dim.

Again, we want to make sure dimensions are matched up correctly in unit tests.

@sonali-kumari1 (Contributor, Author)

@hertschuh, I updated the unit test to use distinct values for batch_size, Tq, Tv and dim to properly detect shape mismatches. Thank you!
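
For reference, a sketch of how the updated test may read with fully distinct dimensions, assuming the test module's existing layers and np imports; the exact body merged in the PR may differ.

def test_attention_2D_mask_shape_mismatch(self):
    layer = layers.Attention()
    # All four values are distinct so a misaligned axis cannot silently pass.
    batch_size, Tq, Tv, dim = 2, 3, 4, 5
    query = np.random.random((batch_size, Tq, dim)).astype("float32")
    value = np.random.random((batch_size, Tv, dim)).astype("float32")
    query_mask = np.ones((batch_size, Tq), dtype="bool")
    value_mask = np.ones((batch_size, Tv), dtype="bool")
    output = layer([query, value], mask=[query_mask, value_mask])
    self.assertEqual(output.shape, (batch_size, Tq, dim))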

@hertschuh (Collaborator) left a comment

Thank you for the fix!

@google-ml-butler (bot) added the kokoro:force-run and ready to pull labels Aug 20, 2025
@github-project-automation (bot) moved this from Assigned Reviewer to Approved by Reviewer in PR Queue Aug 20, 2025
Labels: awaiting review, ready to pull, size:XS
Projects
Status: Approved by Reviewer
Development

Successfully merging this pull request may close these issues.

Attention layer issue broadcasting mask
6 participants