Add TLX attention (WS pipelined pingpong hopper) #320

Open
yf225 wants to merge 1 commit into main from tlx_attn_hopper

Conversation

@yf225 (Contributor) commented Jul 31, 2025

No description provided.

@yf225 force-pushed the tlx_attn_hopper branch from 9e4ffe6 to 6d5389b on July 31, 2025 at 04:29
@yf225 temporarily deployed to docker-s3-upload on July 31, 2025 at 04:30 with GitHub Actions (now inactive)

@njriasan (Contributor) left a comment:

See my comment but otherwise this looks good. Thanks!

@@ -299,6 +311,16 @@ def triton_tutorial_flash_v2_tma(
            q, k, v, self.causal, self.sm_scale, "tma"
        )

    @register_benchmark(enabled=HAS_TLX)
@njriasan (Contributor) commented on the @register_benchmark line:

Is the intention to run this on CI? If you just want to be able to trigger this test directly, you can do so with enabled=False, without any risk of it running on hardware you don't expect (e.g. MI300), which it will by default right now.
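
A minimal sketch of the suggested gating, for context. The register_benchmark decorator and the enabled keyword come from the diff and comment above; the method name and body here are hypothetical.

    # With enabled=False the benchmark is skipped in the default sweep and
    # only runs when selected explicitly, so it cannot fire on hardware the
    # author did not intend (e.g. MI300). register_benchmark is the
    # decorator already used throughout this operator file.
    @register_benchmark(enabled=False)
    def tlx_attn_ws_pipelined_pingpong(self, q, k, v):  # hypothetical name
        ...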


import triton
import triton.language as tl
import triton.language.extra.tlx as tlx
@njriasan (Contributor) commented on these imports:

I'm assuming we have tested that this doesn't crash without tlx, given the guards in the operator.py file.
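
For reference, a minimal sketch of the guard pattern this likely refers to, assuming a conventional try/except import layout in operator.py; the exact names may differ.

    # Guard the optional TLX dependency so that importing operator.py never
    # fails on Triton builds that lack triton.language.extra.tlx.
    try:
        import triton.language.extra.tlx as tlx  # noqa: F401

        HAS_TLX = True
    except ImportError:
        tlx = None
        HAS_TLX = False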
