
Conversation

@xuzhao9 xuzhao9 commented Sep 24, 2025

Upstream PyTorch now has CUDA 13.0 support.


xuzhao9 commented Sep 26, 2025

We have to hold the upgrade because flash_attention does not support CUDA 13.0 yet.
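
Until flash_attention gains CUDA 13.0 support, a build or CI script could gate the upgrade on the detected CUDA version. The sketch below is a hypothetical guard, not part of this PR; the helper names and the 12.x cutoff are assumptions for illustration.

```python
# Hypothetical guard: only enable flash_attention when the CUDA toolkit
# version is at or below the last version it is known to build against.
# The "12.9" cutoff below is an illustrative assumption, not a real limit.

def parse_cuda_version(version: str) -> tuple[int, int]:
    """Parse a CUDA version string like "12.4" into a (major, minor) tuple."""
    major, minor = version.split(".")[:2]
    return int(major), int(minor)

def flash_attention_supported(cuda_version: str,
                              max_supported: str = "12.9") -> bool:
    """Return True if cuda_version does not exceed the assumed cutoff."""
    return parse_cuda_version(cuda_version) <= parse_cuda_version(max_supported)

print(flash_attention_supported("12.4"))  # True: within the supported range
print(flash_attention_supported("13.0"))  # False: hold the upgrade
```

In a CI pipeline, such a check could decide whether to install flash_attention at all or to fall back to a non-flash attention path until upstream support lands.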
