[PyTorch] Fix FlashAttention 2 head_dim > 192 on sm103 and other architectures #15773

This workflow is awaiting approval from a maintainer in #2836
Triggered via pull request on April 4, 2026 at 19:08
Status: Action required
docs.yml

on: pull_request
Build