Flex attention integration with block mask cache #2303
Annotations
4 errors
src/fairseq2/models/transformer/_sdpa/_flex.py#L17: 'fairseq2.logging.log' imported but unused
src/fairseq2/models/transformer/_sdpa/_flex.py#L0: Imports are incorrectly sorted and/or formatted.
The logs for this run have expired and are no longer available.
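The PR title mentions a block mask cache for flex attention. As a minimal, language-level sketch of the caching idea only (all names here are hypothetical and not fairseq2's actual API), the expensive block-mask construction can be memoized per shape, so repeated calls with the same sequence lengths reuse the prior mask instead of rebuilding it:

```python
from functools import lru_cache


# Hypothetical sketch: building a block-level attention mask is costly,
# so cache it keyed by (query length, key length, block size). This does
# not reflect fairseq2's real implementation, which is not shown here.
@lru_cache(maxsize=32)
def cached_block_mask(
    q_len: int, kv_len: int, block_size: int = 128
) -> tuple[tuple[bool, ...], ...]:
    """Return a causal block-level mask: block (i, j) is kept when the
    first key position in block j is visible to the last query position
    in block i."""
    n_q = -(-q_len // block_size)  # ceil division
    n_kv = -(-kv_len // block_size)
    return tuple(
        tuple(j * block_size <= (i + 1) * block_size - 1 for j in range(n_kv))
        for i in range(n_q)
    )


mask = cached_block_mask(256, 256)
# A second call with the same shape returns the cached object.
assert cached_block_mask(256, 256) is mask
```

In a real integration the cached value would be the framework's block-mask object rather than a tuple of booleans, and the cache key would include any mask-modifying options.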