Remove BIONEMO-2840 sm120 fused attention workarounds #1538
Open
pstjohn wants to merge 2 commits into NVIDIA:main from
Conversation
Collaborator, Author
/ok to test 040eed7
The THD implementation for fused attention on sm120 (Blackwell) is now
available in Transformer Engine, so these workarounds are no longer needed.
Removes (see the sketch below for the patterns in question):
- pytest.xfail guards for sm120 in test_modeling_common.py (6 files)
- monkeypatch.setenv("NVTE_FUSED_ATTN", "0") blocks in the esm2 recipe tests
- Unused monkeypatch parameters and torch imports where applicable
Signed-off-by: svc-bionemo <267129667+svc-bionemo@users.noreply.github.com>
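For reference, the removed workarounds followed this general shape. This is a minimal sketch, not the repository's actual test code: the test name and body are illustrative, though the sm120 capability check, the pytest.xfail guard, and the NVTE_FUSED_ATTN toggle correspond to the patterns named above.

```python
import pytest
import torch


def test_fused_attention_forward(monkeypatch):
    # Workaround 1 (removed): xfail on sm120 (Blackwell), where Transformer
    # Engine previously had no THD fused-attention implementation.
    if torch.cuda.get_device_capability() == (12, 0):
        pytest.xfail("THD fused attention not yet supported on sm120")

    # Workaround 2 (removed): force-disable fused attention entirely via
    # Transformer Engine's environment toggle.
    monkeypatch.setenv("NVTE_FUSED_ATTN", "0")

    ...  # model forward pass and assertions go here
```

With the THD kernels now available in Transformer Engine, both branches go away, which in turn leaves the monkeypatch parameter and the torch import unused in several tests, hence the third bullet.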
040eed7 to bdeba12
Pre-commit (ruff format) flagged 7 functions with a blank line between the def signature and the first body line; removed the blank lines. Signed-off-by: svc-bionemo <267129667+svc-bionemo@users.noreply.github.com>
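The flagged pattern, for illustration (the function here is hypothetical; ruff's formatter drops a blank line that sits immediately after a def signature):

```python
# Before: blank line between the def signature and the first body line,
# which "ruff format" flags and removes.
def scaled(x, factor=2):

    return x * factor


# After formatting: no blank line after the signature.
def scaled(x, factor=2):
    return x * factor
```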