Remove BIONEMO-2840 sm120 fused attention workarounds#1538

Open
pstjohn wants to merge 2 commits into NVIDIA:main from svc-bionemo:pstjohn/remove-bionemo-2840-workarounds

Conversation

@pstjohn
Collaborator

@pstjohn pstjohn commented Mar 30, 2026

The THD implementation for fused attention on sm120 (Blackwell) is now available in Transformer Engine, so these workarounds are no longer needed.

Removes:

  • pytest.xfail guards for sm120 in test_modeling_common.py (6 files)
  • monkeypatch.setenv("NVTE_FUSED_ATTN", "0") blocks in esm2 recipe tests
  • Unused monkeypatch parameters and torch imports where applicable (see the sketch below)
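
For context, a minimal sketch of what these workarounds typically looked like; the test names and structure are illustrative, not the exact removed code:

```python
# Hypothetical sketch of the removed workaround pattern (not the actual
# BioNeMo test code).
import pytest
import torch


def is_sm120() -> bool:
    # Compute capability (12, 0) corresponds to sm120 (Blackwell).
    return torch.cuda.get_device_capability() == (12, 0)


def test_fused_attention_forward():
    if is_sm120():
        # Removed: THD fused attention now works on sm120 in Transformer Engine.
        pytest.xfail("BIONEMO-2840: no THD fused attention on sm120")
    ...


def test_esm2_recipe(monkeypatch):
    # Removed: forced Transformer Engine to fall back to unfused attention.
    monkeypatch.setenv("NVTE_FUSED_ATTN", "0")
    ...
```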

@copy-pr-bot

copy-pr-bot bot commented Mar 30, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@coderabbitai
Contributor

coderabbitai bot commented Mar 30, 2026

Important

Review skipped

Auto reviews are disabled on this repository. Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

⚙️ Run configuration

Configuration used: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 2796c0d4-5346-4ccc-8bfe-1dd39a73b762

You can disable this status message by setting reviews.review_status to false in the CodeRabbit configuration file.
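
A minimal sketch of that setting, assuming the standard CodeRabbit configuration schema:

```yaml
# Hypothetical .coderabbit.yaml excerpt: hides the "Review skipped" status message.
reviews:
  review_status: false
```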


Comment @coderabbitai help to get the list of available commands and usage tips.

@pstjohn
Collaborator Author

pstjohn commented Mar 30, 2026

/ok to test 040eed7

The THD implementation for fused attention on sm120 (Blackwell) is now
available in Transformer Engine, so these workarounds are no longer needed.

Removes:
- pytest.xfail guards for sm120 in test_modeling_common.py (6 files)
- monkeypatch.setenv("NVTE_FUSED_ATTN", "0") blocks in esm2 recipe tests
- Unused monkeypatch parameters and torch import where applicable

Signed-off-by: svc-bionemo <267129667+svc-bionemo@users.noreply.github.com>
@svc-bionemo svc-bionemo force-pushed the pstjohn/remove-bionemo-2840-workarounds branch from 040eed7 to bdeba12 on April 1, 2026 18:30
Pre-commit (ruff format) flagged 7 functions with a blank line between
the def signature and the first body line. Removed those blank lines (see the example below).

Signed-off-by: svc-bionemo <267129667+svc-bionemo@users.noreply.github.com>
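
For context, an illustrative before/after of the pattern ruff format flags (hypothetical function, not one of the seven touched here):

```python
# Before: a blank line directly after the def signature; ruff format strips it.
def scale(x):

    return 2 * x


# After: the first body line immediately follows the signature.
def scale(x):
    return 2 * x
```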