
feat: add OpenCode as an AI provider#1685

Open

bjcoombs wants to merge 7 commits into eyaltoledano:main from bjcoombs:1684-opencode-provider

Conversation


@bjcoombs bjcoombs commented Apr 14, 2026

Summary

Adds OpenCode as a new CLI-backed AI provider, mirroring the existing claude-code, codex-cli, and gemini-cli integrations. Task Master delegates authentication, model routing, and provider selection to a running OpenCode server via ai-sdk-provider-opencode-sdk (same community author as the other CLI-provider deps), staying agnostic about the underlying LLM.

Closes the provider half of #965 (which was closed in favour of the rules-profile work in #970) and addresses #1684.

Motivation

Users whose organisations permit OpenCode - including OpenCode's GitHub Copilot backend - but prohibit direct LLM API credentials currently cannot use Task Master at all. Every existing provider either needs a direct LLM API key, or is tied to a specific CLI tool that may not be approved. Delegating to a locally-running OpenCode server sidesteps that entirely: Task Master stays out of the authentication path.

End-to-end verified against a real OpenCode + GitHub Copilot setup - parse-prd produces valid task trees via github-copilot/gpt-4.1 and github-copilot/gpt-5.2.

Changes

  • src/ai-providers/opencode.js - New OpencodeProvider wrapping ai-sdk-provider-opencode-sdk@^0.0.3. Synchronous getClient matching the other CLI providers' contract. Includes a JSDoc troubleshooting block for the two common upstream SDK failure modes (see "Known caveats" below).
  • src/ai-providers/index.js - Export the new provider class.
  • scripts/modules/ai-services-unified.js - Register under key opencode in the PROVIDERS map.
  • scripts/modules/config-manager.js - Add opencode section to DEFAULTS, merge in _loadAndValidateConfig, expose getOpencodeSettings / getOpencodeSettingsForCommand, add OpenCode to the keyless-provider short-circuit in getMcpApiKeyStatus and to hasCodebaseAnalysis / providersWithoutApiKeys / isApiKeySet so it's consistently treated as a no-API-key provider.
  • scripts/modules/commands.js - Add --opencode flag to task-master models alongside the other CLI-provider flags, wire it into the provider-hint cascades for --set-main / --set-research / --set-fallback.
  • scripts/modules/task-manager/models.js - Add OPENCODE case to setModel's providerHint handling; accepts unknown model IDs with a warning since OpenCode's model catalog is dynamic.
  • packages/tm-core/src/common/constants/providers.ts - Add OPENCODE: 'opencode' to CUSTOM_PROVIDERS.
  • scripts/modules/supported-models.json - Seed with 7 verified model IDs that exist in a default OpenCode install (Anthropic direct + GitHub Copilot path). IDs cross-checked against opencode models output so the starter list doesn't ship with unreachable entries.
  • tests/unit/ai-providers/opencode.test.js - 5 new tests mirroring codex-cli.test.js: construction, API-key-not-required, client creation via config-manager, case-insensitive model-support check, error wrapping on missing CLI.
  • tests/unit/ai-providers/ai-services-unified.test.js / tests/unit/config-manager.test.js - Mock updates to accommodate the new provider.
  • .changeset/add-opencode-provider.md - User-facing changeset (minor bump).
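The provider surface described in the bullets above can be sketched as follows. This is an illustrative shape, not the shipped implementation: `BaseAIProvider` here is a stand-in for the repo's shared provider base class, and the real class wraps `ai-sdk-provider-opencode-sdk`.

```javascript
// Illustrative sketch of the OpencodeProvider contract described above.
// BaseAIProvider is a stand-in; the real class extends the repo's shared base.
class BaseAIProvider {
	handleError(context, error) {
		throw new Error(`${context}: ${error.message}`);
	}
}

class OpencodeProvider extends BaseAIProvider {
	constructor(supportedModels = []) {
		super();
		// Guard against non-array config returns (added during review).
		this.supportedModels = Array.isArray(supportedModels) ? supportedModels : [];
		this.name = 'OpenCode';
	}

	// OpenCode manages its own credentials; Task Master never sees an API key.
	isRequiredApiKey() {
		return false;
	}

	// Case-insensitive match against the seeded model list. Unknown IDs are
	// accepted elsewhere with a warning, since OpenCode's catalog is dynamic.
	isModelSupported(modelId) {
		if (!modelId) return false;
		const id = String(modelId).toLowerCase();
		return this.supportedModels.some((m) => String(m).toLowerCase() === id);
	}
}
```

For example, `new OpencodeProvider(['github-copilot/gpt-4.1']).isModelSupported('GitHub-Copilot/GPT-4.1')` matches despite the case difference.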

Testing

  • npm test tests/unit/ai-providers/ - 229 tests pass across 18 suites including the 5 new OpencodeProvider tests.
  • npm run turbo:typecheck - clean across all 9 packages.
  • biome format - passes.
  • Live end-to-end: task-master parse-prd generated a valid 3-task tree via opencode provider with github-copilot/gpt-4.1 backend. Proves the full plumbing: Task Master → OpencodeProvider → ai-sdk-provider-opencode-sdk → OpenCode server → GitHub Copilot.

Risk Assessment

  • Low, additive only. No existing provider code paths touched functionally. OpenCode is opt-in via task-master models --setup or --opencode flag.
  • Dependency: ai-sdk-provider-opencode-sdk@^0.0.3 (the ai-sdk-v5 tag line, matching the repo's ai@^5.0.51). Same community maintainer as the existing ai-sdk-provider-claude-code, -codex-cli, -gemini-cli deps. v0.0.3 specifically includes the zod 3/4 compatibility fix (Fix zod 4 incompatibility in loggerSchema (closes #17) ben-vargas/ai-sdk-provider-opencode-sdk#18 - landed during this PR's review) required to even import the SDK in this project's zod-4 tree.
  • Runtime requirement: OpenCode must be installed and configured. validateAuth() performs a lightweight opencode --version probe and surfaces install guidance if missing, matching the pattern in codex-cli.js.

Known caveats

Two failure modes that surface from upstream @opencode-ai/sdk and aren't fixable in this repo:

  1. Invalid model ID (e.g., typing github-copilot/gpt-5 instead of gpt-5.2) produces Unexpected end of JSON input because the underlying client blindly calls response.json() on OpenCode's empty error body. Mitigation: seeded model list is verified against a real OpenCode install; the provider's JSDoc directs users to opencode models if they hit this.
  2. Reasoning models with long responses can trip TypeError: terminated when OpenCode's default HTTP timeout expires mid-response. Mitigation: JSDoc directs users at the serverTimeout setting in .taskmaster/config.json's opencode section.

Both are tracked upstream at ben-vargas/ai-sdk-provider-opencode-sdk#21. The SDK maintainer has indicated the ai-sdk-v5 tag line is frozen, so a deeper fix would require either a task-master-side fork of the SDK or a migration of the wider task-master codebase to AI SDK v6 - both out of scope for this PR.
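For caveat 2, the JSDoc points users at a `serverTimeout` setting in the config's `opencode` section. A minimal sketch of that fragment in `.taskmaster/config.json` (the timeout value shown is an arbitrary example):

```json
{
	"opencode": {
		"serverTimeout": 600000
	}
}
```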

Deployment Notes

  • No config migration required. Existing Task Master installations are unaffected unless a user explicitly selects OpenCode.
  • task-master models --setup picks up the new provider automatically via the existing supported-models-driven flow.

Test plan

  • Unit tests pass locally
  • Typecheck clean
  • Format check clean
  • task-master models --setup surfaces OpenCode and accepts a model selection
  • task-master parse-prd succeeds with OpenCode selected as main model (verified with github-copilot/gpt-4.1)

@changeset-bot

changeset-bot bot commented Apr 14, 2026

🦋 Changeset detected

Latest commit: af93188

The changes in this PR will be included in the next version bump.


@coderabbitai

coderabbitai bot commented Apr 14, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

📝 Walkthrough

Adds OpenCode as a new AI provider: provider implementation and export, provider registration, config getters and defaults, supported-models entries, CLI flag and task-manager integration, dependency addition, changeset, and unit tests.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| Provider Implementation & Export<br>`src/ai-providers/opencode.js`, `src/ai-providers/index.js` | New OpencodeProvider class (client creation, CLI availability check, model utilities); exported from the ai-providers index. |
| Provider Constants & Registry<br>`packages/tm-core/src/common/constants/providers.ts`, `scripts/modules/ai-services-unified.js` | Added `OPENCODE: 'opencode'` and registered `opencode` in the static PROVIDERS lookup. |
| Configuration Management<br>`scripts/modules/config-manager.js` | Added opencode defaults, merged parsed `config.opencode`, new getters `getOpencodeSettings()` / `getOpencodeSettingsForCommand()`, and treated OpenCode as not requiring an API key. |
| Model Definitions<br>`scripts/modules/supported-models.json` | Added a top-level `opencode` group with seven model entries (ids/names, roles, max_tokens, swe_score, zero-cost token entries). |
| CLI/Commands & Task Manager Integration<br>`scripts/modules/commands.js`, `scripts/modules/task-manager/models.js` | Added `--opencode` flag and validation; provider-hint selection and `setModel()` now handle opencode and look up curated models by provider. |
| Dependency & Release Metadata<br>`package.json`, `.changeset/add-opencode-provider.md` | Added dependency `ai-sdk-provider-opencode-sdk@^0.0.3` and a changeset documenting the provider addition (minor bump). |
| Tests<br>`tests/unit/ai-providers/opencode.test.js`, `tests/unit/ai-services-unified.test.js`, `tests/unit/config-manager.test.js` | New unit tests for OpencodeProvider; updated mocks and config fixtures to assert settings merging, client creation, API-key behavior, model matching, and ENOENT handling. |

Sequence Diagram(s)

```mermaid
sequenceDiagram
  participant User
  participant Provider as OpencodeProvider
  participant Config as ConfigManager
  participant SDK as ai-sdk-provider-opencode-sdk
  participant CLI as Opencode CLI/Server

  User->>Provider: request client / invoke model (commandName)
  Provider->>Config: getOpencodeSettingsForCommand(commandName)
  Config-->>Provider: merged opencode settings
  Provider->>SDK: createOpencode(settings)
  SDK->>CLI: connect (hostname/port) / route requests
  SDK-->>Provider: client instance
  Provider-->>User: return client or throw enhanced error (if CLI missing)
```

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Suggested reviewers

  • Crunchyman-ralph
  • eyaltoledano
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage ⚠️ Warning: docstring coverage is 50.00%, below the required 80.00% threshold. Resolution: write docstrings for the functions missing them.

✅ Passed checks (2 passed)

  • Title Check ✅ Passed: the title 'feat: add OpenCode as an AI provider' accurately and concisely describes the main change, the introduction of a new AI provider (OpenCode).
  • Description Check ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.



@Crunchyman-ralph Crunchyman-ralph left a comment


lgtm, just fix CI and let's set `main` as the base branch since `dev` is a work in progress at the moment


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
scripts/modules/config-manager.js (1)

887-948: ⚠️ Potential issue | 🟡 Minor

Add opencode to providers without API keys in isApiKeySet.

The isApiKeySet function has a providersWithoutApiKeys array (lines 891-898) that includes CODEX_CLI, GEMINI_CLI, GROK_CLI, etc. Since OpencodeProvider.isRequiredApiKey() returns false (OpenCode manages its own credentials), you should add CUSTOM_PROVIDERS.OPENCODE to this list for consistency.

Similarly, check if getMcpApiKeyStatus (around line 957) needs a case for 'opencode' to return true.

🛡️ Proposed fix

```diff
 const providersWithoutApiKeys = [
 	CUSTOM_PROVIDERS.OLLAMA,
 	CUSTOM_PROVIDERS.BEDROCK,
 	CUSTOM_PROVIDERS.GEMINI_CLI,
 	CUSTOM_PROVIDERS.GROK_CLI,
 	CUSTOM_PROVIDERS.MCP,
-	CUSTOM_PROVIDERS.CODEX_CLI
+	CUSTOM_PROVIDERS.CODEX_CLI,
+	CUSTOM_PROVIDERS.OPENCODE
 ];
```

And in the getMcpApiKeyStatus switch statement (around line 1029):

```diff
 			case 'codex-cli':
 				return true; // OAuth/subscription via Codex CLI
+			case 'opencode':
+				return true; // OpenCode manages its own credentials
 			case 'mistral':
```

As per coding guidelines: "For providers not requiring an API key (like Ollama), add a specific check at the beginning of isApiKeySet and getMcpApiKeyStatus to return true immediately for that provider."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/modules/config-manager.js` around lines 887 - 948, Add
CUSTOM_PROVIDERS.OPENCODE to the providersWithoutApiKeys list in isApiKeySet so
Opencode is treated as not requiring an API key
(OpencodeProvider.isRequiredApiKey() is false), and update getMcpApiKeyStatus to
include a case for 'opencode' returning true so the MCP-specific check also
treats Opencode as OK; modify the arrays/switch that reference provider names
(providersWithoutApiKeys in isApiKeySet and the switch in getMcpApiKeyStatus) to
handle the 'opencode' lowercase key consistently.
🧹 Nitpick comments (1)
src/ai-providers/opencode.js (1)

26-127: Align module shape with the provider-module contract used in this repo.

This file currently exports only a provider class. The guideline for src/ai-providers/*.js requires provider-specific generate<ProviderName>Text, stream<ProviderName>Text, and generate<ProviderName>Object wrappers. Please add wrapper exports (can delegate to this class) or explicitly reconcile this architecture in the rule set.

As per coding guidelines: src/ai-providers/*.js: “Create a new provider module in src/ai-providers/<provider-name>.js that implements generate<ProviderName>Text, stream<ProviderName>Text, and generate<ProviderName>Object functions using the Vercel AI SDK”.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/ai-providers/opencode.js` around lines 26 - 127, The module only exports
OpencodeProvider but must also expose provider-module contract functions; add
and export wrapper functions generateOpencodeText, streamOpencodeText, and
generateOpencodeObject that delegate to OpencodeProvider (or its getClient) so
the rest of the repo can call the Vercel-style helpers; locate the
OpencodeProvider class and its getClient method and implement wrappers that
create/obtain a client via getClient and call into the provider instance to
perform text generation, streaming, and object generation semantics, then export
these three functions alongside OpencodeProvider.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/ai-providers/opencode.js`:
- Around line 99-105: The current detection treats any error whose message
matches /opencode/i as "OpenCode not available", which can mislabel unrelated
init failures; update the conditional in the OpenCode init path (the variables
code, msg, error and the branch that creates enhancedError and calls
this.handleError) to use a stricter check: keep code === 'ENOENT' but replace
the loose /opencode/i test with a more specific condition such as checking
error.code === 'MODULE_NOT_FOUND' or testing that error.message contains both
"opencode" as a whole word and phrases like "not found" or "Cannot find module";
ensure you only create the enhancedError and call this.handleError('OpenCode
initialization', enhancedError) when the narrower detection confirms the CLI is
truly missing, otherwise let the original error propagate to the existing error
handling path.
- Around line 30-33: Guard against non-array returns from
getSupportedModelsForProvider('opencode') in the constructor by ensuring
this.supportedModels is always an array: check the result of
getSupportedModelsForProvider('opencode') and if it's not an Array assign an
empty array (or coerce via Array.isArray / Array.from fallback) before using
this.supportedModels.length and before any logic that assumes array methods;
update the constructor code where this.supportedModels is set so references to
supportedModels, the constructor itself, and getSupportedModelsForProvider are
handled safely.


ℹ️ Review info

📥 Commits

Reviewing files that changed from the base of the PR and between 487c3d3 and 20287e3.

⛔ Files ignored due to path filters (1)
  • package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (9)
  • .changeset/add-opencode-provider.md
  • package.json
  • packages/tm-core/src/common/constants/providers.ts
  • scripts/modules/ai-services-unified.js
  • scripts/modules/config-manager.js
  • scripts/modules/supported-models.json
  • src/ai-providers/index.js
  • src/ai-providers/opencode.js
  • tests/unit/ai-providers/opencode.test.js

@bjcoombs force-pushed the 1684-opencode-provider branch from 20287e3 to f555101 (April 14, 2026 14:32)

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
src/ai-providers/opencode.js (1)

99-105: ⚠️ Potential issue | 🟠 Major

Narrow missing-CLI detection to avoid masking real init failures.

The /opencode/i branch is too broad and can misclassify unrelated errors as “OpenCode not available.”

🔧 Suggested patch

```diff
-			if (code === 'ENOENT' || /opencode/i.test(msg)) {
+			const missingCli =
+				code === 'ENOENT' ||
+				/\b(opencode(\.exe)?\b).*(not found|is not recognized|cannot find module)/i.test(
+					msg
+				);
+			if (missingCli) {
 				const enhancedError = new Error(
 					`OpenCode not available. Install it from: https://opencode.ai - Original error: ${error.message}`
 				);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/ai-providers/opencode.js` around lines 99 - 105, The current conditional
in the OpenCode init branch (around the code that creates enhancedError and
calls this.handleError) uses /opencode/i which is too broad and may misclassify
unrelated errors; tighten the detection by only treating errors as "OpenCode not
available" when the error is an ENOENT OR when the error indicates the opencode
binary specifically (e.g., check error.path === 'opencode' or match
error.message against a stricter pattern like /\bopencode\b.*\bnot
found\b|\bcommand not found\b/i or inspect error.syscall === 'spawn' &&
error.path === 'opencode'); update the if condition that currently reads if
(code === 'ENOENT' || /opencode/i.test(msg)) to use one of these stricter checks
so only genuine missing-CLI cases create enhancedError and call
this.handleError('OpenCode initialization', enhancedError).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@scripts/modules/config-manager.js`:
- Around line 65-66: Update the key-checking logic so the new provider
"opencode" is treated as keyless: in isApiKeySet(...) add an early-return true
when provider === 'opencode' (like the existing Ollama check), and do the same
at the start of getMcpApiKeyStatus(...) to immediately return true for
'opencode'; also update any providersWithoutApiKeys list/logic that enumerates
keyless providers to include 'opencode' so status and setup flows no longer mark
it as misconfigured (refer to the isApiKeySet and getMcpApiKeyStatus function
names and the opencode entry in the config object).


ℹ️ Review info

📥 Commits

Reviewing files that changed from the base of the PR and between 20287e3 and f555101.

⛔ Files ignored due to path filters (1)
  • package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (9)
  • .changeset/add-opencode-provider.md
  • package.json
  • packages/tm-core/src/common/constants/providers.ts
  • scripts/modules/ai-services-unified.js
  • scripts/modules/config-manager.js
  • scripts/modules/supported-models.json
  • src/ai-providers/index.js
  • src/ai-providers/opencode.js
  • tests/unit/ai-providers/opencode.test.js
✅ Files skipped from review due to trivial changes (4)
  • src/ai-providers/index.js
  • .changeset/add-opencode-provider.md
  • packages/tm-core/src/common/constants/providers.ts
  • scripts/modules/supported-models.json
🚧 Files skipped from review as they are similar to previous changes (2)
  • tests/unit/ai-providers/opencode.test.js
  • package.json

@bjcoombs bjcoombs changed the base branch from next to main April 14, 2026 14:49
@bjcoombs bjcoombs dismissed Crunchyman-ralph’s stale review April 14, 2026 14:49

The base branch was changed.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
scripts/modules/config-manager.js (1)

959-972: ⚠️ Potential issue | 🟠 Major

Return keyless provider status before filesystem checks in getMcpApiKeyStatus.

getMcpApiKeyStatus('opencode', ...) can still return false early (Line 960-Line 972) when project root or .cursor/mcp.json is missing, because the keyless case 'opencode' is reached too late.

🔧 Proposed fix

```diff
 function getMcpApiKeyStatus(providerName, projectRoot = null) {
+	const provider = providerName?.toLowerCase();
+	if (
+		provider === CUSTOM_PROVIDERS.OLLAMA ||
+		provider === CUSTOM_PROVIDERS.CLAUDE_CODE ||
+		provider === CUSTOM_PROVIDERS.CODEX_CLI ||
+		provider === CUSTOM_PROVIDERS.GEMINI_CLI ||
+		provider === CUSTOM_PROVIDERS.GROK_CLI ||
+		provider === CUSTOM_PROVIDERS.MCP ||
+		provider === CUSTOM_PROVIDERS.BEDROCK ||
+		provider === CUSTOM_PROVIDERS.OPENCODE
+	) {
+		return true;
+	}
+
 	const rootDir = projectRoot || findProjectRoot(); // Use existing root finding
 	if (!rootDir) {
 		console.warn(
 			chalk.yellow('Warning: Could not find project root to check mcp.json.')
 		);
 		return false; // Cannot check without root
 	}
@@
-		switch (providerName) {
+		switch (provider) {
```

As per coding guidelines: “For providers not requiring an API key (like Ollama), add a specific check at the beginning of isApiKeySet and getMcpApiKeyStatus to return true immediately for that provider”.
As per coding guidelines: “For providers not requiring an API key (like Ollama), add a specific check at the beginning of isApiKeySet and getMcpApiKeyStatus to return true immediately for that provider”.

Also applies to: 988-1033

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/modules/config-manager.js` around lines 959 - 972, Add an early check
at the top of getMcpApiKeyStatus (and mirror in isApiKeySet) to immediately
return true for keyless providers (e.g., providerName === 'opencode' or other
keyless providers like 'ollama') before any project root or filesystem checks;
locate getMcpApiKeyStatus and isApiKeySet and insert the providerName ===
'opencode' condition as the first branch so the function does not proceed to
findProjectRoot() or check .cursor/mcp.json for these providers.
🧹 Nitpick comments (1)
scripts/modules/config-manager.js (1)

892-900: Avoid duplicating keyless-provider lists in two places.

providersWithoutApiKeys is defined both inside isApiKeySet and again as an exported constant. This will drift over time. Prefer one module-level source of truth reused by both functions.

Also applies to: 1270-1278

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/modules/config-manager.js` around lines 892 - 900, The file defines
providersWithoutApiKeys in two places; remove the duplicate local array inside
isApiKeySet and instead reference the single module-level exported constant
(providersWithoutApiKeys) so there is one source of truth; update isApiKeySet to
import/use that exported constant and likewise replace the duplicate at the
other location (the array used around lines 1270-1278) so both places reuse the
same module-level providersWithoutApiKeys identifier (verify any export remains
named providersWithoutApiKeys and adjust references accordingly).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/ai-providers/opencode.js`:
- Around line 131-134: The isModelSupported(modelId) check lowercases only the
input but not the entries in this.supportedModels, causing case-mismatch bugs;
fix by normalizing both sides—either precompute and store a normalized set/array
of supported model IDs (e.g., lowercased values) when initializing the provider,
or change isModelSupported to compare String(modelId).toLowerCase() against
this.supportedModels.map(s => String(s).toLowerCase()) (or a cached Set of
lowercased values) so comparisons are case-insensitive; update references to the
normalized collection accordingly and ensure null/undefined guarded behavior
remains.


ℹ️ Review info

📥 Commits

Reviewing files that changed from the base of the PR and between f555101 and 8134b82.

📒 Files selected for processing (4)
  • scripts/modules/config-manager.js
  • src/ai-providers/opencode.js
  • tests/unit/ai-services-unified.test.js
  • tests/unit/config-manager.test.js

@bjcoombs (author) commented

Thanks for the fast review @Crunchyman-ralph. Changes made since your approval:

Base branch changed to `main` as requested.

Test failures fixed: the `ai-services-unified.test.js` mock was missing `OpencodeProvider` (exports), and `config-manager.test.js` fixtures needed the new `opencode` section. Both resolved.

All 5 review threads resolved (cursor + CodeRabbit). Specifically:

  • OpenCode added to `providersWithoutApiKeys`, `isApiKeySet`, `getMcpApiKeyStatus`, and `hasCodebaseAnalysis` — matches the pattern for other CLI-backed providers
  • `supportedModels` now guards against non-array return values
  • CLI-missing error detection narrowed to avoid masking unrelated init failures
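The narrowed CLI-missing detection from the last bullet can be sketched as a small classifier; the exact regexes and field checks here are illustrative of the approach, not the shipped code.

```javascript
// Sketch: only classify an init error as "OpenCode CLI missing" when the
// evidence points at the opencode binary specifically, so unrelated SDK or
// config failures propagate unchanged instead of being relabelled.
function isMissingOpencodeCli(error) {
	if (!error) return false;
	// spawn-style ENOENT with the opencode binary as the failing path
	if (error.code === 'ENOENT' && error.path === 'opencode') return true;
	// shell-style "command not found" messages that name opencode explicitly
	const msg = String(error.message || '');
	return (
		/\bopencode\b/i.test(msg) &&
		/not found|command not found|cannot find module/i.test(msg)
	);
}
```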

Two additional fixes from end-to-end testing on my fork:

  1. `--opencode` CLI flag added. Was missing from `task-master models`. Now:
    ```bash
    task-master models --set-main anthropic/claude-sonnet-4-5 --opencode
    ```

  2. SDK is lazy-loaded inside `getClient()`. Context: `ai-sdk-provider-opencode-sdk@0.0.2` (the `ai-sdk-v5` tag line) is currently incompatible with zod 4 — it calls `z.function().args()` at module-load time, which throws `TypeError: z.function(...).args is not a function` in any project with zod 4 in its tree (which this project has, via `@ai-sdk/*`).

    Filed upstream: ben-vargas/ai-sdk-provider-opencode-sdk#17.

    The lazy-load ensures task-master users who never select OpenCode are not crashed at CLI startup. When the SDK is fixed upstream and we bump the pin, the feature becomes usable end-to-end. Until then, selecting OpenCode and invoking an AI call will surface a clear error pointing to the SDK bug.
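The lazy-load described above can be sketched generically; the cache shape and factory name are assumptions about the implementation, and the dynamic-import target shown in the comment is the real SDK package.

```javascript
// Generic sketch of the lazy-load pattern: the SDK module is only imported
// the first time a client is requested, so users who never select OpenCode
// never evaluate the (currently zod-4-incompatible) SDK at CLI startup.
function makeLazyClientFactory(loadSdk) {
	let sdkPromise = null; // cache so the import happens at most once
	return async function getClient(settings) {
		if (!sdkPromise) {
			sdkPromise = loadSdk(); // e.g. () => import('ai-sdk-provider-opencode-sdk')
		}
		const sdk = await sdkPromise;
		return sdk.createOpencode(settings);
	};
}
```

If the import rejects (as it would today under the zod-4 bug), the error only surfaces when an OpenCode call is actually made, matching the behaviour described above.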

Happy to hold merge until the SDK lands a fix if you'd prefer the feature to be functional on day one. Alternatively, merging now is also fine — the provider registration, tests, and UX are all in place, and a single version bump of the SDK will activate it.

Adds OpenCode as a new CLI-backed provider, mirroring the existing
claude-code, codex-cli, and gemini-cli integrations. Task Master
delegates authentication, model routing, and provider selection to a
running OpenCode server via ai-sdk-provider-opencode-sdk, staying
agnostic about the underlying LLM.

Unblocks users whose organisations permit OpenCode (including its
GitHub Copilot backend) but prohibit direct LLM API credentials - the
original ask in eyaltoledano#965 that was closed against the rules-profile work in
eyaltoledano#970 rather than the provider request itself.

- Add OpencodeProvider wrapping ai-sdk-provider-opencode-sdk
- Register in provider index, ai-services-unified, and CUSTOM_PROVIDERS
- Seed supported-models.json with representative OpenCode model IDs
  (provider/model format) across Anthropic, OpenAI, Copilot, Gemini
- Add opencode settings section to config-manager with command-specific
  override support, matching the pattern used by other CLI providers
- Unit tests mirroring codex-cli.test.js coverage
- Include OPENCODE in providersWithoutApiKeys, isApiKeySet check,
  getMcpApiKeyStatus switch, and hasCodebaseAnalysis - OpenCode is a
  keyless CLI-backed provider and should match the pattern used by
  claude-code, gemini-cli, grok-cli, and codex-cli
- Guard OpencodeProvider.supportedModels against non-array config
  return values
- Narrow CLI-missing error detection so real SDK/config errors aren't
  relabelled as install issues
- Add OpencodeProvider mock to ai-services-unified test to restore
  test suite green
- Add opencode section to config-manager test fixture
- Add --opencode flag to `task-master models` and wire into provider-hint
  cascades for --set-main, --set-research, --set-fallback
- Lazy-load ai-sdk-provider-opencode-sdk inside OpencodeProvider.getClient()
  so module-load-time failures in the SDK (e.g., the zod 4 incompatibility
  tracked at ben-vargas/ai-sdk-provider-opencode-sdk#17) do not crash
  task-master for users who never select opencode
- Update opencode.test.js to reflect async getClient signature
… hint

- ben-vargas/ai-sdk-provider-opencode-sdk#18 released as v0.0.3 fixes
  the zod 4 incompat. Bump the dep so OpenCode selection actually works
  end-to-end
- Add OPENCODE case to setModel's providerHint cascade in
  scripts/modules/task-manager/models.js so --opencode is accepted
@bjcoombs bjcoombs force-pushed the 1684-opencode-provider branch from e7eaaa3 to c17fff6 Compare April 14, 2026 15:34
- getClient is synchronous again, matching the contract of other
  providers (ClaudeCodeProvider, CodexCliProvider). The lazy-load
  workaround for the zod 4 SDK bug is no longer needed now that
  ai-sdk-provider-opencode-sdk v0.0.3 is pinned (cursor, HIGH).
- isModelSupported normalises both sides of the comparison so
  configured model IDs with mixed case are matched correctly
  (coderabbitai).
- getMcpApiKeyStatus short-circuits for keyless providers (opencode,
  ollama, bedrock, claude-code, codex-cli, gemini-cli, grok-cli, mcp)
  before touching the filesystem - matches the guideline that these
  should not depend on .cursor/mcp.json existence (coderabbitai).
- Run biome format to satisfy CI format check.
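Two of those fixes can be sketched roughly as below. The names and data are illustrative only, not the actual exports — the real implementations live in scripts/modules/config-manager.js and src/ai-providers/opencode.js.

```javascript
// Keyless providers resolved without touching the filesystem; this list
// mirrors the commit message above but is illustrative, not the real export.
const KEYLESS_PROVIDERS = [
	'opencode', 'ollama', 'bedrock', 'claude-code',
	'codex-cli', 'gemini-cli', 'grok-cli', 'mcp'
];

// Short-circuit sketch: keyless providers report "key OK" before any
// .cursor/mcp.json read is attempted.
function getMcpApiKeyStatusSketch(providerName, readMcpConfig) {
	if (KEYLESS_PROVIDERS.includes(providerName)) return true;
	const config = readMcpConfig(); // filesystem only reached for keyed providers
	return Boolean(config && config.env);
}

// Case-insensitive model matching sketch, with the non-array guard from the
// earlier review round folded in.
function isModelSupportedSketch(modelId, supportedModels) {
	if (!Array.isArray(supportedModels)) return false;
	const needle = String(modelId).toLowerCase();
	return supportedModels.some((m) => String(m.id).toLowerCase() === needle);
}
```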
Contributor

@coderabbitai coderabbitai bot left a comment


Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
scripts/modules/commands.js (1)

3693-3703: ⚠️ Potential issue | 🟠 Major

Provider-flag resolution is still incomplete (--azure and --vertex are omitted).

--azure/--vertex are valid options, but they’re excluded from both conflict counting and providerHint resolution. This can accept conflicting flags and also fail to pass the intended provider hint to setModel().

💡 Suggested fix
-			const providerFlags = [
-				options.openrouter,
-				options.ollama,
-				options.bedrock,
-				options.claudeCode,
-				options.geminiCli,
-				options.codexCli,
-				options.opencode,
-				options.lmstudio,
-				options.openaiCompatible
-			].filter(Boolean).length;
-			if (providerFlags > 1) {
+			const selectedProviderHints = [
+				['openrouter', options.openrouter],
+				['ollama', options.ollama],
+				['bedrock', options.bedrock],
+				['claude-code', options.claudeCode],
+				['azure', options.azure],
+				['vertex', options.vertex],
+				['gemini-cli', options.geminiCli],
+				['codex-cli', options.codexCli],
+				['opencode', options.opencode],
+				['lmstudio', options.lmstudio],
+				['openai-compatible', options.openaiCompatible]
+			]
+				.filter(([, enabled]) => enabled)
+				.map(([hint]) => hint);
+
+			if (selectedProviderHints.length > 1) {
 				console.error(
 					chalk.red(
-						'Error: Cannot use multiple provider flags (--openrouter, --ollama, --bedrock, --claude-code, --gemini-cli, --codex-cli, --opencode, --lmstudio, --openai-compatible) simultaneously.'
+						'Error: Cannot use multiple provider flags (--openrouter, --ollama, --bedrock, --claude-code, --azure, --vertex, --gemini-cli, --codex-cli, --opencode, --lmstudio, --openai-compatible) simultaneously.'
 					)
 				);
 				process.exit(1);
 			}
+			const providerHint = selectedProviderHints[0];
@@
-						providerHint: options.openrouter
-							? 'openrouter'
-							: options.ollama
-								? 'ollama'
-								: options.bedrock
-									? 'bedrock'
-									: options.claudeCode
-										? 'claude-code'
-										: options.geminiCli
-											? 'gemini-cli'
-											: options.codexCli
-												? 'codex-cli'
-												: options.opencode
-													? 'opencode'
-													: options.lmstudio
-													? 'lmstudio'
-													: options.openaiCompatible
-														? 'openai-compatible'
-														: undefined,
+						providerHint,
@@
-						providerHint: options.openrouter
-							? 'openrouter'
-							: options.ollama
-								? 'ollama'
-								: options.bedrock
-									? 'bedrock'
-									: options.claudeCode
-										? 'claude-code'
-										: options.geminiCli
-											? 'gemini-cli'
-											: options.codexCli
-												? 'codex-cli'
-												: options.opencode
-													? 'opencode'
-													: options.lmstudio
-													? 'lmstudio'
-													: options.openaiCompatible
-														? 'openai-compatible'
-														: undefined,
+						providerHint,
@@
-						providerHint: options.openrouter
-							? 'openrouter'
-							: options.ollama
-								? 'ollama'
-								: options.bedrock
-									? 'bedrock'
-									: options.claudeCode
-										? 'claude-code'
-										: options.geminiCli
-											? 'gemini-cli'
-											: options.codexCli
-												? 'codex-cli'
-												: options.opencode
-													? 'opencode'
-													: options.lmstudio
-													? 'lmstudio'
-													: options.openaiCompatible
-														? 'openai-compatible'
-														: undefined,
+						providerHint,

Also applies to: 3743-3761, 3778-3797, 3815-3833

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/modules/commands.js` around lines 3693 - 3703, The provider-flag
counting and provider hint logic omit --azure and --vertex, causing incorrect
conflict detection and wrong providerHint passed into setModel(); update every
providerFlags (and related providerHint resolution) array that currently lists
options.openrouter, options.ollama, options.bedrock, options.claudeCode,
options.geminiCli, options.codexCli, options.opencode, options.lmstudio,
options.openaiCompatible to also include options.azure and options.vertex so the
conflict count and the computed providerHint include those flags and ensure the
providerHint passed into setModel() reflects azure/vertex when specified.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: a385f24b-771d-4f25-842f-cf1554b02dbd

📥 Commits

Reviewing files that changed from the base of the PR and between 8134b82 and c17fff6.

⛔ Files ignored due to path filters (1)
  • package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (13)
  • .changeset/add-opencode-provider.md
  • package.json
  • packages/tm-core/src/common/constants/providers.ts
  • scripts/modules/ai-services-unified.js
  • scripts/modules/commands.js
  • scripts/modules/config-manager.js
  • scripts/modules/supported-models.json
  • scripts/modules/task-manager/models.js
  • src/ai-providers/index.js
  • src/ai-providers/opencode.js
  • tests/unit/ai-providers/opencode.test.js
  • tests/unit/ai-services-unified.test.js
  • tests/unit/config-manager.test.js
✅ Files skipped from review due to trivial changes (4)
  • src/ai-providers/index.js
  • package.json
  • packages/tm-core/src/common/constants/providers.ts
  • .changeset/add-opencode-provider.md
🚧 Files skipped from review as they are similar to previous changes (5)
  • scripts/modules/ai-services-unified.js
  • tests/unit/config-manager.test.js
  • tests/unit/ai-providers/opencode.test.js
  • scripts/modules/supported-models.json
  • scripts/modules/config-manager.js


@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.


Reviewed by Cursor Bugbot for commit abddd71.

Comment thread scripts/modules/config-manager.js Outdated
Contributor

@coderabbitai coderabbitai bot left a comment


♻️ Duplicate comments (1)
src/ai-providers/opencode.js (1)

102-107: ⚠️ Potential issue | 🟠 Major

Tighten missing-CLI detection to avoid masking real initialization errors.

Line 104 still treats any "command failed: opencode" as “CLI missing,” which can hide real runtime/config failures when the CLI is installed but exits non-zero.

Proposed fix
-			const missingCli =
-				code === 'ENOENT' ||
-				/spawn opencode ENOENT|opencode: command not found|command failed: opencode/i.test(
-					msg
-				);
+			const missingCli =
+				code === 'ENOENT' ||
+				/\bspawn\s+opencode\s+ENOENT\b/i.test(msg) ||
+				/\bopencode(?:\.exe)?:\s*(?:command not found|not found|is not recognized)\b/i.test(
+					msg
+				);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/ai-providers/opencode.js` around lines 102 - 107, The current missingCli
detection (variable missingCli checking code and msg) is too broad because it
treats any "command failed: opencode" as a missing-CLI case; update the check to
only detect true "not installed" signals: keep code === 'ENOENT', match explicit
patterns like /\b(opencode: command not found|spawn opencode ENOENT)\b/i and/or
check for exit code 127 if available, and remove the generic /command failed:
opencode/i alternative so real initialization errors from a present-but-failing
opencode binary are not masked by the missingCli branch.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 781a0826-41af-4189-a9cd-05e1e7ce5c67

📥 Commits

Reviewing files that changed from the base of the PR and between c17fff6 and abddd71.

📒 Files selected for processing (4)
  • scripts/modules/commands.js
  • scripts/modules/config-manager.js
  • src/ai-providers/opencode.js
  • tests/unit/ai-providers/opencode.test.js
✅ Files skipped from review due to trivial changes (2)
  • scripts/modules/commands.js
  • tests/unit/ai-providers/opencode.test.js
🚧 Files skipped from review as they are similar to previous changes (1)
  • scripts/modules/config-manager.js

The early short-circuit block added in this PR already returns true
for CUSTOM_PROVIDERS.OPENCODE before the switch is reached, making
the case dead code (cursor).
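The dead-code shape being flagged looks roughly like this (illustrative only, not the actual config-manager code):

```javascript
// Once the early keyless short-circuit returns, a later switch case for the
// same provider can never execute — the `case 'opencode'` below is dead code.
function apiKeyStatusWithDeadCase(provider) {
	const KEYLESS = ['opencode', 'claude-code', 'codex-cli', 'gemini-cli'];
	if (KEYLESS.includes(provider)) return true; // short-circuit added in this PR

	switch (provider) {
		case 'opencode': // unreachable: already handled by the early return
			return true;
		default:
			return false;
	}
}
```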
…ng doc

Replaces three invalid/unreachable model IDs in the seeded OpenCode
supported-models list:
- openai/gpt-5.2 -> github-copilot/gpt-5.2 (the openai/* prefix needs
  direct OpenAI auth which most OpenCode users don't set up; the
  Copilot path is the common one)
- openai/gpt-5.2-codex -> github-copilot/gpt-5.2-codex (same reason)
- github-copilot/gpt-5 -> github-copilot/gpt-4.1 (gpt-5 isn't a real
  Copilot model name; the family is gpt-5.2 etc. Using gpt-4.1 here
  as the proven-working default; gpt-5.2 is seeded separately.)
- google/gemini-2.5-pro -> github-copilot/gemini-2.5-pro (no bare
  google/ path in default OpenCode model registry without direct
  Google auth)

Also add a troubleshooting JSDoc block to the provider explaining the
two common failure modes ("Unexpected end of JSON input" on invalid
model IDs; "terminated" on slow reasoning models) and how to diagnose
them with `opencode models` and the `serverTimeout` setting.
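As a sketch, those two failure modes and their remedies could be captured as a lookup — an illustrative restructuring of that JSDoc block, not the actual comment in src/ai-providers/opencode.js, which is prose:

```javascript
// Illustrative map of the two common OpenCode failure modes described in the
// troubleshooting JSDoc; keys are the error texts users actually see.
const OPENCODE_TROUBLESHOOTING = {
	'Unexpected end of JSON input':
		'Usually an invalid model ID — run `opencode models` and confirm the ' +
		'configured provider/model pair exists.',
	terminated:
		'The request outlived the server timeout, common with slow reasoning ' +
		'models — raise the `serverTimeout` setting.'
};

// Match a raw error message to a troubleshooting hint, if any.
function troubleshootOpencodeError(message) {
	const key = Object.keys(OPENCODE_TROUBLESHOOTING).find((k) =>
		message.includes(k)
	);
	return key ? OPENCODE_TROUBLESHOOTING[key] : null;
}
```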