feat: add OpenCode as an AI provider #1685
bjcoombs wants to merge 7 commits into eyaltoledano:main
Conversation
🦋 Changeset detected
Latest commit: af93188
The changes in this PR will be included in the next version bump.
Note: Reviews paused
This branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review.
📝 Walkthrough
Adds OpenCode as a new AI provider: provider implementation and export, provider registration, config getters and defaults, supported-models entries, CLI flag and task-manager integration, dependency addition, changeset, and unit tests.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant Provider as OpencodeProvider
    participant Config as ConfigManager
    participant SDK as ai-sdk-provider-opencode-sdk
    participant CLI as Opencode CLI/Server
    User->>Provider: request client / invoke model (commandName)
    Provider->>Config: getOpencodeSettingsForCommand(commandName)
    Config-->>Provider: merged opencode settings
    Provider->>SDK: createOpencode(settings)
    SDK->>CLI: connect (hostname/port) / route requests
    SDK-->>Provider: client instance
    Provider-->>User: return client or throw enhanced error (if CLI missing)
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Crunchyman-ralph left a comment:
lgtm, just fix CI and let's set main as the base branch since dev is a work in progress at the moment
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
scripts/modules/config-manager.js (1)
887-948: ⚠️ Potential issue | 🟡 Minor
Add `opencode` to providers without API keys in `isApiKeySet`.

The `isApiKeySet` function has a `providersWithoutApiKeys` array (lines 891-898) that includes `CODEX_CLI`, `GEMINI_CLI`, `GROK_CLI`, etc. Since `OpencodeProvider.isRequiredApiKey()` returns `false` (OpenCode manages its own credentials), you should add `CUSTOM_PROVIDERS.OPENCODE` to this list for consistency. Similarly, check whether `getMcpApiKeyStatus` (around line 957) needs a case for `'opencode'` that returns `true`.

🛡️ Proposed fix

```diff
 const providersWithoutApiKeys = [
 	CUSTOM_PROVIDERS.OLLAMA,
 	CUSTOM_PROVIDERS.BEDROCK,
 	CUSTOM_PROVIDERS.GEMINI_CLI,
 	CUSTOM_PROVIDERS.GROK_CLI,
 	CUSTOM_PROVIDERS.MCP,
-	CUSTOM_PROVIDERS.CODEX_CLI
+	CUSTOM_PROVIDERS.CODEX_CLI,
+	CUSTOM_PROVIDERS.OPENCODE
 ];
```

And in the `getMcpApiKeyStatus` switch statement (around line 1029):

```diff
 case 'codex-cli':
 	return true; // OAuth/subscription via Codex CLI
+case 'opencode':
+	return true; // OpenCode manages its own credentials
 case 'mistral':
```

As per coding guidelines: "For providers not requiring an API key (like Ollama), add a specific check at the beginning of `isApiKeySet` and `getMcpApiKeyStatus` to return `true` immediately for that provider."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/modules/config-manager.js` around lines 887 - 948, Add CUSTOM_PROVIDERS.OPENCODE to the providersWithoutApiKeys list in isApiKeySet so Opencode is treated as not requiring an API key (OpencodeProvider.isRequiredApiKey() is false), and update getMcpApiKeyStatus to include a case for 'opencode' returning true so the MCP-specific check also treats Opencode as OK; modify the arrays/switch that reference provider names (providersWithoutApiKeys in isApiKeySet and the switch in getMcpApiKeyStatus) to handle the 'opencode' lowercase key consistently.
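The keyless-provider pattern the guideline describes can be sketched as a self-contained toy. The `CUSTOM_PROVIDERS` values, the function signature, and the env-var naming below are illustrative assumptions, not the repo's actual code:

```javascript
// Toy sketch of the keyless-provider early return (illustrative names only).
const CUSTOM_PROVIDERS = {
	OLLAMA: 'ollama',
	CODEX_CLI: 'codex-cli',
	OPENCODE: 'opencode'
};

// Single source of truth for providers that never need an API key.
const providersWithoutApiKeys = [
	CUSTOM_PROVIDERS.OLLAMA,
	CUSTOM_PROVIDERS.CODEX_CLI,
	CUSTOM_PROVIDERS.OPENCODE
];

function isApiKeySet(providerName, env = process.env) {
	const provider = providerName?.toLowerCase();
	if (!provider) return false;
	// Keyless providers short-circuit before any env or filesystem checks.
	if (providersWithoutApiKeys.includes(provider)) return true;
	// Hypothetical lookup for key-bearing providers.
	const key = env[`${provider.toUpperCase().replace(/-/g, '_')}_API_KEY`];
	return Boolean(key && key.trim());
}

console.log(isApiKeySet('opencode', {})); // true: no key needed
console.log(isApiKeySet('anthropic', {})); // false: no key configured
console.log(isApiKeySet('anthropic', { ANTHROPIC_API_KEY: 'sk-test' })); // true
```

Keeping the list at module level means `getMcpApiKeyStatus` can reuse the same constant for its own early return, so the two checks cannot drift apart.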
🧹 Nitpick comments (1)
src/ai-providers/opencode.js (1)
26-127: Align module shape with the provider-module contract used in this repo.

This file currently exports only a provider class. The guideline for `src/ai-providers/*.js` requires provider-specific `generate<ProviderName>Text`, `stream<ProviderName>Text`, and `generate<ProviderName>Object` wrappers. Please add wrapper exports (they can delegate to this class) or explicitly reconcile this architecture in the rule set.

As per coding guidelines, `src/ai-providers/*.js`: "Create a new provider module in `src/ai-providers/<provider-name>.js` that implements `generate<ProviderName>Text`, `stream<ProviderName>Text`, and `generate<ProviderName>Object` functions using the Vercel AI SDK".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/ai-providers/opencode.js` around lines 26 - 127, The module only exports OpencodeProvider but must also expose provider-module contract functions; add and export wrapper functions generateOpencodeText, streamOpencodeText, and generateOpencodeObject that delegate to OpencodeProvider (or its getClient) so the rest of the repo can call the Vercel-style helpers; locate the OpencodeProvider class and its getClient method and implement wrappers that create/obtain a client via getClient and call into the provider instance to perform text generation, streaming, and object generation semantics, then export these three functions alongside OpencodeProvider.
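The wrapper contract the comment asks for can be sketched like this. The provider internals and parameter shapes are stand-ins, and the wrappers are synchronous here only to keep the sketch runnable (the real ones would be async and call the Vercel AI SDK); only the `generate<ProviderName>Text` / `stream<ProviderName>Text` / `generate<ProviderName>Object` naming comes from the guideline:

```javascript
// Stand-in provider class; the real one wraps ai-sdk-provider-opencode-sdk.
class OpencodeProvider {
	generateText({ modelId, prompt }) {
		return { text: `[${modelId}] ${prompt}` };
	}
	streamText({ modelId }) {
		return { modelId, chunks: [] };
	}
	generateObject({ modelId }) {
		return { modelId, object: {} };
	}
}

// Provider-module contract: thin named wrappers that delegate to one shared
// instance, so callers never construct the class themselves.
const provider = new OpencodeProvider();

function generateOpencodeText(params) {
	return provider.generateText(params);
}
function streamOpencodeText(params) {
	return provider.streamText(params);
}
function generateOpencodeObject(params) {
	return provider.generateObject(params);
}

console.log(generateOpencodeText({ modelId: 'opencode/gpt-4.1', prompt: 'hi' }).text);
// [opencode/gpt-4.1] hi
```

In the real module these wrappers would be exported alongside `OpencodeProvider`, which satisfies both the class-based registration and the function-based contract.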
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/ai-providers/opencode.js`:
- Around line 99-105: The current detection treats any error whose message
matches /opencode/i as "OpenCode not available", which can mislabel unrelated
init failures; update the conditional in the OpenCode init path (the variables
code, msg, error and the branch that creates enhancedError and calls
this.handleError) to use a stricter check: keep code === 'ENOENT' but replace
the loose /opencode/i test with a more specific condition such as checking
error.code === 'MODULE_NOT_FOUND' or testing that error.message contains both
"opencode" as a whole word and phrases like "not found" or "Cannot find module";
ensure you only create the enhancedError and call this.handleError('OpenCode
initialization', enhancedError) when the narrower detection confirms the CLI is
truly missing, otherwise let the original error propagate to the existing error
handling path.
- Around line 30-33: Guard against non-array returns from
getSupportedModelsForProvider('opencode') in the constructor by ensuring
this.supportedModels is always an array: check the result of
getSupportedModelsForProvider('opencode') and if it's not an Array assign an
empty array (or coerce via Array.isArray / Array.from fallback) before using
this.supportedModels.length and before any logic that assumes array methods;
update the constructor code where this.supportedModels is set so references to
supportedModels, the constructor itself, and getSupportedModelsForProvider are
handled safely.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 60207bc1-8fba-4fac-880a-a0c441a2feaa
⛔ Files ignored due to path filters (1)
`package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (9)
- .changeset/add-opencode-provider.md
- package.json
- packages/tm-core/src/common/constants/providers.ts
- scripts/modules/ai-services-unified.js
- scripts/modules/config-manager.js
- scripts/modules/supported-models.json
- src/ai-providers/index.js
- src/ai-providers/opencode.js
- tests/unit/ai-providers/opencode.test.js
Force-pushed 20287e3 to f555101.
Actionable comments posted: 1
♻️ Duplicate comments (1)
src/ai-providers/opencode.js (1)
99-105: ⚠️ Potential issue | 🟠 Major
Narrow missing-CLI detection to avoid masking real init failures.

The `/opencode/i` branch is too broad and can misclassify unrelated errors as "OpenCode not available."

🔧 Suggested patch

```diff
-		if (code === 'ENOENT' || /opencode/i.test(msg)) {
+		const missingCli =
+			code === 'ENOENT' ||
+			/\b(opencode(\.exe)?\b).*(not found|is not recognized|cannot find module)/i.test(
+				msg
+			);
+		if (missingCli) {
 			const enhancedError = new Error(
 				`OpenCode not available. Install it from: https://opencode.ai - Original error: ${error.message}`
 			);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/ai-providers/opencode.js` around lines 99 - 105, The current conditional in the OpenCode init branch (around the code that creates enhancedError and calls this.handleError) uses /opencode/i which is too broad and may misclassify unrelated errors; tighten the detection by only treating errors as "OpenCode not available" when the error is an ENOENT OR when the error indicates the opencode binary specifically (e.g., check error.path === 'opencode' or match error.message against a stricter pattern like /\bopencode\b.*\bnot found\b|\bcommand not found\b/i or inspect error.syscall === 'spawn' && error.path === 'opencode'); update the if condition that currently reads if (code === 'ENOENT' || /opencode/i.test(msg)) to use one of these stricter checks so only genuine missing-CLI cases create enhancedError and call this.handleError('OpenCode initialization', enhancedError).
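As a runnable illustration of the narrower classification, extracted into a standalone helper. The regex patterns follow the spirit of the suggested patch but are assumptions about the exact error strings the CLI and OS produce:

```javascript
// Classify whether an init error really means the opencode binary is missing.
function isMissingCliError(error) {
	const code = error?.code;
	const msg = error?.message ?? '';
	return (
		code === 'ENOENT' ||
		/\bspawn\s+opencode\s+ENOENT\b/i.test(msg) ||
		/\bopencode(?:\.exe)?:\s*(?:command not found|not found|is not recognized)\b/i.test(
			msg
		)
	);
}

// Genuine missing-binary shapes are flagged...
console.log(isMissingCliError({ code: 'ENOENT', message: '' })); // true
console.log(isMissingCliError(new Error('opencode: command not found'))); // true
// ...while a present-but-failing CLI is not, so the original error propagates.
console.log(isMissingCliError(new Error('opencode exited with invalid config'))); // false
```

Keeping the check in a pure helper like this also makes the distinction unit-testable without spawning a process.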
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@scripts/modules/config-manager.js`:
- Around line 65-66: Update the key-checking logic so the new provider
"opencode" is treated as keyless: in isApiKeySet(...) add an early-return true
when provider === 'opencode' (like the existing Ollama check), and do the same
at the start of getMcpApiKeyStatus(...) to immediately return true for
'opencode'; also update any providersWithoutApiKeys list/logic that enumerates
keyless providers to include 'opencode' so status and setup flows no longer mark
it as misconfigured (refer to the isApiKeySet and getMcpApiKeyStatus function
names and the opencode entry in the config object).
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 769b2274-b2d0-410e-965e-5878f69f0f2f
⛔ Files ignored due to path filters (1)
`package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (9)
- .changeset/add-opencode-provider.md
- package.json
- packages/tm-core/src/common/constants/providers.ts
- scripts/modules/ai-services-unified.js
- scripts/modules/config-manager.js
- scripts/modules/supported-models.json
- src/ai-providers/index.js
- src/ai-providers/opencode.js
- tests/unit/ai-providers/opencode.test.js
✅ Files skipped from review due to trivial changes (4)
- src/ai-providers/index.js
- .changeset/add-opencode-provider.md
- packages/tm-core/src/common/constants/providers.ts
- scripts/modules/supported-models.json
🚧 Files skipped from review as they are similar to previous changes (2)
- tests/unit/ai-providers/opencode.test.js
- package.json
The base branch was changed.
Actionable comments posted: 1
♻️ Duplicate comments (1)
scripts/modules/config-manager.js (1)
959-972: ⚠️ Potential issue | 🟠 Major
Return keyless provider status before filesystem checks in `getMcpApiKeyStatus`.

`getMcpApiKeyStatus('opencode', ...)` can still return `false` early (lines 960-972) when the project root or `.cursor/mcp.json` is missing, because the keyless `case 'opencode'` is reached too late.

🔧 Proposed fix

```diff
 function getMcpApiKeyStatus(providerName, projectRoot = null) {
+	const provider = providerName?.toLowerCase();
+	if (
+		provider === CUSTOM_PROVIDERS.OLLAMA ||
+		provider === CUSTOM_PROVIDERS.CLAUDE_CODE ||
+		provider === CUSTOM_PROVIDERS.CODEX_CLI ||
+		provider === CUSTOM_PROVIDERS.GEMINI_CLI ||
+		provider === CUSTOM_PROVIDERS.GROK_CLI ||
+		provider === CUSTOM_PROVIDERS.MCP ||
+		provider === CUSTOM_PROVIDERS.BEDROCK ||
+		provider === CUSTOM_PROVIDERS.OPENCODE
+	) {
+		return true;
+	}
+
 	const rootDir = projectRoot || findProjectRoot(); // Use existing root finding
 	if (!rootDir) {
 		console.warn(
 			chalk.yellow('Warning: Could not find project root to check mcp.json.')
 		);
 		return false; // Cannot check without root
 	}
@@
-	switch (providerName) {
+	switch (provider) {
```

As per coding guidelines: "For providers not requiring an API key (like Ollama), add a specific check at the beginning of `isApiKeySet` and `getMcpApiKeyStatus` to return `true` immediately for that provider."

Also applies to: 988-1033
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/modules/config-manager.js` around lines 959 - 972, Add an early check at the top of getMcpApiKeyStatus (and mirror in isApiKeySet) to immediately return true for keyless providers (e.g., providerName === 'opencode' or other keyless providers like 'ollama') before any project root or filesystem checks; locate getMcpApiKeyStatus and isApiKeySet and insert the providerName === 'opencode' condition as the first branch so the function does not proceed to findProjectRoot() or check .cursor/mcp.json for these providers.
🧹 Nitpick comments (1)
scripts/modules/config-manager.js (1)
892-900: Avoid duplicating keyless-provider lists in two places.

`providersWithoutApiKeys` is defined both inside `isApiKeySet` and again as an exported constant. The two copies will drift over time; prefer one module-level source of truth reused by both functions.

Also applies to: 1270-1278
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/modules/config-manager.js` around lines 892 - 900, The file defines providersWithoutApiKeys in two places; remove the duplicate local array inside isApiKeySet and instead reference the single module-level exported constant (providersWithoutApiKeys) so there is one source of truth; update isApiKeySet to import/use that exported constant and likewise replace the duplicate at the other location (the array used around lines 1270-1278) so both places reuse the same module-level providersWithoutApiKeys identifier (verify any export remains named providersWithoutApiKeys and adjust references accordingly).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/ai-providers/opencode.js`:
- Around line 131-134: The isModelSupported(modelId) check lowercases only the
input but not the entries in this.supportedModels, causing case-mismatch bugs;
fix by normalizing both sides—either precompute and store a normalized set/array
of supported model IDs (e.g., lowercased values) when initializing the provider,
or change isModelSupported to compare String(modelId).toLowerCase() against
this.supportedModels.map(s => String(s).toLowerCase()) (or a cached Set of
lowercased values) so comparisons are case-insensitive; update references to the
normalized collection accordingly and ensure null/undefined guarded behavior
remains.
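A minimal sketch of the normalised lookup the comment describes, with the non-array guard folded in. The class name and seeded model IDs are illustrative, not the repo's actual list:

```javascript
// Cache a lowercased Set at construction so isModelSupported is O(1) and
// case-insensitive, and guard against non-array config return values.
class OpencodeModels {
	constructor(supportedModels) {
		const models = Array.isArray(supportedModels) ? supportedModels : [];
		this.supportedModelSet = new Set(
			models.map((m) => String(m).toLowerCase())
		);
	}

	isModelSupported(modelId) {
		if (modelId == null) return false;
		return this.supportedModelSet.has(String(modelId).toLowerCase());
	}
}

const models = new OpencodeModels([
	'Anthropic/Claude-Sonnet-4',
	'github-copilot/gpt-4.1'
]);
console.log(models.isModelSupported('anthropic/claude-sonnet-4')); // true
console.log(models.isModelSupported('GitHub-Copilot/GPT-4.1')); // true
console.log(models.isModelSupported(undefined)); // false
console.log(new OpencodeModels(null).isModelSupported('anything')); // false
```

Normalising once in the constructor rather than per call keeps every lookup cheap and makes the comparison direction-agnostic.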
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 6424ef73-7c26-4c93-a65f-051d6acc6997
📒 Files selected for processing (4)
- scripts/modules/config-manager.js
- src/ai-providers/opencode.js
- tests/unit/ai-services-unified.test.js
- tests/unit/config-manager.test.js
Force-pushed 4c4307d to 356b1f6.
Thanks for the fast review @Crunchyman-ralph. Changes made since your approval: Base branch changed to `main` as requested. Test failures fixed: the `ai-services-unified.test.js` mock was missing `OpencodeProvider` (exports), and `config-manager.test.js` fixtures needed the new `opencode` section. Both resolved. All 5 review threads resolved (cursor + CodeRabbit). Specifically:
Two additional fixes from end-to-end testing on my fork:
Happy to hold merge until the SDK lands a fix if you'd prefer the feature to be functional on day one. Alternatively, merging now is also fine: the provider registration, tests, and UX are all in place, and a single version bump of the SDK will activate it.
Adds OpenCode as a new CLI-backed provider, mirroring the existing claude-code, codex-cli, and gemini-cli integrations. Task Master delegates authentication, model routing, and provider selection to a running OpenCode server via ai-sdk-provider-opencode-sdk, staying agnostic about the underlying LLM.

Unblocks users whose organisations permit OpenCode (including its GitHub Copilot backend) but prohibit direct LLM API credentials - the original ask in eyaltoledano#965, which was closed against the rules-profile work in eyaltoledano#970 rather than the provider request itself.

- Add OpencodeProvider wrapping ai-sdk-provider-opencode-sdk
- Register in provider index, ai-services-unified, and CUSTOM_PROVIDERS
- Seed supported-models.json with representative OpenCode model IDs (provider/model format) across Anthropic, OpenAI, Copilot, Gemini
- Add opencode settings section to config-manager with command-specific override support, matching the pattern used by other CLI providers
- Unit tests mirroring codex-cli.test.js coverage
- Include OPENCODE in providersWithoutApiKeys, the isApiKeySet check, the getMcpApiKeyStatus switch, and hasCodebaseAnalysis. OpenCode is a keyless CLI-backed provider and should match the pattern used by claude-code, gemini-cli, grok-cli, and codex-cli
- Guard OpencodeProvider.supportedModels against non-array config return values
- Narrow CLI-missing error detection so real SDK/config errors aren't relabelled as install issues
- Add OpencodeProvider mock to the ai-services-unified test to restore the test suite to green
- Add opencode section to the config-manager test fixture
- Add --opencode flag to `task-master models` and wire it into the provider-hint cascades for --set-main, --set-research, --set-fallback
- Lazy-load ai-sdk-provider-opencode-sdk inside OpencodeProvider.getClient() so module-load-time failures in the SDK (e.g., the zod 4 incompatibility tracked at ben-vargas/ai-sdk-provider-opencode-sdk#17) do not crash task-master for users who never select opencode
- Update opencode.test.js to reflect the async getClient signature
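The lazy-load idea in the second bullet can be sketched as follows. The SDK package name comes from the PR; the injectable `importer` parameter is an assumption added here purely for testability, and the real provider's error wrapping may differ:

```javascript
// Defer importing the SDK until a client is actually requested, so a
// module-load-time failure in an optional dependency cannot crash commands
// that never touch opencode.
let cachedSdk = null;

async function loadOpencodeSdk(
	importer = () => import('ai-sdk-provider-opencode-sdk')
) {
	if (!cachedSdk) {
		try {
			cachedSdk = await importer();
		} catch (error) {
			throw new Error(
				`OpenCode SDK failed to load. Original error: ${error.message}`
			);
		}
	}
	return cachedSdk;
}

// Usage sketch: only the first call pays the import cost.
(async () => {
	const sdk = await loadOpencodeSdk(async () => ({ name: 'stub-sdk' }));
	console.log(sdk.name); // stub-sdk
})();
```

Note that a later commit in this PR reverts getClient to synchronous once the fixed SDK v0.0.3 is pinned, so this pattern is only the interim workaround described in the bullet above.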
… hint
- ben-vargas/ai-sdk-provider-opencode-sdk#18, released as v0.0.3, fixes the zod 4 incompatibility. Bump the dep so OpenCode selection actually works end-to-end
- Add OPENCODE case to setModel's providerHint cascade in scripts/modules/task-manager/models.js so --opencode is accepted
Force-pushed e7eaaa3 to c17fff6.
- getClient is synchronous again, matching the contract of other providers (ClaudeCodeProvider, CodexCliProvider). The lazy-load workaround for the zod 4 SDK bug is no longer needed now that ai-sdk-provider-opencode-sdk v0.0.3 is pinned (cursor, HIGH).
- isModelSupported normalises both sides of the comparison so configured model IDs with mixed case are matched correctly (coderabbitai).
- getMcpApiKeyStatus short-circuits for keyless providers (opencode, ollama, bedrock, claude-code, codex-cli, gemini-cli, grok-cli, mcp) before touching the filesystem, matching the guideline that these should not depend on .cursor/mcp.json existence (coderabbitai).
- Run biome format to satisfy the CI format check.
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
scripts/modules/commands.js (1)
3693-3703: ⚠️ Potential issue | 🟠 Major
Provider-flag resolution is still incomplete (`--azure` and `--vertex` are omitted).

`--azure`/`--vertex` are valid options, but they're excluded from both conflict counting and `providerHint` resolution. This can accept conflicting flags and also fail to pass the intended provider hint to `setModel()`.

💡 Suggested fix

```diff
-	const providerFlags = [
-		options.openrouter,
-		options.ollama,
-		options.bedrock,
-		options.claudeCode,
-		options.geminiCli,
-		options.codexCli,
-		options.opencode,
-		options.lmstudio,
-		options.openaiCompatible
-	].filter(Boolean).length;
-	if (providerFlags > 1) {
+	const selectedProviderHints = [
+		['openrouter', options.openrouter],
+		['ollama', options.ollama],
+		['bedrock', options.bedrock],
+		['claude-code', options.claudeCode],
+		['azure', options.azure],
+		['vertex', options.vertex],
+		['gemini-cli', options.geminiCli],
+		['codex-cli', options.codexCli],
+		['opencode', options.opencode],
+		['lmstudio', options.lmstudio],
+		['openai-compatible', options.openaiCompatible]
+	]
+		.filter(([, enabled]) => enabled)
+		.map(([hint]) => hint);
+
+	if (selectedProviderHints.length > 1) {
 		console.error(
 			chalk.red(
-				'Error: Cannot use multiple provider flags (--openrouter, --ollama, --bedrock, --claude-code, --gemini-cli, --codex-cli, --opencode, --lmstudio, --openai-compatible) simultaneously.'
+				'Error: Cannot use multiple provider flags (--openrouter, --ollama, --bedrock, --claude-code, --azure, --vertex, --gemini-cli, --codex-cli, --opencode, --lmstudio, --openai-compatible) simultaneously.'
 			)
 		);
 		process.exit(1);
 	}
+	const providerHint = selectedProviderHints[0];
@@
-	providerHint: options.openrouter
-		? 'openrouter'
-		: options.ollama
-			? 'ollama'
-			: options.bedrock
-				? 'bedrock'
-				: options.claudeCode
-					? 'claude-code'
-					: options.geminiCli
-						? 'gemini-cli'
-						: options.codexCli
-							? 'codex-cli'
-							: options.opencode
-								? 'opencode'
-								: options.lmstudio
-									? 'lmstudio'
-									: options.openaiCompatible
-										? 'openai-compatible'
-										: undefined,
+	providerHint,
```

The same ternary cascade is replaced with `providerHint,` in the two remaining hunks as well.

Also applies to: 3743-3761, 3778-3797, 3815-3833
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/modules/commands.js` around lines 3693 - 3703, The provider-flag counting and provider hint logic omit --azure and --vertex, causing incorrect conflict detection and wrong providerHint passed into setModel(); update every providerFlags (and related providerHint resolution) array that currently lists options.openrouter, options.ollama, options.bedrock, options.claudeCode, options.geminiCli, options.codexCli, options.opencode, options.lmstudio, options.openaiCompatible to also include options.azure and options.vertex so the conflict count and the computed providerHint include those flags and ensure the providerHint passed into setModel() reflects azure/vertex when specified.
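The table-driven resolution the patch proposes can be exercised standalone. Option names mirror the review's list; the surrounding Commander wiring and `process.exit` handling are omitted, and throwing on conflict is a stand-in for the CLI's error path:

```javascript
// Derive a single provider hint from mutually exclusive CLI flags.
function resolveProviderHint(options) {
	const selected = [
		['openrouter', options.openrouter],
		['ollama', options.ollama],
		['bedrock', options.bedrock],
		['claude-code', options.claudeCode],
		['azure', options.azure],
		['vertex', options.vertex],
		['gemini-cli', options.geminiCli],
		['codex-cli', options.codexCli],
		['opencode', options.opencode],
		['lmstudio', options.lmstudio],
		['openai-compatible', options.openaiCompatible]
	]
		.filter(([, enabled]) => enabled)
		.map(([hint]) => hint);

	if (selected.length > 1) {
		throw new Error(`Conflicting provider flags: ${selected.join(', ')}`);
	}
	return selected[0]; // undefined when no provider flag was passed
}

console.log(resolveProviderHint({ opencode: true })); // opencode
console.log(resolveProviderHint({ azure: true })); // azure (no longer dropped)
console.log(resolveProviderHint({})); // undefined
```

One table drives both the conflict check and the hint, so a newly added flag can never be counted for conflicts yet dropped from the hint (the bug the review found with `--azure`/`--vertex`).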
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: a385f24b-771d-4f25-842f-cf1554b02dbd
⛔ Files ignored due to path filters (1)
`package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (13)
- .changeset/add-opencode-provider.md
- package.json
- packages/tm-core/src/common/constants/providers.ts
- scripts/modules/ai-services-unified.js
- scripts/modules/commands.js
- scripts/modules/config-manager.js
- scripts/modules/supported-models.json
- scripts/modules/task-manager/models.js
- src/ai-providers/index.js
- src/ai-providers/opencode.js
- tests/unit/ai-providers/opencode.test.js
- tests/unit/ai-services-unified.test.js
- tests/unit/config-manager.test.js
✅ Files skipped from review due to trivial changes (4)
- src/ai-providers/index.js
- package.json
- packages/tm-core/src/common/constants/providers.ts
- .changeset/add-opencode-provider.md
🚧 Files skipped from review as they are similar to previous changes (5)
- scripts/modules/ai-services-unified.js
- tests/unit/config-manager.test.js
- tests/unit/ai-providers/opencode.test.js
- scripts/modules/supported-models.json
- scripts/modules/config-manager.js
Cursor Bugbot has reviewed your changes and found 1 potential issue.
❌ Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.
Reviewed by Cursor Bugbot for commit abddd71.
♻️ Duplicate comments (1)
src/ai-providers/opencode.js (1)
102-107: ⚠️ Potential issue | 🟠 Major
Tighten missing-CLI detection to avoid masking real initialization errors.

Line 104 still treats any `"command failed: opencode"` as "CLI missing," which can hide real runtime/config failures when the CLI is installed but exits non-zero.

Proposed fix

```diff
-	const missingCli =
-		code === 'ENOENT' ||
-		/spawn opencode ENOENT|opencode: command not found|command failed: opencode/i.test(
-			msg
-		);
+	const missingCli =
+		code === 'ENOENT' ||
+		/\bspawn\s+opencode\s+ENOENT\b/i.test(msg) ||
+		/\bopencode(?:\.exe)?:\s*(?:command not found|not found|is not recognized)\b/i.test(
+			msg
+		);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/ai-providers/opencode.js` around lines 102 - 107, The current missingCli detection (variable missingCli checking code and msg) is too broad because it treats any "command failed: opencode" as a missing-CLI case; update the check to only detect true "not installed" signals: keep code === 'ENOENT', match explicit patterns like /\b(opencode: command not found|spawn opencode ENOENT)\b/i and/or check for exit code 127 if available, and remove the generic /command failed: opencode/i alternative so real initialization errors from a present-but-failing opencode binary are not masked by the missingCli branch.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 781a0826-41af-4189-a9cd-05e1e7ce5c67
📒 Files selected for processing (4)
- scripts/modules/commands.js
- scripts/modules/config-manager.js
- src/ai-providers/opencode.js
- tests/unit/ai-providers/opencode.test.js
✅ Files skipped from review due to trivial changes (2)
- scripts/modules/commands.js
- tests/unit/ai-providers/opencode.test.js
🚧 Files skipped from review as they are similar to previous changes (1)
- scripts/modules/config-manager.js
The early short-circuit block added in this PR already returns true for CUSTOM_PROVIDERS.OPENCODE before the switch is reached, making the case dead code (cursor).
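The flagged pattern reduces to this shape (a simplified, hypothetical sketch; the real logic lives in `scripts/modules/config-manager.js` and the surrounding code differs):

```javascript
// Simplified sketch: an early keyless-provider short-circuit makes a
// later switch case for the same provider unreachable.
const KEYLESS_PROVIDERS = new Set([
	'claude-code',
	'codex-cli',
	'gemini-cli',
	'opencode'
]);

function isApiKeySet(provider, env = {}) {
	// Early short-circuit: keyless providers never need an API key...
	if (KEYLESS_PROVIDERS.has(provider)) return true;

	switch (provider) {
		case 'opencode': // ...so this case is dead code and can be dropped.
			return true;
		case 'anthropic':
			return Boolean(env.ANTHROPIC_API_KEY);
		default:
			return false;
	}
}
```

Removing the dead `case 'opencode'` changes nothing observable, since control never reaches the switch for keyless providers.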
…ng doc
Replaces four invalid/unreachable model IDs in the seeded OpenCode
supported-models list:
- openai/gpt-5.2 -> github-copilot/gpt-5.2 (the openai/* prefix needs
direct OpenAI auth which most OpenCode users don't set up; the
Copilot path is the common one)
- openai/gpt-5.2-codex -> github-copilot/gpt-5.2-codex (same reason)
- github-copilot/gpt-5 -> github-copilot/gpt-4.1 (gpt-5 isn't a real
Copilot model name; the family is gpt-5.2 etc. Using gpt-4.1 here
as the proven-working default; gpt-5.2 is seeded separately.)
- google/gemini-2.5-pro -> github-copilot/gemini-2.5-pro (no bare
google/ path in default OpenCode model registry without direct
Google auth)
Also add a troubleshooting JSDoc block to the provider explaining the
two common failure modes ("Unexpected end of JSON input" on invalid
model IDs; "terminated" on slow reasoning models) and how to diagnose
them with `opencode models` and the `serverTimeout` setting.
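The `serverTimeout` knob mentioned above can be sketched as a small settings helper (a hedged illustration: the default values and the `commandSpecific` key are assumptions for this sketch, not necessarily the shapes the PR's `getOpencodeSettingsForCommand` actually uses):

```javascript
// Sketch: merge defaults <- user config <- per-command overrides.
// Values here are illustrative, not OpenCode's documented defaults.
const OPENCODE_DEFAULTS = {
	hostname: '127.0.0.1',
	port: 4096,
	serverTimeout: 120000 // ms; raise this for slow reasoning models
};

function getOpencodeSettingsForCommand(userConfig = {}, commandName) {
	const { commandSpecific = {}, ...base } = userConfig.opencode ?? {};
	return {
		...OPENCODE_DEFAULTS,
		...base,
		...(commandSpecific[commandName] ?? {})
	};
}
```

With a merge like this, a user hitting the "terminated" failure on one slow command can raise `serverTimeout` for just that command without touching the global default.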

Summary
Adds OpenCode as a new CLI-backed AI provider, mirroring the existing `claude-code`, `codex-cli`, and `gemini-cli` integrations. Task Master delegates authentication, model routing, and provider selection to a running OpenCode server via `ai-sdk-provider-opencode-sdk` (same community author as the other CLI-provider deps), staying agnostic about the underlying LLM.

Closes the provider half of #965 (which was closed against the rules-profile work in #970) and addresses #1684.
Motivation
Users whose organisations permit OpenCode - including OpenCode's GitHub Copilot backend - but prohibit direct LLM API credentials currently cannot use Task Master at all. Every existing provider either needs a direct LLM API key, or is tied to a specific CLI tool that may not be approved. Delegating to a locally-running OpenCode server sidesteps that entirely: Task Master stays out of the authentication path.
End-to-end verified against a real OpenCode + GitHub Copilot setup: `parse-prd` produces valid task trees via `github-copilot/gpt-4.1` and `github-copilot/gpt-5.2`.

Changes
- `src/ai-providers/opencode.js` - New `OpencodeProvider` wrapping `ai-sdk-provider-opencode-sdk@^0.0.3`. Synchronous `getClient` matching the other CLI providers' contract. Includes a JSDoc troubleshooting block for the two common upstream SDK failure modes (see "Known caveats" below).
- `src/ai-providers/index.js` - Export the new provider class.
- `scripts/modules/ai-services-unified.js` - Register under key `opencode` in the `PROVIDERS` map.
- `scripts/modules/config-manager.js` - Add `opencode` section to `DEFAULTS`, merge in `_loadAndValidateConfig`, expose `getOpencodeSettings`/`getOpencodeSettingsForCommand`, add OpenCode to the keyless-provider short-circuit in `getMcpApiKeyStatus` and to `hasCodebaseAnalysis`/`providersWithoutApiKeys`/`isApiKeySet` so it's consistently treated as a no-API-key provider.
- `scripts/modules/commands.js` - Add `--opencode` flag to `task-master models` alongside the other CLI-provider flags, wire it into the provider-hint cascades for `--set-main`/`--set-research`/`--set-fallback`.
- `scripts/modules/task-manager/models.js` - Add OPENCODE case to `setModel`'s `providerHint` handling; accepts unknown model IDs with a warning since OpenCode's model catalog is dynamic.
- `packages/tm-core/src/common/constants/providers.ts` - Add `OPENCODE: 'opencode'` to `CUSTOM_PROVIDERS`.
- `scripts/modules/supported-models.json` - Seed with 7 verified model IDs that exist in a default OpenCode install (Anthropic direct + GitHub Copilot path). IDs cross-checked against `opencode models` output so the starter list doesn't ship with unreachable entries.
- `tests/unit/ai-providers/opencode.test.js` - 5 new tests mirroring `codex-cli.test.js`: construction, API-key-not-required, client creation via config-manager, case-insensitive model-support check, error wrapping on missing CLI.
- `tests/unit/ai-providers/ai-services-unified.test.js` / `tests/unit/config-manager.test.js` - Mock updates to accommodate the new provider.
- `.changeset/add-opencode-provider.md` - User-facing changeset (minor bump).

Testing
- `npm test tests/unit/ai-providers/` - 229 tests pass across 18 suites, including the 5 new OpencodeProvider tests.
- `npm run turbo:typecheck` - clean across all 9 packages.
- `biome format` - passes.
- `task-master parse-prd` generated a valid 3-task tree via the `opencode` provider with the `github-copilot/gpt-4.1` backend, proving the full plumbing: Task Master → OpencodeProvider → ai-sdk-provider-opencode-sdk → OpenCode server → GitHub Copilot.

Risk Assessment
- Opt-in: users must explicitly select the provider via `task-master models --setup` or the `--opencode` flag.
- New dependency: `ai-sdk-provider-opencode-sdk@^0.0.3` (the `ai-sdk-v5` tag line, matching the repo's `ai@^5.0.51`). Same community maintainer as the existing `ai-sdk-provider-claude-code`, `-codex-cli`, `-gemini-cli` deps. v0.0.3 specifically includes the zod 3/4 compatibility fix (Fix zod 4 incompatibility in loggerSchema (closes #17), ben-vargas/ai-sdk-provider-opencode-sdk#18 - landed during this PR's review) required to even import the SDK in this project's zod-4 tree.
- `validateAuth()` performs a lightweight `opencode --version` probe and surfaces install guidance if missing, matching the pattern in `codex-cli.js`.

Known caveats
Two failure modes surface from the upstream `@opencode-ai/sdk` and aren't fixable in this repo:

- An invalid model ID (e.g. `github-copilot/gpt-5` instead of `gpt-5.2`) produces `Unexpected end of JSON input`, because the underlying client blindly calls `response.json()` on OpenCode's empty error body. Mitigation: the seeded model list is verified against a real OpenCode install; the provider's JSDoc directs users to `opencode models` if they hit this.
- Slow reasoning models can hit `TypeError: terminated` when OpenCode's default HTTP timeout expires mid-response. Mitigation: the JSDoc directs users at the `serverTimeout` setting in `.taskmaster/config.json`'s `opencode` section.

Both are tracked upstream at ben-vargas/ai-sdk-provider-opencode-sdk#21. The SDK maintainer has indicated the `ai-sdk-v5` tag line is frozen, so a deeper fix would require either a task-master-side fork of the SDK or a migration of the wider task-master codebase to AI SDK v6 - both out of scope for this PR.

Deployment Notes
`task-master models --setup` picks up the new provider automatically via the existing supported-models-driven flow.

Test plan
- `task-master models --setup` surfaces OpenCode and accepts a model selection
- `task-master parse-prd` succeeds with OpenCode selected as main model (verified with `github-copilot/gpt-4.1`)
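As a footnote to the "Known caveats" section above: the `Unexpected end of JSON input` failure comes from calling `response.json()` unconditionally on an error body. A defensive sketch of what an upstream fix could look like (illustrative only, not the SDK's actual code):

```javascript
// Sketch: read an HTTP error body defensively. Calling response.json()
// on an empty body is what throws "Unexpected end of JSON input".
async function readErrorBody(response) {
	const text = await response.text();
	if (!text) {
		return { message: `HTTP ${response.status} with empty body` };
	}
	try {
		return JSON.parse(text);
	} catch {
		// Non-JSON error body: surface it verbatim instead of crashing.
		return { message: text };
	}
}
```

Since the `ai-sdk-v5` tag line is frozen upstream, this remains a documentation-only mitigation in this PR.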