feat: migrate to Vercel AI SDK v6 #1686

@bjcoombs

Description

Problem

Task Master is currently on Vercel AI SDK v5 (ai@^5.0.51). The wider ecosystem, including all of the community CLI-backed provider SDKs this repo depends on (authored by @ben-vargas), has moved to AI SDK v6 as latest. The v5-compatible lines are frozen for bug-fix-only maintenance, with fixes for new model/routing issues landing on the v6 line only.

This creates two pressures:

  1. New bugs in the v5-line providers will not be fixed upstream. Ben stated this explicitly on ben-vargas/ai-sdk-provider-opencode-sdk#21: "I'm not planning to put much (any?) effort into maintaining ai sdk v5 compatible providers given v6 is latest and v7 is in beta."
  2. Known issues may already be fixed on v6 but inaccessible from v5. Candidates include #1497 (301-second timeout on large PRD parsing / client disconnect) and the "terminated" / "Unexpected end of JSON input" failure modes documented for the OpenCode provider in #1685.

Technical Context

All four community CLI provider SDKs this repo depends on already publish a v6-compatible latest tag:

| Package | v5 line (current) | v6 line (latest) |
| --- | --- | --- |
| ai-sdk-provider-claude-code | 2.2.4 | 3.4.4 |
| ai-sdk-provider-codex-cli | 0.7.0 | 1.1.0 |
| ai-sdk-provider-gemini-cli | 1.4.0 | 2.0.1 |
| ai-sdk-provider-opencode-sdk | 0.0.3 | 3.0.2 |

Zod is already at ^4.1.12, which satisfies every v6-line peer range.

The main-SDK surface to touch:

  • ai@^5.0.51 → ai@^6.x
  • 9x @ai-sdk/*@^2.x packages (anthropic, openai, google, groq, mistral, perplexity, xai, azure, openai-compatible) → their v6-compatible majors
  • @ai-sdk/amazon-bedrock@^3.0.23 and @ai-sdk/google-vertex@^3.0.86 - verify v6 majors
  • @ai-sdk/provider@^2.0.0 and @ai-sdk/provider-utils@^3.0.10 - bump to match
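As a rough sketch of the bump surface, the dependency step can be expressed as a small helper that turns the table above into npm install specs. The community-SDK targets come from the table; ai@^6.0.0 is an assumed placeholder for the v6 major, and the @ai-sdk/* targets are deliberately omitted here because their exact v6-compatible majors still need verifying:

```typescript
// Hypothetical helper for the dependency-bump step: map each package to
// its current and target range, then emit install specs. Targets for the
// community SDKs come from the table above; the "ai" target is an assumed
// placeholder for the v6 major, not a confirmed version.
const bumps: Record<string, { from: string; to: string }> = {
  "ai": { from: "^5.0.51", to: "^6.0.0" },
  "ai-sdk-provider-claude-code": { from: "^2.2.4", to: "^3.4.4" },
  "ai-sdk-provider-codex-cli": { from: "^0.7.0", to: "^1.1.0" },
  "ai-sdk-provider-gemini-cli": { from: "^1.4.0", to: "^2.0.1" },
  "ai-sdk-provider-opencode-sdk": { from: "^0.0.3", to: "^3.0.2" },
};

function installSpecs(b: Record<string, { from: string; to: string }>): string[] {
  // One "name@range" spec per package, suitable for a single npm install.
  return Object.entries(b).map(([name, { to }]) => `${name}@${to}`);
}

console.log(installSpecs(bumps).join(" "));
```

The output can be fed to a single `npm install` invocation so the whole bump lands atomically in one lockfile change.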

Scope

In:

  1. Bump ai to v6 and audit call-sites for v5→v6 breaking API changes (return-shape changes on generateText / streamText / generateObject, provider factory signatures, middleware changes)
  2. Bump all @ai-sdk/* Vercel packages to v6-compatible majors
  3. Bump Ben's 4 community provider SDKs to their latest (v6) tag
  4. Update provider files in src/ai-providers/ that touch the changed APIs
  5. Update tests and mocks affected by the SDK surface changes
  6. Changeset (likely major bump given the dep churn)
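For steps 4 and 5, one option is a provider-agnostic smoke check that exercises text generation through an injected adapter, so the same check can wrap either the v5 or v6 call site during the migration. This is a sketch under assumptions: the `TextGen` adapter type is illustrative, not an interface from the `ai` package.

```typescript
// Illustrative smoke check for the call-site audit: the adapter shape is
// an assumption made for this sketch. In the real suite it would wrap
// generateText from "ai" with each registered provider's model.
type TextGen = (prompt: string) => Promise<{ text: string }>;

async function smokeCheck(provider: string, gen: TextGen): Promise<boolean> {
  try {
    const { text } = await gen("Reply with the single word: ok");
    const passed = typeof text === "string" && text.trim().length > 0;
    if (!passed) console.error(`${provider}: empty or non-string response`);
    return passed;
  } catch (err) {
    // A throw here is exactly the class of failure the migration should
    // surface per provider (e.g. the "terminated" errors noted above).
    console.error(`${provider} smoke check threw:`, err);
    return false;
  }
}
```

Running this once per registered provider before and after the bump would localize any v5→v6 regression to a specific provider file rather than the shared SDK surface.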

Out:

Success Criteria

  1. ai@^6 and all compatible @ai-sdk/* / community providers installed
  2. Full test suite passes
  3. All 21 registered providers continue to work for generateText, streamText, generateObject
  4. Live E2E sanity check against at least one Anthropic/OpenAI/Copilot/local model
  5. Changeset entry

Ask

Would maintainers welcome a migration PR? If green-lit, I'll inventory the exact breaking API surface across the 13+ dep bumps, propose a sequenced plan before writing code, and keep the diff tight (dep bumps + the minimum call-site adjustments to keep the suite green).

Happy to land this after #1685 merges to avoid overlapping changes.

Metadata

Assignees

No one assigned

Labels

area:ai-models (AI model integration and configuration), enhancement (New feature or request), high-priority (Urgent issue requiring immediate attention), refactor (Changes needed to code)

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests