Claude Sonnet 4 now supports 1M tokens of context

Anthropic adds a 1M-token context window to Claude Sonnet 4, enabling whole-codebase analysis, large document synthesis, and more.

/images/news/og/anthropic-claude-sonnet-4-1m-context-og.png

What’s new: Anthropic introduced a 1M-token context window for Claude Sonnet 4, now in public beta on the Anthropic API and Amazon Bedrock, with Google Cloud Vertex AI support “coming soon.”
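For orientation, here is a minimal sketch of what opting into the long-context beta might look like with the Anthropic Python SDK. The model ID and the beta flag string are assumptions based on Anthropic's published naming conventions, not details taken from this announcement.

```python
# Sketch: asking Claude Sonnet 4 a question over a very large prompt with the
# long-context beta enabled. Assumes the `anthropic` Python SDK is installed
# and ANTHROPIC_API_KEY is set in the environment.
# NOTE: the model ID and beta flag below are assumptions, not confirmed here.
import anthropic

client = anthropic.Anthropic()

# Load a large corpus, e.g. an entire codebase concatenated into one file.
with open("codebase_dump.txt") as f:
    codebase = f.read()

response = client.beta.messages.create(
    model="claude-sonnet-4-20250514",     # assumed model ID
    betas=["context-1m-2025-08-07"],      # assumed long-context beta flag
    max_tokens=4096,
    messages=[
        {
            "role": "user",
            "content": f"Here is our codebase:\n\n{codebase}\n\n"
                       "Summarize the main modules and how they interact.",
        }
    ],
)

print(response.content[0].text)
```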

Key details

  • Use cases highlighted: whole-codebase analysis, long-form document synthesis, and more persistent, context-aware agents.
  • Pricing increases for prompts over 200K tokens; Anthropic points to prompt caching and batch processing as ways to manage latency and cost (see the sketch after this list).
  • Early adopters cited include Bolt.new and iGent AI.
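
Prompt caching is the lever most relevant to repeated 200K+ token prompts: the large, stable prefix (a codebase or document set) is marked cacheable so follow-up queries reuse it rather than paying full input cost each time. A hedged sketch using the SDK's cache_control content blocks follows; the model ID, beta flag, and exact caching economics are assumptions, not details from this announcement.

```python
# Sketch: marking a large, stable document as cacheable so repeated questions
# against it can reuse the cached prefix instead of reprocessing it in full.
# NOTE: model ID, beta flag, and caching behavior details are assumptions.
import anthropic

client = anthropic.Anthropic()

with open("design_docs.txt") as f:
    docs = f.read()

def ask(question: str) -> str:
    response = client.beta.messages.create(
        model="claude-sonnet-4-20250514",     # assumed model ID
        betas=["context-1m-2025-08-07"],      # assumed long-context beta flag
        max_tokens=1024,
        messages=[
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": docs,
                        # Stable prefix: eligible for prompt caching.
                        "cache_control": {"type": "ephemeral"},
                    },
                    # Only this short, changing tail varies between calls.
                    {"type": "text", "text": question},
                ],
            }
        ],
    )
    return response.content[0].text

print(ask("Which services does the billing pipeline depend on?"))
print(ask("List the open questions in the API design doc."))
```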

Why it matters: Ultra-long context unlocks workflows that keep broad state (docs, APIs, histories) in memory, reducing brittle retrieval chains and manual chunking. For teams building agents or code tools, this can simplify architecture and improve answer quality, at the cost of bigger prompt budgets.

Source: anthropic.com