Define LLM configurations that agents and teams use for inference.

agno/models/anthropic

Configures an Anthropic Claude model.

Config

| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| `id` | string | Yes | | Model identifier (e.g., `claude-sonnet-4-20250514`) |
| `api_key` | Field[string] | Yes | | Anthropic API key. Use a FieldReference to inject from a secret |
| `max_tokens` | integer | No | 8192 | Maximum tokens in responses |
| `temperature` | float | No | | Sampling temperature (0.0–1.0) |
| `top_p` | float | No | | Nucleus sampling parameter |
| `top_k` | integer | No | | Top-k sampling parameter |
| `stop_sequences` | list[string] | No | | Sequences that stop generation |

Outputs

| Field | Type | Description |
|---|---|---|
| `spec` | object | Serialized model spec for runtime reconstruction |
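The exact layout of the serialized `spec` is internal to the runtime, but the idea is that downstream resources receive enough information to rebuild the model at run time without re-reading the config. A minimal sketch, assuming a hypothetical class-name-plus-kwargs layout (the keys `model_class` and `kwargs` and the registry are illustrative, not the real schema):

```python
# Hypothetical serialized spec; the real layout is internal to the runtime.
spec = {
    "model_class": "Claude",
    "kwargs": {
        "id": "claude-sonnet-4-20250514",
        "max_tokens": 4096,
        "temperature": 0.7,
    },
}

# A registry mapping class names to constructors. A plain dict stands in
# for the real model class so the sketch stays self-contained.
registry = {"Claude": lambda **kw: {"class": "Claude", **kw}}

def reconstruct(spec: dict) -> dict:
    """Rebuild a model instance from its serialized spec."""
    ctor = registry[spec["model_class"]]
    return ctor(**spec["kwargs"])

model = reconstruct(spec)
print(model["id"])  # claude-sonnet-4-20250514
```

Because the spec carries only configuration, reconstruction is cheap and makes no API calls.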

Example

```yaml
provider: agno
resource: models/anthropic
name: claude
config:
  id: claude-sonnet-4-20250514
  api_key:
    provider: pragma
    resource: secret
    name: anthropic-key
    field: outputs.ANTHROPIC_API_KEY
  max_tokens: 4096
  temperature: 0.7
```

Dependencies

Depends on: nothing directly, but `api_key` typically uses a FieldReference to a pragma/secret resource.

Depended on by:
  • agno/agent — as the required model
  • agno/team — as the optional team lead model
  • agno/memory/manager — as the optional classification model
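
For instance, an agno/agent consuming this model might reference its `spec` output via a FieldReference, mirroring the `api_key` pattern in the example above (the agent's `model` field name is assumed by analogy; check the agno/agent page for the exact schema):

```yaml
provider: agno
resource: agent
name: assistant
config:
  model:
    provider: agno
    resource: models/anthropic
    name: claude
    field: outputs.spec
```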

Notes

  • Both model types are stateless — they wrap configuration without making API calls during creation.
  • The api_key field supports FieldReferences for injecting secrets from a pragma/secret resource.
  • The base_url field on the OpenAI model allows you to use OpenAI-compatible APIs (e.g., Azure OpenAI, local models).
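
As a sketch of that last point, a models/openai resource pointed at a local OpenAI-compatible server might look like the following (the field names mirror the Anthropic config above; the OpenAI model's exact schema may differ, and the URL and model id are placeholders):

```yaml
provider: agno
resource: models/openai
name: local-llm
config:
  id: llama-3.1-8b-instruct
  base_url: http://localhost:11434/v1
  api_key:
    provider: pragma
    resource: secret
    name: openai-key
    field: outputs.OPENAI_API_KEY
```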