Update the authenticated organization’s LLM settings.
Either field may be omitted to preserve the stored value. The service rejects provider selections that are neither the platform default nor a connected model resource.
Returns: Updated settings row.
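The rejection rule above can be sketched as a small check. This is illustrative only: `is_valid_provider` and `connected_provider_ids` are hypothetical names, not the service's actual internals.

```python
PLATFORM_DEFAULT = "platform_default"

def is_valid_provider(provider: str, connected_provider_ids: set[str]) -> bool:
    """Accept only the platform default or a connected model resource id."""
    return provider == PLATFORM_DEFAULT or provider in connected_provider_ids

# A selection that is neither the platform default nor connected is rejected:
connected = {"resources:anthropic_default_abc123"}
is_valid_provider("resources:unknown", connected)  # False
```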
Patch body for PATCH /settings/llm.
Either field may be omitted, in which case the existing value is preserved. Callers that want to unset a value must delete and recreate the settings row on the engine side.
Attributes:
    provider: New LLM provider resource id, or "platform_default" to fall back to the shared platform-managed provider.
    performance_profile: New performance profile tier.
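The omit-to-preserve semantics can be sketched with a patch model that serializes only the fields the caller actually set. A sketch under assumptions: the class and method names here are illustrative, not the SDK's real types.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LlmSettingsPatch:
    provider: Optional[str] = None              # None means "leave stored value unchanged"
    performance_profile: Optional[str] = None   # None means "leave stored value unchanged"

    def to_payload(self) -> dict:
        """Build the PATCH body, omitting unset fields so stored values survive."""
        payload = {}
        if self.provider is not None:
            payload["provider"] = self.provider
        if self.performance_profile is not None:
            payload["performance_profile"] = self.performance_profile
        return payload

# Changing only the profile leaves the stored provider untouched:
patch = LlmSettingsPatch(performance_profile="reasoning")
```

Serializing `patch` yields `{"performance_profile": "reasoning"}`; the absent `provider` key tells the service to keep the stored selection.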
User-facing LLM performance profile selection.
Identifies which tier of model an organization wants platform agents to use. The profile is chosen by the user; the API resolves it to a concrete model from the selected provider's catalog entries.
Values: fast, balanced, reasoning
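The three tiers can be modeled as a plain string-valued enum. The tier-to-model mapping below is purely illustrative; the real resolution reads the selected provider's catalog entries.

```python
from enum import Enum

class PerformanceProfile(str, Enum):
    FAST = "fast"
    BALANCED = "balanced"
    REASONING = "reasoning"

# Hypothetical catalog entries for one provider; real catalogs come from the API.
CATALOG = {
    PerformanceProfile.FAST: "example-small-model",
    PerformanceProfile.BALANCED: "example-medium-model",
    PerformanceProfile.REASONING: "example-large-model",
}

def resolve_model(profile: PerformanceProfile) -> str:
    """Resolve a user-facing tier to a concrete catalog model name."""
    return CATALOG[profile]
```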
Per-organization LLM settings for platform agents.
Holds the organization's choice of LLM provider resource and performance profile. The API reads these settings whenever a platform agent is invoked to resolve the concrete model and credentials to use.
Attributes:
    organization_id: Owning organization.
    provider: Resource identifier of the LLM provider resource the organization has selected. Opaque string; format matches the rest of the SDK's resource id references (SurrealDB record id shape, e.g. 'resources:anthropic_default_abc123'). Use 'platform_default' to reference the shared platform provider.
    performance_profile: Which tier of catalog model to use for platform agent invocations.
    updated_at: Timestamp of the last settings change.
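A minimal sketch of the settings row as a dataclass, assuming string-typed ids and a datetime timestamp (attribute names follow the doc above; the types are assumptions, not the SDK's declared schema):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LlmSettings:
    organization_id: str       # Owning organization
    provider: str              # e.g. "resources:anthropic_default_abc123" or "platform_default"
    performance_profile: str   # "fast" | "balanced" | "reasoning"
    updated_at: datetime       # Timestamp of the last settings change

# Example row using the shared platform provider:
settings = LlmSettings(
    organization_id="organizations:demo",
    provider="platform_default",
    performance_profile="balanced",
    updated_at=datetime.now(timezone.utc),
)
```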