orchagent uses a Bring Your Own Key model. You provide your own LLM API keys, and agents use them to make LLM calls.
When BYOK applies: BYOK is relevant for orch run (cloud execution, the default) and orch run --local (local execution). For orch install, no LLM keys are needed since you’re just exporting configuration files.

Why BYOK?

  • Authors don't pay: Agent authors don't pay for others' usage
  • No markup: Direct relationship with the LLM provider, no middleman costs
  • Your limits: Use your existing rate limits and quotas
  • Your data policies: LLM calls go through your own account
API keys are separate from subscriptions. If you use Claude Pro/Max or ChatGPT Plus, you’ll need to set up API billing separately. Get API keys at console.anthropic.com or platform.openai.com.

Key Resolution Order

When an agent needs an LLM key, it looks in this order:
  1. Command-line flag: --key on CLI commands
  2. Environment variable: OPENAI_API_KEY, ANTHROPIC_API_KEY, etc. (local execution)
  3. Workspace secrets vault: keys stored in your workspace (server execution)
LLM keys are stored as regular workspace secrets with conventional names (ANTHROPIC_API_KEY, OPENAI_API_KEY, GEMINI_API_KEY). The platform matches these names automatically.
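The lookup order above can be sketched as a small Python helper. This is illustrative only; the function name and the vault's shape are assumptions, not the actual CLI internals:

```python
import os

def resolve_llm_key(var, cli_key=None, vault=None):
    """Resolve an LLM API key using the documented precedence:
    1. --key command-line flag
    2. environment variable (local execution)
    3. workspace secrets vault (server execution)
    """
    if cli_key:                       # 1. explicit --key flag always wins
        return cli_key
    env_key = os.environ.get(var)     # 2. conventional env var name
    if env_key:
        return env_key
    return (vault or {}).get(var)     # 3. fall back to the workspace vault
```

Because the flag takes precedence, a key passed with --key overrides both the environment and the vault for that one run.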

Setting Up Keys

For Local Execution

Set environment variables:

```shell
# OpenAI
export OPENAI_API_KEY="sk-..."

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# Google Gemini
export GEMINI_API_KEY="..."
```

Then run agents locally:

```shell
orch run --local acme/summarizer --input '{"text": "..."}'
```

For Server Execution

Store keys in your workspace secrets vault:

```shell
# Add your LLM API key to the workspace vault
orch secrets set ANTHROPIC_API_KEY sk-ant-...

# Or for OpenAI
orch secrets set OPENAI_API_KEY sk-...
```

Or add them in the dashboard under Settings → Secrets. Then run agents on the cloud:

```shell
orch run acme/summarizer --data '{"text": "..."}'
```

Supported Providers

  • OpenAI: OPENAI_API_KEY (https://api.openai.com/v1)
  • Anthropic: ANTHROPIC_API_KEY (https://api.anthropic.com)
  • Google Gemini: GEMINI_API_KEY (https://generativelanguage.googleapis.com)
  • Ollama: no key needed (http://localhost:11434/v1, local execution only)
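Mirroring the table above, a small helper can report which providers your current environment is already set up for. The mapping comes straight from the table; the function itself is an illustrative sketch, not part of the CLI:

```python
import os

# Provider names mapped to their conventional environment variables.
# Ollama runs locally and needs no API key.
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "ollama": None,
}

def configured_providers(env=None):
    """Return the providers usable with the keys currently set."""
    env = os.environ if env is None else env
    return sorted(
        name for name, var in PROVIDER_ENV_VARS.items()
        if var is None or env.get(var)
    )
```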

Agent Provider Requirements

Agents specify which providers they support in their manifest:

```json
{
  "supported_providers": ["openai", "anthropic"]
}
```

You need a key for at least one supported provider.

Provider Values

  • openai: OpenAI API
  • anthropic: Anthropic API
  • gemini: Google Gemini API
  • any: Works with any provider
  • ollama: Local Ollama instance (local execution only)
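Putting the manifest field and the provider values together, the "at least one supported provider" rule can be sketched as a check like the following (a hypothetical helper, not platform code):

```python
def satisfies_manifest(supported_providers, configured):
    """True if the user has a key for at least one provider the agent
    supports. A manifest value of "any" accepts any configured provider."""
    if "any" in supported_providers:
        return len(configured) > 0
    return any(p in configured for p in supported_providers)
```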

Fallback Configuration

For code runtime agents, authors can specify fallback LLMs:

```yaml
llm:
  primary: gemini-2.5-flash
  fallbacks:
    - gpt-4o-mini
    - claude-3-haiku
```

The agent tries each model in order until one succeeds.
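The fallback behavior amounts to a try-in-order loop. This sketch assumes a generic call_model callable standing in for real provider clients; the function and its error handling are illustrative:

```python
def call_with_fallback(prompt, primary, fallbacks, call_model):
    """Try the primary model first, then each fallback in order.
    `call_model(model, prompt)` stands in for a real provider call."""
    errors = {}
    for model in [primary, *fallbacks]:
        try:
            return model, call_model(model, prompt)
        except Exception as exc:   # a real client would catch provider errors
            errors[model] = exc
    raise RuntimeError(f"all models failed: {list(errors)}")
```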

Security

Local Execution

Keys stay on your machine. The agent runs locally and makes LLM calls directly from your environment.

Server Execution

Keys are encrypted and stored in your workspace vault. They’re injected into the agent’s sandbox at runtime.
Agent containers run in isolated environments. Your keys are never exposed to agent authors.

Network Egress Controls

Server-executed agents route all outbound traffic through an allowlist proxy.

Allowed destinations:
  • LLM APIs (OpenAI, Anthropic, Gemini)
  • orchagent gateway (api.orchagent.io)
  • Other orchagent agents
Blocked:
  • All other domains
  • Private IP ranges
  • Cloud metadata endpoints
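Conceptually, the proxy's decision is an allowlist check. This simplified sketch blocks raw IPs outright, which also covers private ranges and metadata endpoints such as 169.254.169.254, and permits only known hosts; agent-to-agent traffic, which the real proxy also allows, is omitted, and the host set is assumed from the lists above:

```python
import ipaddress

# Hosts permitted by the egress policy (LLM APIs plus the orchagent gateway).
ALLOWED_HOSTS = {
    "api.openai.com",
    "api.anthropic.com",
    "generativelanguage.googleapis.com",
    "api.orchagent.io",
}

def egress_allowed(host):
    """Allow only allowlisted hostnames; block everything else,
    including raw IPs (private ranges, cloud metadata endpoints)."""
    try:
        ipaddress.ip_address(host)
        return False              # any literal IP is rejected
    except ValueError:
        return host in ALLOWED_HOSTS
```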

Best Practices

  1. Use environment variables for local development
  2. Store keys in workspace vault for server execution (orch secrets set or dashboard)
  3. Set spending limits with your LLM provider
  4. Rotate keys regularly if you share them
  5. Use separate keys for development and production