When BYOK applies: BYOK is relevant for `orch run` (cloud execution, the default) and `orch run --local` (local execution). For `orch install`, no LLM keys are needed, since you're just exporting configuration files.

Why BYOK?
| Benefit | Description |
|---|---|
| Authors don’t pay | Agent authors don’t pay for others’ usage |
| No markup | Direct relationship with LLM provider, no middleman costs |
| Your limits | Use your existing rate limits and quotas |
| Your data policies | LLM calls go through your account |
API keys are separate from subscriptions. If you use Claude Pro/Max or ChatGPT Plus, you’ll need to set up API billing separately. Get API keys at console.anthropic.com or platform.openai.com.
Key Resolution Order
When an agent needs an LLM key, it looks in this order:

1. Command-line flag — `--key` on CLI commands
2. Environment variable — `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc. (local execution)
3. Workspace secrets vault — keys stored in your workspace (server execution)

Name vault secrets after the standard environment variables (`ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, `GEMINI_API_KEY`); the platform matches these names automatically.
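For example, a key passed on the command line takes precedence over one set in the environment (the agent name and key values below are placeholders):

```shell
# 2nd in the resolution order: environment variable (local execution)
export ANTHROPIC_API_KEY="sk-ant-from-env"

# 1st in the resolution order: an explicit --key flag wins over the variable above
orch run my-agent --key "sk-ant-from-flag"
```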
Setting Up Keys
For Local Execution
Set the provider's environment variable in the shell where the agent runs.

For Server Execution
Store keys in your workspace secrets vault, via `orch secrets set` or the dashboard.

Supported Providers
| Provider | Environment Variable | API Endpoint |
|---|---|---|
| OpenAI | OPENAI_API_KEY | https://api.openai.com/v1 |
| Anthropic | ANTHROPIC_API_KEY | https://api.anthropic.com |
| Google Gemini | GEMINI_API_KEY | https://generativelanguage.googleapis.com |
| Ollama | N/A | http://localhost:11434/v1 (local execution only) |
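For local execution, exporting one of the variables above is all the setup the platform needs. The values below are placeholders — use your real keys:

```shell
# Placeholder values; real keys come from console.anthropic.com / platform.openai.com
export ANTHROPIC_API_KEY="sk-ant-placeholder"
export OPENAI_API_KEY="sk-placeholder"

# Ollama needs no key — it serves locally at http://localhost:11434/v1
```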
Agent Provider Requirements
Agents specify which providers they support in their manifest.

Provider Values
| Value | Description |
|---|---|
| openai | OpenAI API |
| anthropic | Anthropic API |
| gemini | Google Gemini API |
| any | Works with any provider |
| ollama | Local Ollama instance (local execution only) |
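A manifest entry using these values might look like the following sketch. The `providers` field name is an assumption for illustration, not the documented orchagent schema:

```yaml
# Hypothetical manifest fragment — field name is illustrative
providers:
  - anthropic
  - openai
```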
Fallback Configuration
For code runtime agents, authors can specify fallback LLMs in the manifest.

Security
Local Execution
Keys stay on your machine. The agent runs locally and makes LLM calls directly from your environment.

Server Execution
Keys are encrypted and stored in your workspace vault. They're injected into the agent's sandbox at runtime. Agent containers run in isolated environments, so your keys are never exposed to agent authors.
Network Egress Controls
Server-executed agents route all outbound traffic through an allowlist proxy.

Allowed destinations:

- LLM APIs (OpenAI, Anthropic, Gemini)
- orchagent gateway (`api.orchagent.io`)
- Other orchagent agents

Blocked destinations:

- All other domains
- Private IP ranges
- Cloud metadata endpoints
Best Practices
- Use environment variables for local development
- Store keys in the workspace vault for server execution (`orch secrets set` or dashboard)
- Set spending limits with your LLM provider
- Rotate keys regularly if you share them
- Use separate keys for development and production
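The vault workflow above might look like the following sketch, assuming `orch secrets set` takes a secret name and a value (the key itself is a placeholder):

```shell
# Store a key once per workspace; server-executed agents read it at runtime
orch secrets set ANTHROPIC_API_KEY "sk-ant-placeholder"
```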