Configuration
BYOK (Bring Your Own Key)
Use your own LLM API key for code reviews
Burg AI uses a BYOK architecture — you provide your own LLM API key. This gives you:
- Cost control — You pay the LLM provider directly
- Privacy — Your code goes through your own account
- Flexibility — Use the provider you prefer
Supported Providers
| Provider | Models | Where to Get a Key |
|---|---|---|
| OpenAI | GPT-4, GPT-4 Turbo | platform.openai.com |
| Anthropic | Claude 3, Claude 3.5 | console.anthropic.com |
| OpenRouter | Multiple models | openrouter.ai |
| Google Gemini | Gemini Pro, Gemini Ultra | aistudio.google.com |
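The key you add is used directly against that provider's API. As an illustrative sketch (not Burg AI's internal code), this is roughly how a key for each supported provider maps onto a Python client; the placeholder key values are examples, and OpenRouter is assumed to be reached through its usual OpenAI-compatible endpoint:

```python
# Illustrative only: which key goes with which provider client in Python.
# Burg AI makes these calls for you; this just shows where each key is used.
from openai import OpenAI            # pip install openai
import anthropic                     # pip install anthropic
import google.generativeai as genai  # pip install google-generativeai

# OpenAI: key from platform.openai.com
openai_client = OpenAI(api_key="sk-...")

# Anthropic: key from console.anthropic.com
anthropic_client = anthropic.Anthropic(api_key="sk-ant-...")

# OpenRouter: key from openrouter.ai, via its OpenAI-compatible endpoint (assumed)
openrouter_client = OpenAI(api_key="sk-or-...", base_url="https://openrouter.ai/api/v1")

# Google Gemini: key from aistudio.google.com
genai.configure(api_key="AIza...")
```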
How Keys Are Stored
- Keys are encrypted at rest using AES-256
- Keys are never logged in plain text
- Keys are only decrypted when processing your PRs
- You can delete your key at any time
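Burg AI's storage code is internal, but encryption at rest with AES-256 generally follows a standard authenticated-encryption pattern. Here is a minimal sketch in Python using AES-256-GCM from the `cryptography` package; the helper names and the in-memory master key are hypothetical, and in practice the master key would live in a KMS or secrets manager:

```python
# Minimal sketch of AES-256 encryption at rest (AES-256-GCM).
# Hypothetical helpers; not Burg AI's actual storage code.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

MASTER_KEY = AESGCM.generate_key(bit_length=256)  # in practice, held in a KMS/secret store

def encrypt_api_key(plaintext_key: str) -> bytes:
    nonce = os.urandom(12)                          # unique nonce per encryption
    ciphertext = AESGCM(MASTER_KEY).encrypt(nonce, plaintext_key.encode(), b"byok")
    return nonce + ciphertext                       # store the nonce alongside the ciphertext

def decrypt_api_key(stored: bytes) -> str:
    nonce, ciphertext = stored[:12], stored[12:]
    return AESGCM(MASTER_KEY).decrypt(nonce, ciphertext, b"byok").decode()
```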
Adding Your Key
1. Go to Settings → API Keys
2. Select your provider
3. Paste your API key
4. Click Save
The key is validated immediately — you'll see an error if it's invalid.
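Validation amounts to a cheap authenticated call against the provider. As a hedged sketch of the idea (Burg AI's actual check may differ), an OpenAI key can be tested by listing available models:

```python
# Sketch: test whether an OpenAI key is valid with a cheap, read-only call.
# Burg AI's actual validation may differ; this only illustrates the idea.
from openai import OpenAI, AuthenticationError

def is_valid_openai_key(api_key: str) -> bool:
    try:
        OpenAI(api_key=api_key).models.list()   # fails fast if the key is rejected
        return True
    except AuthenticationError:
        return False
```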
What Happens If a Key Is Invalid
| Scenario | What Happens |
|---|---|
| Invalid key | Review fails with "Invalid API key" error |
| Rate limited | Review fails with provider error message |
| Quota exceeded | Review fails with quota error |
| Key deleted | Reviews stop; add a new key to resume |
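These scenarios correspond to distinct errors returned by the provider. A hedged sketch of how such errors could be translated into user-facing messages, using the OpenAI Python SDK's exception types as an example (the exact mapping Burg AI uses may differ):

```python
# Sketch: mapping provider exceptions to the failure scenarios above.
# Exception types are from the OpenAI Python SDK; the messages are illustrative.
from openai import AuthenticationError, RateLimitError, APIStatusError

def describe_failure(exc: Exception) -> str:
    if isinstance(exc, AuthenticationError):
        return "Invalid API key - update it under Settings → API Keys."
    if isinstance(exc, RateLimitError):
        return "Rate limited or quota exceeded - check your provider dashboard."
    if isinstance(exc, APIStatusError):
        return f"Provider error ({exc.status_code}) - see the provider's status page."
    return "Unexpected error - the review could not be completed."
```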
Error Messages
When a key issue occurs, you'll see:
- The exact error from the provider
- Suggested next steps
- A link to update your key
Estimating Costs
Token usage depends on:
- PR size (files and lines changed)
- Review mode (the full mode uses more tokens)
- Model selected
Typical cost per review: $0.01 – $0.10 depending on PR size and model.
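To see where that range comes from, here is a back-of-the-envelope sketch; the per-token prices are placeholder assumptions, so check your provider's current pricing:

```python
# Back-of-the-envelope cost estimate for one review.
# Prices are placeholder assumptions, not current provider pricing.
PRICE_PER_1K_INPUT = 0.0025    # USD per 1K input tokens (assumed)
PRICE_PER_1K_OUTPUT = 0.0100   # USD per 1K output tokens (assumed)

def estimate_review_cost(input_tokens: int, output_tokens: int) -> float:
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT + \
           (output_tokens / 1000) * PRICE_PER_1K_OUTPUT

# e.g. a medium PR: ~8K tokens of diff and context in, ~1K tokens of review out
print(f"${estimate_review_cost(8_000, 1_000):.3f}")   # -> $0.030
```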
Key Security Best Practices
- Use a dedicated API key for Burg AI
- Set usage limits on your provider dashboard
- Rotate keys periodically
- Monitor usage on your provider's dashboard