
LLM Providers

How third-party LLM providers are used

How Third-Party Models Are Used

Burg AI sends your code to the LLM provider you choose, using your own API key (bring your own key, BYOK).

We support:

  • OpenAI (GPT-4 series)
  • Anthropic (Claude 3 series)
  • OpenRouter (multiple models)
  • Google Gemini
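
For illustration, a BYOK provider selection could be expressed as the sketch below. The field names (provider, model, apiKeyEnv) and the type are hypothetical, not Burg AI's actual configuration schema.

```typescript
// Hypothetical BYOK provider configuration -- field names are illustrative,
// not Burg AI's actual schema.
type LlmProviderConfig = {
  provider: "openai" | "anthropic" | "openrouter" | "google";
  model: string;     // e.g. "gpt-4o" or "claude-3-opus"
  apiKeyEnv: string; // environment variable that holds YOUR key (BYOK)
};

const config: LlmProviderConfig = {
  provider: "anthropic",
  model: "claude-3-opus",
  apiKeyEnv: "ANTHROPIC_API_KEY",
};
```

Referencing the key through an environment variable, rather than embedding it in the configuration, keeps it out of version control.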

What Data Is Sent

To the LLM provider:

Sent                    Not Sent
PR diff                 Full repository
Changed file content    Unchanged files
File paths              User identity
Custom instructions     API keys of other users
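
As a rough illustration of the table above, the payload assembled for the provider can be thought of as the following shape. The interface and field names are hypothetical, not Burg AI's actual wire format.

```typescript
// Illustrative shape of the data sent to the LLM provider -- names are
// hypothetical, not Burg AI's actual wire format.
interface ReviewRequest {
  diff: string;                // the PR diff
  changedFiles: Array<{
    path: string;              // paths of changed files only
    content: string;           // content of changed files only
  }>;
  customInstructions?: string; // your configured instructions, if any
}
// Not present anywhere in the payload: the full repository, unchanged files,
// your identity, or other users' API keys.
```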

BYOK Implications

With BYOK:

  • You control costs: usage is billed to your own account with the provider
  • You control data policies: data handling is governed by your agreement with the provider
  • You can audit usage: check your provider's dashboard

Provider Data Policies

Each provider has its own data retention policy:

Provider     Training on API Data                   Data Retention
OpenAI       No (API data not used for training)    30 days for abuse monitoring
Anthropic    No                                     Ephemeral
Google       Varies by model/region                 Varies
OpenRouter   Pass-through                           Depends on underlying model

Check each provider's terms for current policies.

Burg AI's Role

Burg AI:

  • Routes your request to the provider
  • Never stores the response after posting
  • Never trains on your code
  • Cannot access other users' data via your key
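
A minimal sketch of that role, assuming hypothetical helper names (callProvider, postReviewComment) that stand in for Burg AI's internals:

```typescript
// Minimal sketch of the routing flow described above. All names here are
// illustrative placeholders, not Burg AI's actual internals.
declare function callProvider(payload: string, apiKey: string): Promise<string>;
declare function postReviewComment(review: string): Promise<void>;

async function reviewPullRequest(payload: string, userApiKey: string): Promise<void> {
  // Route the request to the chosen provider using the user's own key (BYOK).
  const review = await callProvider(payload, userApiKey);

  // Post the result back to the pull request...
  await postReviewComment(review);

  // ...and keep nothing afterwards: the response is not stored and the code
  // is never used for training.
}
```

The key point is the absence of any persistence step after the review is posted.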