# How to Configure BYOLLM
BYOLLM (Bring Your Own LLM) allows Enterprise customers to use their own AI provider API keys instead of aprity's default LLM infrastructure.
## Prerequisites
- An aprity Enterprise plan.
- An active API key from a supported provider.
## Supported Providers
| Provider | Requirements |
|---|---|
| Azure OpenAI | Azure subscription, deployed model, API key and endpoint |
| Anthropic (Claude) | Anthropic API key |
| OpenAI | OpenAI API key |
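The per-provider details above can be gathered into a short note before contacting support. A sketch of what to collect (the field names are illustrative, not an aprity file format; never include the API key itself):

```yaml
provider: azure-openai                        # or: anthropic, openai
endpoint: https://example.openai.azure.com    # Azure OpenAI only
model: gpt-4o                                 # Azure deployment / model name
# Do NOT put the API key here or in email; support provides a secure channel.
```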
## How to Set Up BYOLLM
BYOLLM configuration is handled by the aprity support team to ensure secure key management.
1. Contact support@aprity.ai with your BYOLLM request.
2. Provide:
   - Your preferred LLM provider.
   - For Azure OpenAI: the deployment endpoint URL and model name.
3. The aprity team securely configures your API keys.
4. You receive confirmation once the setup is complete.
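Before sending your request, it can help to sanity-check the details you plan to share. A minimal sketch, assuming Python is available locally; the `validate_byollm_request` helper and the provider identifiers are hypothetical, not part of aprity's tooling:

```python
from urllib.parse import urlparse

def validate_byollm_request(provider: str, endpoint: str = "", model: str = "") -> list[str]:
    """Return a list of problems with the request details (empty list = looks OK)."""
    problems = []
    if provider not in ("azure-openai", "anthropic", "openai"):
        problems.append(f"unsupported provider: {provider!r}")
    if provider == "azure-openai":
        # Azure OpenAI additionally needs a deployment endpoint URL and model name.
        if urlparse(endpoint).scheme != "https":
            problems.append("Azure endpoint must be an https:// URL")
        if not model:
            problems.append("Azure OpenAI requires a model (deployment) name")
    return problems

print(validate_byollm_request("azure-openai",
                              endpoint="https://example.openai.azure.com",
                              model="gpt-4o"))  # → []
```

This only checks the shape of the request, not whether the key or endpoint actually works; support performs the real verification during setup.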
> **Warning:** Never share API keys through unencrypted email. The aprity support team will provide a secure channel for key exchange during setup.
## How BYOLLM Affects Documentation
- Documentation quality depends on the model you choose. aprity is optimized for Claude and GPT-4 class models.
- Analysis prompts remain the same regardless of provider.
- Token usage is billed by your LLM provider, not by aprity.
## Reverting to Default
To switch back to aprity's default LLM infrastructure, contact support@aprity.ai. The change takes effect on the next scan.
> **Note:** BYOLLM does not affect deterministic processing (metadata extraction, dependency graphs, scan orchestration). Only the AI explanation and documentation synthesis phases use your custom LLM.