How to Configure BYOLLM

BYOLLM (Bring Your Own LLM) allows Enterprise customers to use their own AI provider API keys instead of aprity's default LLM infrastructure.

Prerequisites

  • An active Enterprise plan.
  • An active API key from a supported provider.

Supported Providers

| Provider | Requirements |
| --- | --- |
| Azure OpenAI | Azure subscription, deployed model, API key and endpoint |
| Anthropic (Claude) | Anthropic API key |
| OpenAI | OpenAI API key |
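The requirements in the table above can be summarized as a per-provider checklist. The structure below is an illustrative sketch for gathering details before contacting support; the field names are hypothetical, not an aprity configuration format:

```python
# Hypothetical checklist of the details to collect per provider,
# mirroring the table above. Field names are illustrative only.
REQUIRED_DETAILS = {
    "azure-openai": ["api_key", "endpoint", "deployment_name"],
    "anthropic": ["api_key"],
    "openai": ["api_key"],
}

def missing_details(provider: str, provided: dict) -> list[str]:
    """Return the required fields not yet supplied for a provider."""
    return [f for f in REQUIRED_DETAILS[provider] if not provided.get(f)]

print(missing_details("azure-openai", {"api_key": "..."}))
```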

How to Set Up BYOLLM

BYOLLM configuration is handled by the aprity support team to ensure secure key management.

  1. Contact support@aprity.ai with your BYOLLM request.
  2. Provide:
    • Your preferred LLM provider.
    • For Azure OpenAI: the deployment endpoint URL and model name.
  3. The aprity team securely configures your API keys.
  4. You receive confirmation once the setup is complete.
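Before sending your Azure OpenAI details in step 2, it can help to sanity-check them locally. The sketch below assumes Azure's standard endpoint shape (`https://<resource>.openai.azure.com`); the helper function is hypothetical, not an aprity or Azure API:

```python
import re

# Azure OpenAI endpoints follow https://<resource>.openai.azure.com;
# this pattern is a loose sanity check, not a full validator.
AZURE_ENDPOINT = re.compile(r"^https://[a-z0-9-]+\.openai\.azure\.com/?$")

def check_azure_details(endpoint: str, deployment: str) -> list[str]:
    """Return a list of problems found; an empty list means the details look sane."""
    problems = []
    if not AZURE_ENDPOINT.match(endpoint):
        problems.append(f"endpoint does not look like an Azure OpenAI URL: {endpoint}")
    if not deployment.strip():
        problems.append("deployment (model) name is empty")
    return problems

print(check_azure_details("https://my-resource.openai.azure.com", "gpt-4o"))  # → []
```

This only catches obvious transcription mistakes; the support team verifies the actual credentials during setup.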
Warning:

Never share API keys through unencrypted email. The aprity support team will provide a secure channel for key exchange during setup.

How BYOLLM Affects Documentation

  • Documentation quality depends on the model you choose. aprity is optimized for Claude and GPT-4 class models.
  • Analysis prompts remain the same regardless of provider.
  • Token usage is billed by your LLM provider, not by aprity.

Reverting to Default

To switch back to aprity's default LLM infrastructure, contact support@aprity.ai. The change takes effect on the next scan.

Info:

BYOLLM does not affect deterministic processing (metadata extraction, dependency graphs, scan orchestration). Only the AI explanation and documentation synthesis phases use your custom LLM.