Documentation Index
Fetch the complete documentation index at: https://cognisafeltd.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.
How it works
Azure OpenAI uses the same OpenAI-compatible protocol, so Cognisafe intercepts it in proxy mode, using the same mechanism as standard OpenAI. The proxy is simply configured to forward to your Azure resource's endpoint instead of https://api.openai.com.
Proxy configuration
Set UPSTREAM_URL on the Cognisafe proxy to your Azure OpenAI resource endpoint. Azure-specific request details (the api-version query parameter, the deployment name in the path) are passed through unchanged.
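As a sketch of what this forwarding does (the resource and deployment names below are hypothetical placeholders, not values from your account):

```python
# Hypothetical values: the proxy swaps its own host for UPSTREAM_URL and
# forwards the rest of the request path unchanged.
UPSTREAM_URL = "https://my-resource.openai.azure.com"  # your Azure resource (placeholder)

# Path as received by the proxy from the SDK:
incoming = "/openai/deployments/my-deployment/chat/completions"

# Path as forwarded to Azure:
forwarded = UPSTREAM_URL + incoming
```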
SDK setup
The SDK patch is identical to standard OpenAI: tell the Azure client to use the proxy URL as its endpoint.
Self-hosted deployment
When running Cognisafe on Railway or a VM alongside your Azure infrastructure, set API_BACKEND_URL on the proxy to your deployed API service URL and UPSTREAM_URL to your Azure resource. Then point the SDK's azure_endpoint at the deployed proxy URL (e.g., https://proxy.cognisafe.uk).
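A minimal sketch of this setup with the OpenAI Python SDK (v1+). The API service URL, resource name, key, and API version below are placeholders, not values Cognisafe provides:

```python
from openai import AzureOpenAI  # requires the openai package, v1 or later

# Proxy service environment (placeholders; set these on the Cognisafe proxy):
#   API_BACKEND_URL = https://api.cognisafe.example        (your deployed API service)
#   UPSTREAM_URL    = https://my-resource.openai.azure.com (your Azure resource)

# The application then targets the proxy, not Azure directly:
client = AzureOpenAI(
    azure_endpoint="https://proxy.cognisafe.uk",  # your deployed proxy URL
    api_key="...",                                # your Azure OpenAI key, unchanged
    api_version="2024-06-01",                     # example version; passed through as-is
)
```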
Azure OpenAI API versions are passed through as-is. The proxy does not interpret or modify the api-version query parameter.
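To illustrate the pass-through, only the host changes between the client's request and the forwarded request; the query string, including api-version, is untouched (URLs below are hypothetical examples):

```python
from urllib.parse import urlsplit, parse_qs

# URL as the SDK sends it to the proxy (deployment name and version are placeholders):
proxy_url = "https://proxy.cognisafe.uk/openai/deployments/my-deployment/chat/completions?api-version=2024-06-01"

# The proxy replaces only its own host with the Azure resource host:
upstream_url = proxy_url.replace(
    "https://proxy.cognisafe.uk", "https://my-resource.openai.azure.com", 1
)

# The api-version parameter reaches Azure exactly as the client sent it:
query = parse_qs(urlsplit(upstream_url).query)
```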
