

How it works

Azure OpenAI speaks the same OpenAI-compatible protocol, so Cognisafe intercepts it in proxy mode, using the same mechanism as standard OpenAI. The Cognisafe proxy is simply configured to forward to your Azure resource’s endpoint instead of https://api.openai.com.
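To make the pass-through concrete, here is an illustrative sketch (not Cognisafe’s actual implementation) of the only rewrite a pass-through proxy performs: the scheme and host are swapped for the upstream’s, while the Azure-specific path and query string are forwarded untouched.

```python
# Illustrative only: a pass-through proxy's URL rewrite. Just the scheme and
# host are replaced; the deployment path and api-version query survive intact.
from urllib.parse import urlsplit, urlunsplit

def forward_url(request_url: str, upstream: str) -> str:
    """Swap request_url's scheme/host for upstream's, keeping path and query."""
    req, up = urlsplit(request_url), urlsplit(upstream)
    return urlunsplit((up.scheme, up.netloc, req.path, req.query, ""))

print(forward_url(
    "http://localhost:8080/openai/deployments/gpt-4o-deployment/chat/completions?api-version=2024-02-01",
    "https://your-resource-name.openai.azure.com",
))
# https://your-resource-name.openai.azure.com/openai/deployments/gpt-4o-deployment/chat/completions?api-version=2024-02-01
```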

Proxy configuration

Set UPSTREAM_URL on the Cognisafe proxy to your Azure OpenAI resource endpoint:

# In your proxy service environment
UPSTREAM_URL=https://your-resource-name.openai.azure.com

No other proxy changes are needed. Azure-specific parameters (the api-version query parameter and the deployment name) are passed through unchanged.
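Before deploying, you may want to sanity-check the value the way a startup script might. This is a hypothetical check, not part of Cognisafe:

```python
from urllib.parse import urlsplit

# Hypothetical startup check: fail fast if UPSTREAM_URL is missing or malformed.
def check_upstream(env: dict) -> str:
    url = env.get("UPSTREAM_URL", "")
    parts = urlsplit(url)
    if parts.scheme != "https" or not parts.netloc:
        raise RuntimeError(f"UPSTREAM_URL looks malformed: {url!r}")
    return url

print(check_upstream({"UPSTREAM_URL": "https://your-resource-name.openai.azure.com"}))
# https://your-resource-name.openai.azure.com
```

In a real process you would call `check_upstream(dict(os.environ))` at startup.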

SDK setup

The SDK patch is identical to standard OpenAI. Tell the Azure client to use the proxy URL as its endpoint:

import cognisafe
from openai import AzureOpenAI

cognisafe.configure(
    api_key="csk_your_key_here",
    project_id="my-app",
)
cognisafe.patch_openai()

# Point AzureOpenAI at the Cognisafe proxy
client = AzureOpenAI(
    azure_endpoint="http://localhost:8080",  # Cognisafe proxy
    api_version="2024-02-01",
    api_key="your-azure-openai-key",
)

response = client.chat.completions.create(
    model="gpt-4o-deployment",   # your Azure deployment name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is Azure OpenAI?"},
    ],
)

print(response.choices[0].message.content)
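Note that model above is your Azure deployment name, not a raw model ID. The AzureOpenAI client builds the deployment and api-version into the request URL, which the proxy forwards verbatim. An illustrative sketch of that URL shape (the SDK constructs this for you):

```python
# Illustrative: the path an Azure OpenAI chat completion request uses.
# The deployment name and api-version live in the path and query string,
# which the proxy forwards unchanged to your Azure resource.
def azure_chat_path(deployment: str, api_version: str) -> str:
    return f"/openai/deployments/{deployment}/chat/completions?api-version={api_version}"

print(azure_chat_path("gpt-4o-deployment", "2024-02-01"))
# /openai/deployments/gpt-4o-deployment/chat/completions?api-version=2024-02-01
```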

Self-hosted deployment

When running Cognisafe on Railway or a VM alongside your Azure infrastructure, set API_BACKEND_URL on the proxy to your deployed API service URL and UPSTREAM_URL to your Azure resource:

# Proxy service env vars
UPSTREAM_URL=https://your-resource-name.openai.azure.com
API_BACKEND_URL=https://api.cognisafe.uk   # or your self-hosted API URL
PROXY_API_KEY=csk_your_key_here

Point your Azure OpenAI client’s azure_endpoint at the deployed proxy URL (e.g., https://proxy.cognisafe.uk).

Azure OpenAI API versions are passed through as-is: the proxy does not interpret or modify the api-version query parameter.

If your Azure OpenAI resource uses managed identity or Azure AD token authentication rather than an API key, pass the token in the Authorization header. The proxy forwards all headers to the upstream unchanged.
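A hedged sketch of that header pass-through. Here `fetch_token` is a stand-in for a real credential source (in production you would use something like azure.identity’s DefaultAzureCredential; that dependency is assumed, not shown):

```python
from typing import Callable

# Sketch: build the Authorization header the upstream Azure resource expects.
# Because the proxy forwards headers unchanged, token auth works end to end.
def azure_ad_headers(fetch_token: Callable[[], str]) -> dict:
    return {"Authorization": f"Bearer {fetch_token()}"}

# Stub token source for illustration only; use a real Azure AD credential in practice.
print(azure_ad_headers(lambda: "stub-token")["Authorization"])
# Bearer stub-token
```

With the openai SDK, the equivalent is constructing AzureOpenAI with azure_ad_token_provider instead of api_key; the SDK then attaches the bearer token on each request itself.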