Documentation Index

Fetch the complete documentation index at: https://cognisafeltd.mintlify.app/llms.txt

Use this file to discover all available pages before exploring further.

How it works

The Mistral integration uses proxy mode. Because the Mistral API is OpenAI-compatible, the Cognisafe SDK rewrites the Mistral client's base URL to point at the proxy. Requests flow through the proxy to https://api.mistral.ai and are logged asynchronously.
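The base-URL rewrite above can be sketched in plain Python. This is an illustrative sketch, not the SDK's internals: the proxy host (`proxy.cognisafe.example`) and the helper name `rewrite_base_url` are assumptions for the example.

```python
from urllib.parse import urlsplit, urlunsplit

# Assumed proxy endpoint for illustration only.
PROXY_BASE = "https://proxy.cognisafe.example"

def rewrite_base_url(original_url: str, proxy_base: str = PROXY_BASE) -> str:
    """Swap the scheme and host of original_url for the proxy's,
    keeping the request path and query intact. The proxy then
    forwards the request to the original upstream."""
    parts = urlsplit(original_url)
    proxy = urlsplit(proxy_base)
    return urlunsplit((proxy.scheme, proxy.netloc, parts.path, parts.query, ""))

# A chat-completions request aimed at Mistral...
url = "https://api.mistral.ai/v1/chat/completions"
# ...is redirected to the proxy with the same path:
print(rewrite_base_url(url))  # https://proxy.cognisafe.example/v1/chat/completions
```

Because only the base URL changes, the rest of the client code (models, messages, parameters) stays exactly as it would be against the Mistral API directly.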

Installation

pip install "cognisafe[all]"

Setup

import cognisafe
from mistralai import Mistral

# Configure Cognisafe and patch the Mistral client
# before any client is created.
cognisafe.configure(
    api_key="csk_your_key_here",
    project_id="my-app",
)
cognisafe.patch_mistral()

# From here on, requests are routed through the proxy transparently.
client = Mistral(api_key="your-mistral-api-key")

response = client.chat.complete(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Hello from Mistral!"}],
)

print(response.choices[0].message.content)

Supported models

All Mistral models are supported. The proxy forwards to https://api.mistral.ai by default. To change the upstream, set the UPSTREAM_URL environment variable on the proxy service.

Proxy configuration

Set UPSTREAM_URL on the Cognisafe proxy to point at Mistral’s API:
UPSTREAM_URL=https://api.mistral.ai
To front multiple providers, deploy a separate proxy container per provider, each with its own UPSTREAM_URL value.
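A per-provider deployment might look like the following. This is a sketch: the image name (`cognisafe/proxy`), container names, ports, and the second provider's URL are placeholders, not documented values.

```shell
# Proxy instance dedicated to Mistral
docker run -d --name cognisafe-proxy-mistral \
  -e UPSTREAM_URL=https://api.mistral.ai \
  -p 8081:8080 \
  cognisafe/proxy:latest

# A second instance for another OpenAI-compatible provider
docker run -d --name cognisafe-proxy-other \
  -e UPSTREAM_URL=https://api.example-provider.com \
  -p 8082:8080 \
  cognisafe/proxy:latest
```

Point each SDK integration at the proxy instance that matches its provider, so every request is forwarded to the correct upstream.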