Documentation Index
Fetch the complete documentation index at: https://cognisafeltd.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.
Installation
Install via `npm install cognisafe`. The SDK ships with full TypeScript types; no separate `@types/` package is needed.
TypeScript support
The SDK is written in TypeScript and exports proper type definitions. It works with both `import` (ESM) and `require` (CJS):
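For example (assuming the package exposes a default export; the exact export shape is not shown on this page):

```typescript
// ESM
import cognisafe from "cognisafe";

// CJS (in a CommonJS module instead)
const cognisafe = require("cognisafe");
```

Use one style or the other per file; the two lines above would conflict in the same module.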
Configuration
Call `cognisafe.configure()` once at startup, before any LLM calls:
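A minimal sketch of the call, with placeholder values; the option names follow the configuration options table, and the env-var fallbacks match its Default column:

```typescript
import cognisafe from "cognisafe";

// Placeholder values. apiKey and projectId may also be picked up from
// COGNISAFE_API_KEY and COGNISAFE_PROJECT_ID.
cognisafe.configure({
  apiKey: process.env.COGNISAFE_API_KEY, // csk_...
  projectId: "my-project",
  proxyUrl: "http://localhost:8080", // optional, default shown
  apiUrl: "http://localhost:8000",   // optional, default shown
});
```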
Configuration options
| Option | Type | Required | Default | Description |
|---|---|---|---|---|
| `apiKey` | `string` | Yes* | `COGNISAFE_API_KEY` | Your project API key (`csk_...`) |
| `projectId` | `string` | Yes* | `COGNISAFE_PROJECT_ID` | Project identifier for grouping requests |
| `proxyUrl` | `string` | No | `http://localhost:8080` | Cognisafe Go proxy URL |
| `apiUrl` | `string` | No | `http://localhost:8000` | Cognisafe FastAPI backend URL |
Patching providers
OpenAI
`patchOpenAI()` sets the `baseURL` on the default OpenAI client to the Cognisafe proxy. The proxy forwards to https://api.openai.com and logs every call asynchronously.
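A sketch of the flow, assuming the default export exposes `configure()` and `patchOpenAI()` as described above (model name and values are illustrative):

```typescript
import OpenAI from "openai";
import cognisafe from "cognisafe";

cognisafe.configure({
  apiKey: process.env.COGNISAFE_API_KEY,
  projectId: "my-project",
});

// After patching, the default OpenAI client targets the Cognisafe proxy.
cognisafe.patchOpenAI();

const openai = new OpenAI();
// This request goes to the proxy (http://localhost:8080 by default),
// is forwarded to https://api.openai.com, and is logged asynchronously.
const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini", // illustrative model id
  messages: [{ role: "user", content: "Hello" }],
});
```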
Anthropic
The Anthropic patch wraps `messages.create` and ships payloads to the backend after the response is returned to your code.
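A sketch, assuming the patch method mirrors the OpenAI naming (the name `patchAnthropic()` is an assumption, not confirmed by this page; the model id is illustrative):

```typescript
import Anthropic from "@anthropic-ai/sdk";
import cognisafe from "cognisafe";

// Assumed method name, mirroring patchOpenAI(); check the API reference.
cognisafe.patchAnthropic();

const anthropic = new Anthropic();
// messages.create behaves as usual; the payload is shipped to the backend
// only after the response has been returned to your code.
const msg = await anthropic.messages.create({
  model: "claude-3-5-sonnet-latest", // illustrative model id
  max_tokens: 256,
  messages: [{ role: "user", content: "Hello" }],
});
```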
The trace wrapper
For custom LLM endpoints or any call not covered by a patch method:
`trace` returns a wrapped version of your function. The wrapper captures inputs and outputs and sends them to `/internal/log` after the function resolves; your code is not blocked.
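The wrapping behavior described above can be sketched locally (this is an illustration of the pattern, not the SDK's implementation; `logAsync` stands in for the async shipment to `/internal/log`):

```typescript
// Stand-in for the SDK's asynchronous shipment to /internal/log.
const logged: Array<{ inputs: unknown[]; output: unknown }> = [];
function logAsync(entry: { inputs: unknown[]; output: unknown }): void {
  logged.push(entry); // the real SDK sends this over HTTP, fire-and-forget
}

// Sketch of a trace-style wrapper: run the function, log after it
// resolves, and return the result to the caller unchanged.
function traceSketch<A extends unknown[], R>(
  fn: (...args: A) => Promise<R>
): (...args: A) => Promise<R> {
  return async (...args: A) => {
    const result = await fn(...args);
    logAsync({ inputs: args, output: result });
    return result;
  };
}

// Wrap a hypothetical custom LLM call.
const callModel = async (prompt: string) => `echo: ${prompt}`;
const tracedCall = traceSketch(callModel);
```

With the real SDK you would pass your own function to `cognisafe.trace(...)` and call the returned wrapper in its place.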
The Node SDK is published under the same package name (`cognisafe`) as the Python SDK but is a completely separate implementation. Install via `npm install cognisafe`, not pip.
