Mistral AI
Integrate Mistral AI models into your applications
The Mistral SDK provides access to Mistral AI models. Decyra captures all Mistral API calls, enabling observability and control over your Mistral model interactions.
Prerequisites
- Decyra account with API key
- Node.js 18+ installed
- @mistralai/mistralai package installed
Installation
npm install @mistralai/mistralai @decyra/sdk
Integration
Configure the Mistral client to route requests through Decyra's proxy:
import { Mistral } from '@mistralai/mistralai';

const client = new Mistral({
  apiKey: process.env.MISTRAL_API_KEY!,
  serverURL: 'https://proxy.decyra.com',
  defaultHeaders: {
    'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
  },
});
Use the client for a chat completion:
const chatResponse = await client.chat.complete({
  model: 'mistral-large-latest',
  messages: [
    { role: 'user', content: 'Explain transformer architecture' },
  ],
  temperature: 0.7,
});

console.log(chatResponse.choices[0].message.content);
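The response also carries a usage block that feeds the token metrics captured below. A minimal sketch of reading it, assuming the camelCase field names used by the Mistral TypeScript SDK (`summarizeUsage` is a hypothetical helper, not part of either SDK):

```typescript
// Shape of the usage block returned alongside a completion (assumed camelCase,
// per the Mistral TypeScript SDK's response types).
interface TokenUsage {
  promptTokens: number;
  completionTokens: number;
  totalTokens: number;
}

// Hypothetical helper: renders usage as a compact one-line summary.
function summarizeUsage(usage: TokenUsage): string {
  return `in=${usage.promptTokens} out=${usage.completionTokens} total=${usage.totalTokens}`;
}

// With the chatResponse above this would be:
//   console.log(summarizeUsage(chatResponse.usage));
console.log(summarizeUsage({ promptTokens: 12, completionTokens: 48, totalTokens: 60 }));
```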
What Gets Captured
| Field | Description |
|---|---|
| Model | Mistral model identifier |
| Temperature | Temperature parameter |
| Messages | Conversation messages array |
| Prompt Hash | Hash of the prompt |
| Response Time | API call duration |
| Token Usage | Input/output tokens |
| Cost | Estimated API cost |
| Max Tokens | Maximum response tokens |
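The Cost field in the table above is estimated from token usage and per-model rates. A sketch of how such an estimate can be computed; the rates below are placeholders for illustration only, not Decyra's or Mistral's actual pricing:

```typescript
// Illustrative per-million-token rates; real pricing varies by model and over time.
const RATES: Record<string, { inputPerM: number; outputPerM: number }> = {
  'mistral-large-latest': { inputPerM: 2.0, outputPerM: 6.0 },
};

// Returns the estimated cost in USD, or 0 when the model's rate is unknown.
function estimateCost(model: string, inputTokens: number, outputTokens: number): number {
  const rate = RATES[model];
  if (!rate) return 0;
  return (inputTokens * rate.inputPerM + outputTokens * rate.outputPerM) / 1_000_000;
}

console.log(estimateCost('mistral-large-latest', 1000, 500));
```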
Verify
Check your Decyra dashboard: Mistral API calls appear in the traces view, and each chat completion shows the captured fields listed above.