
Mistral AI

Integrate Mistral AI models into your applications


The Mistral SDK provides access to Mistral AI models from Node.js. When requests are routed through Decyra's proxy, every Mistral API call is captured, giving you observability and control over your model interactions.

Prerequisites

  • Decyra account with API key
  • Node.js 18+ installed
  • @mistralai/mistralai package installed

Installation

npm install @mistralai/mistralai @decyra/sdk

Integration

Configure the Mistral client to route requests through Decyra's proxy:

import { Mistral } from '@mistralai/mistralai';

const client = new Mistral({
  apiKey: process.env.MISTRAL_API_KEY!,
  // Send requests to Decyra's proxy instead of the default Mistral endpoint
  serverURL: 'https://proxy.decyra.com',
  defaultHeaders: {
    // Identifies your Decyra project so calls are captured in your dashboard
    'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
  },
});

Use the client for chat completion:

const chatResponse = await client.chat.complete({
  model: 'mistral-large-latest',
  messages: [
    { role: 'user', content: 'Explain transformer architecture' }
  ],
  temperature: 0.7,
});

console.log(chatResponse.choices[0].message.content);

What Gets Captured

Field           Description
Model           Mistral model identifier
Temperature     Temperature parameter
Messages        Conversation messages array
Prompt Hash     Hash of the prompt
Response Time   API call duration
Token Usage     Input/output tokens
Cost            Estimated API cost
Max Tokens      Maximum response tokens
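
The model and token usage fields are also available directly on the SDK response, which makes it easy to spot-check what Decyra records for a given call. A minimal sketch, continuing with the client configured above and assuming the response exposes a usage object with promptTokens, completionTokens, and totalTokens fields (verify these names against your SDK version's types):

const response = await client.chat.complete({
  model: 'mistral-large-latest',
  messages: [{ role: 'user', content: 'Explain transformer architecture' }],
});

// Compare these local values with the trace Decyra shows for this call.
// Field names are assumed from the SDK's typings; adjust for your version.
console.log({
  model: response.model,
  promptTokens: response.usage?.promptTokens,
  completionTokens: response.usage?.completionTokens,
  totalTokens: response.usage?.totalTokens,
});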

Verify

Open your Decyra dashboard and check the traces view for your Mistral API calls. Each chat completion appears with the captured fields listed above.

Next Steps