
Azure OpenAI

Use OpenAI models through Azure's enterprise platform

TypeScript · medium

Azure OpenAI provides managed access to OpenAI models. Decyra captures all Azure OpenAI API calls, providing observability and control over your Azure AI workloads.

Prerequisites

  • Decyra account with API key
  • Node.js 18+ installed
  • @azure/openai package installed
  • Azure OpenAI resource configured

Installation

npm install @azure/openai @decyra/sdk

Integration

Configure the AzureOpenAI client to route requests through Decyra's proxy:

import { AzureOpenAI } from '@azure/openai';

const client = new AzureOpenAI({
  // Route all requests through Decyra's proxy instead of your Azure endpoint
  endpoint: 'https://proxy.decyra.com',
  apiKey: process.env.AZURE_OPENAI_API_KEY!,
  apiVersion: '2024-02-15-preview',
  defaultHeaders: {
    // Identifies your Decyra project so calls appear in your dashboard
    'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
  },
});
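The client depends on two environment variables; if either is unset, the failure surfaces only at request time as an opaque authentication error. One option is a fail-fast check at startup. `requireEnv` below is a hypothetical helper, not part of either SDK:

```typescript
// Hypothetical helper: throw at startup if a required env var is unset,
// rather than failing later with an opaque auth error from the API.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

Calling `requireEnv('AZURE_OPENAI_API_KEY')` and `requireEnv('DECYRA_API_KEY')` before constructing the client makes misconfiguration visible immediately.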

Use the client for chat completions:

// With Azure OpenAI, `model` takes your deployment name, not the base model name
const deploymentName = 'gpt-4';

const result = await client.chat.completions.create({
  model: deploymentName,
  messages: [
    { role: 'user', content: 'Explain microservices architecture' }
  ],
  temperature: 0.7,
});

console.log(result.choices[0].message.content);
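Indexing `choices[0].message.content` directly can log `null`, since the content field may be empty (for example on content-filtered responses). A defensive accessor is one way to handle this; the interfaces below are simplified stand-ins for the SDK's response types, assumed for illustration:

```typescript
// Simplified stand-ins for the SDK's chat completion response types.
interface Choice {
  message?: { content?: string | null };
}
interface ChatResult {
  choices: Choice[];
}

// Return the first assistant reply, or an empty string if none exists.
function firstReply(result: ChatResult): string {
  return result.choices[0]?.message?.content ?? '';
}
```

With this in place, `console.log(firstReply(result))` never prints `null`, even when the response carries no message content.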

What Gets Captured

| Field | Description |
| --- | --- |
| Deployment Name | Azure deployment identifier |
| Model | Underlying model name |
| Temperature | Temperature parameter |
| Messages | Conversation messages array |
| Prompt Hash | Hash of the prompt |
| Response Time | API call duration |
| Token Usage | Input/output tokens |
| Cost | Estimated Azure cost |
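The Cost field is derived from token usage. As a rough sketch of how such an estimate works (the per-1K-token rates below are illustrative placeholders, not official Azure pricing, and the `Usage` interface is a simplified stand-in for the API's usage object):

```typescript
// Simplified stand-in for the chat completion response's usage object.
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
}

// Estimate cost in USD from token counts. The default rates are
// illustrative placeholders only; real Azure pricing varies by
// model, region, and deployment type.
function estimateCostUSD(
  usage: Usage,
  inputPer1K = 0.03,
  outputPer1K = 0.06
): number {
  return (
    (usage.prompt_tokens / 1000) * inputPer1K +
    (usage.completion_tokens / 1000) * outputPer1K
  );
}
```

Input and output tokens are priced separately because completion tokens typically cost more than prompt tokens.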

Verify

Open your Decyra dashboard and check the Traces page. All Azure OpenAI calls appear there with their deployment name and model information.

Next Steps