# Azure OpenAI
Use OpenAI models through Azure's enterprise platform
Azure OpenAI provides managed access to OpenAI models through Microsoft's enterprise platform. Decyra captures every Azure OpenAI API call routed through its proxy, giving you observability and control over your Azure AI workloads.
## Prerequisites
- Decyra account with API key
- Node.js 18+ installed
- openai npm package installed (it provides the AzureOpenAI client)
- Azure OpenAI resource configured
## Installation

```bash
npm install openai @decyra/sdk
```
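The code in this guide reads both keys from environment variables. Export them before running (the variable names match the client configuration below):

```shell
export AZURE_OPENAI_API_KEY="your-azure-openai-key"
export DECYRA_API_KEY="your-decyra-api-key"
```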
## Integration

Configure the AzureOpenAI client to route requests through Decyra's proxy:
```typescript
// AzureOpenAI is exported by the official openai package, not @azure/openai
import { AzureOpenAI } from 'openai';

const client = new AzureOpenAI({
  endpoint: 'https://proxy.decyra.com',
  apiKey: process.env.AZURE_OPENAI_API_KEY!,
  apiVersion: '2024-02-15-preview',
  defaultHeaders: {
    'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
  },
});
```
Use the client for chat completions. With Azure, the `model` field takes your deployment name rather than a raw model name:
```typescript
const deploymentName = 'gpt-4';

const result = await client.chat.completions.create({
  model: deploymentName,
  messages: [
    { role: 'user', content: 'Explain microservices architecture' },
  ],
  temperature: 0.7,
});

console.log(result.choices[0].message.content);
```
## What Gets Captured
| Field | Description |
|---|---|
| Deployment Name | Azure deployment identifier |
| Model | Underlying model name |
| Temperature | Temperature parameter |
| Messages | Conversation messages array |
| Prompt Hash | Hash of the prompt |
| Response Time | API call duration |
| Token Usage | Input/output tokens |
| Cost | Estimated Azure cost |
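Two of the captured fields can be illustrated locally. The sketch below shows one plausible way to derive a prompt hash (SHA-256 over the serialized messages) and an estimated cost from token counts. Decyra's actual hashing scheme and the per-token rates are assumptions for illustration, not documented behavior or current Azure pricing:

```typescript
import { createHash } from 'node:crypto';

type ChatMessage = { role: string; content: string };

// Hash the serialized messages array. Hypothetical scheme: Decyra's real
// prompt-hash implementation may differ.
function promptHash(messages: ChatMessage[]): string {
  return createHash('sha256').update(JSON.stringify(messages)).digest('hex');
}

// Estimate cost from token usage. The rates below are placeholders;
// real Azure rates vary by model, region, and over time.
function estimateCostUsd(
  promptTokens: number,
  completionTokens: number,
  ratesPer1K = { input: 0.03, output: 0.06 }, // placeholder gpt-4-class rates
): number {
  return (
    (promptTokens / 1000) * ratesPer1K.input +
    (completionTokens / 1000) * ratesPer1K.output
  );
}

const messages = [
  { role: 'user', content: 'Explain microservices architecture' },
];
console.log(promptHash(messages)); // 64-character hex digest
console.log(estimateCostUsd(500, 250)); // ≈ $0.03 with the placeholder rates
```

Token counts for the estimate come straight from the API response's `usage` field (`prompt_tokens` and `completion_tokens`).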
## Verify
Navigate to your Decyra dashboard and check the traces page. All Azure OpenAI calls will be visible with deployment and model information.