
OpenRouter

Access multiple LLMs through a unified API

TypeScript · Easy

OpenRouter provides access to multiple AI models through a unified API. Decyra captures all OpenRouter requests, giving you visibility across all models you use.

Prerequisites

  • Decyra account with API key
  • Node.js 18+ installed
  • openai npm package installed (OpenRouter exposes an OpenAI-compatible API)

Installation

npm install openai @decyra/sdk

Integration

Configure an OpenAI-compatible client for OpenRouter, routed through Decyra:

import OpenAI from 'openai';
import { wrapOpenRouter } from '@decyra/sdk';

const openai = new OpenAI({
  baseURL: 'https://proxy.decyra.com/v1',
  apiKey: process.env.OPENROUTER_API_KEY!,
  defaultHeaders: {
    'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
    'HTTP-Referer': 'https://your-app.com',
    'X-Title': 'Your App Name',
  },
});

const decyraClient = wrapOpenRouter(openai);

Use the client for chat completions:

const completion = await decyraClient.chat.completions.create({
  model: 'anthropic/claude-3-opus',
  messages: [
    { role: 'user', content: 'Compare GPT-4 and Claude' }
  ],
});

console.log(completion.choices[0].message.content);
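Responses can also be streamed. The sketch below assumes the Decyra-wrapped client passes through the standard OpenAI streaming interface, where `stream: true` returns an async iterable of chunks; the helper reassembles the full reply from the deltas:

```typescript
// Collects the content deltas from an OpenAI-style chat completion stream
// into a single string. Assumes the wrapped client preserves the standard
// async-iterable streaming shape; check the Decyra docs to confirm.
async function collectStream(
  stream: AsyncIterable<{ choices: { delta: { content?: string } }[] }>,
): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content ?? '';
  }
  return text;
}

// Usage (assumes `decyraClient` from the integration step above):
// const stream = await decyraClient.chat.completions.create({
//   model: 'anthropic/claude-3-opus',
//   messages: [{ role: 'user', content: 'Compare GPT-4 and Claude' }],
//   stream: true,
// });
// console.log(await collectStream(stream));
```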

What Gets Captured

Field           Description
Model           OpenRouter model identifier
Provider        Underlying model provider
Temperature     Sampling temperature used for the request
Messages        Conversation messages
Prompt Hash     Hash of the prompt
Response Time   API call duration
Token Usage     Input/output tokens
Cost            Estimated cost per request
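The prompt hash lets you group traces that share an identical prompt. As an illustration only (Decyra's actual hashing scheme may differ), one way to derive such a hash is a SHA-256 digest of the serialized messages:

```typescript
import { createHash } from 'node:crypto';

// Illustrative only: hashes the JSON-serialized message array with SHA-256.
// This is a sketch of the idea, not necessarily Decyra's exact scheme.
function promptHash(messages: { role: string; content: string }[]): string {
  return createHash('sha256').update(JSON.stringify(messages)).digest('hex');
}
```

Because the hash is deterministic, two requests with identical messages produce the same value, which is what makes grouping by prompt possible.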

Verify

Visit your Decyra dashboard and open the traces page. Each OpenRouter request should appear there, annotated with its provider and model information.

Next Steps