OpenRouter
Access multiple LLMs through a unified API
OpenRouter provides access to many AI models through a single, OpenAI-compatible API. Decyra captures every OpenRouter request, giving you visibility across all the models you use.
Prerequisites
- Decyra account with an API key
- Node.js 18 or later
- The `openai` package (OpenRouter exposes an OpenAI-compatible API)
Installation
```bash
npm install openai @decyra/sdk
```
Integration
Configure an OpenAI-compatible client to route OpenRouter traffic through the Decyra proxy:
```typescript
import OpenAI from 'openai';
import { wrapOpenRouter } from '@decyra/sdk';

const openai = new OpenAI({
  baseURL: 'https://proxy.decyra.com/v1',
  apiKey: process.env.OPENROUTER_API_KEY!,
  defaultHeaders: {
    'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
    'HTTP-Referer': 'https://your-app.com',
    'X-Title': 'Your App Name',
  },
});

const decyraClient = wrapOpenRouter(openai);
```
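Both keys above are read from the environment with a non-null assertion, so a missing variable only surfaces as a confusing auth error at request time. A small fail-fast helper can catch this earlier; this is an illustrative sketch, not part of `@decyra/sdk` or the `openai` package:

```typescript
// Illustrative helper (not part of @decyra/sdk): throw early if a
// required environment variable is unset, instead of sending requests
// with an undefined API key.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Demonstration with a hypothetical variable; in the snippet above you
// would call requireEnv('OPENROUTER_API_KEY') and requireEnv('DECYRA_API_KEY')
// before constructing the client.
process.env.EXAMPLE_KEY = 'sk-demo'; // placeholder value for demonstration only
const exampleKey = requireEnv('EXAMPLE_KEY');
console.log(exampleKey.length > 0); // true
```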
Use the client for chat completions:
```typescript
const completion = await decyraClient.chat.completions.create({
  model: 'anthropic/claude-3-opus',
  messages: [
    { role: 'user', content: 'Compare GPT-4 and Claude' },
  ],
});

console.log(completion.choices[0].message.content);
```
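The response follows the OpenAI chat-completion shape. Indexing `choices[0].message.content` directly can throw when a request returns no usable choice; a defensive extraction helper (illustrative, not part of either SDK) is a small safeguard:

```typescript
// Minimal shape of the fields read from a chat completion; the full
// type lives in the openai package.
interface ChatCompletionLike {
  choices: Array<{ message?: { content?: string | null } }>;
}

// Safely pull the assistant text out of a completion, falling back to
// an empty string when no content is present.
function extractContent(completion: ChatCompletionLike): string {
  return completion.choices[0]?.message?.content ?? '';
}

// Example with a mock response object:
const mockCompletion: ChatCompletionLike = {
  choices: [{ message: { content: 'GPT-4 and Claude differ in several ways.' } }],
};
console.log(extractContent(mockCompletion)); // the assistant text
console.log(extractContent({ choices: [] })); // '' (empty string, no crash)
```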
What Gets Captured
| Field | Description |
|---|---|
| Model | OpenRouter model identifier |
| Provider | Underlying model provider |
| Temperature | Temperature parameter |
| Messages | Conversation messages |
| Prompt Hash | Hash of the prompt |
| Response Time | API call duration |
| Token Usage | Input/output tokens |
| Cost | Estimated cost per request |
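Several of these fields can be approximated locally. The sketch below shows one plausible way a proxy could derive provider, prompt hash, and estimated cost; Decyra's actual hashing scheme and the per-token rates used here are assumptions, not documented behavior:

```typescript
import { createHash } from 'node:crypto';

// Provider is the prefix of an OpenRouter model identifier,
// e.g. 'anthropic/claude-3-opus' -> 'anthropic'.
function providerOf(model: string): string {
  return model.split('/')[0];
}

// A stable prompt hash: SHA-256 over the JSON-serialized messages.
// (Decyra's exact hashing scheme may differ.)
function promptHash(messages: Array<{ role: string; content: string }>): string {
  return createHash('sha256').update(JSON.stringify(messages)).digest('hex');
}

// Estimated cost in USD from token usage and per-million-token prices.
// The rates here are placeholders, not real OpenRouter pricing.
function estimateCost(
  inputTokens: number,
  outputTokens: number,
  pricePerMInput = 15,  // hypothetical $/1M input tokens
  pricePerMOutput = 75, // hypothetical $/1M output tokens
): number {
  return (inputTokens * pricePerMInput + outputTokens * pricePerMOutput) / 1_000_000;
}

console.log(providerOf('anthropic/claude-3-opus')); // 'anthropic'
console.log(promptHash([{ role: 'user', content: 'Compare GPT-4 and Claude' }]).length); // 64
console.log(estimateCost(1000, 500)); // 0.0525
```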
Verify
Open your Decyra dashboard and check the traces page. Each OpenRouter request appears there with its provider and model information.