# Vercel AI SDK

Build AI applications with React hooks and streaming support.
The Vercel AI SDK provides a unified interface for working with multiple AI providers. By routing requests through Decyra's proxy, you can capture all AI interactions, monitor usage, and implement rate limiting and cost controls.
## Prerequisites
- Decyra account with API key
- Node.js 18+ installed
- Vercel AI SDK package installed
## Installation

```sh
npm install ai @ai-sdk/openai @decyra/sdk
```
## Integration

Configure the OpenAI client to use Decyra's proxy:

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { wrapOpenAI } from '@decyra/sdk';

const openai = createOpenAI({
  baseURL: 'https://proxy.decyra.com/v1',
  headers: {
    'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
  },
});

// Wrap with Decyra SDK for enhanced tracking
const decyraClient = wrapOpenAI(openai);
```
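The configuration above reads `DECYRA_API_KEY` from the environment, and a missing key typically surfaces only later as a confusing authentication error on the first request. A minimal fail-fast sketch is shown below; the `requireEnv` helper is a hypothetical name for illustration, not part of either SDK:

```ts
// Hypothetical startup guard: throws immediately if a required
// environment variable is missing, instead of failing mid-request.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage (at module load, before constructing the client):
//   const decyraApiKey = requireEnv('DECYRA_API_KEY');
```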
Use the client with `generateText`:

```ts
import { generateText } from 'ai';

const { text } = await generateText({
  model: decyraClient('gpt-4'),
  prompt: 'Explain quantum computing in simple terms',
  temperature: 0.7,
});
```
## What Gets Captured
| Field | Description |
|---|---|
| Model | The AI model used (e.g., gpt-4, gpt-3.5-turbo) |
| Temperature | Sampling temperature parameter |
| Max Tokens | Maximum tokens in the response |
| Prompt Hash | SHA-256 hash of the input prompt |
| Response Time | Time taken for the API call |
| Token Usage | Input and output token counts |
| Cost | Estimated cost based on model pricing |
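Two of these fields can be reproduced locally as a sanity check against the dashboard. The sketch below assumes the prompt hash is SHA-256 over the raw UTF-8 prompt text, and uses placeholder per-token prices; the actual hashing scheme and pricing table are defined by Decyra, so treat both as assumptions:

```ts
import { createHash } from 'node:crypto';

// Prompt Hash: SHA-256 of the raw prompt text, hex-encoded
// (assumed scheme; check Decyra's docs for the exact canonicalization).
function promptHash(prompt: string): string {
  return createHash('sha256').update(prompt, 'utf8').digest('hex');
}

// Cost: token counts multiplied by per-token prices. The defaults
// below are illustrative placeholders, not real model pricing.
function estimateCost(
  inputTokens: number,
  outputTokens: number,
  inputPricePerToken = 0.00003,  // assumed $/input token
  outputPricePerToken = 0.00006, // assumed $/output token
): number {
  return inputTokens * inputPricePerToken + outputTokens * outputPricePerToken;
}
```

Comparing `promptHash(prompt)` against the hash shown in a trace confirms the proxy saw the same prompt your application sent, without the dashboard ever storing the prompt text itself.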
## Verify
Check your Decyra dashboard to see the request appear in the traces view. You should see the model, prompt hash, token usage, and response time for each call.