Vercel AI SDK

Build AI applications with React hooks and streaming support

TypeScript · Easy

The Vercel AI SDK provides a unified interface for working with multiple AI providers. By routing requests through Decyra's proxy, you can capture all AI interactions, monitor usage, and implement rate limiting and cost controls.

Prerequisites

  • Decyra account with API key
  • Node.js 18+ installed
  • Vercel AI SDK package installed

Installation

npm install ai @ai-sdk/openai @decyra/sdk

Integration

Configure the OpenAI client to use Decyra's proxy:

import { createOpenAI } from '@ai-sdk/openai';
import { wrapOpenAI } from '@decyra/sdk';

const openai = createOpenAI({
  baseURL: 'https://proxy.decyra.com/v1',
  headers: {
    'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
  },
});

// Wrap with Decyra SDK for enhanced tracking
const decyraClient = wrapOpenAI(openai);

Use the client with generateText:

import { generateText } from 'ai';

const { text } = await generateText({
  model: decyraClient('gpt-4'),
  prompt: 'Explain quantum computing in simple terms',
  temperature: 0.7,
});
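Streaming works the same way. The sketch below is a minimal example that reuses the decyraClient wrapper from above with the AI SDK's streamText helper; depending on your AI SDK version, the streamText call may need to be awaited:

import { streamText } from 'ai';

// Stream the response token by token through Decyra's proxy
const result = streamText({
  model: decyraClient('gpt-4'),
  prompt: 'Explain quantum computing in simple terms',
});

for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}

Streamed requests are captured the same way as regular calls, so no extra configuration is needed for tracking.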

What Gets Captured

  • Model: The AI model used (e.g., gpt-4, gpt-3.5-turbo)
  • Temperature: Sampling temperature parameter
  • Max Tokens: Maximum tokens in the response
  • Prompt Hash: SHA-256 hash of the input prompt
  • Response Time: Time taken for the API call
  • Token Usage: Input and output token counts
  • Cost: Estimated cost based on model pricing
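To correlate a local prompt with the hash shown in the dashboard, you can compute it yourself. This is a minimal sketch assuming the hash is a hex-encoded SHA-256 of the raw prompt string (check your Decyra settings for the exact hashing scheme):

import { createHash } from 'node:crypto';

// Hex-encoded SHA-256 of the raw prompt string (assumed to match Decyra's prompt hash)
const promptHash = createHash('sha256')
  .update('Explain quantum computing in simple terms')
  .digest('hex');

console.log(promptHash);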

Verify

Check your Decyra dashboard to see the request appear in the traces view. You should see the model, prompt hash, token usage, and response time for each call.

Next Steps