
AWS Bedrock

Build generative AI applications with AWS Bedrock

TypeScript · Medium

AWS Bedrock provides access to foundation models from multiple providers. Decyra captures all Bedrock API calls, enabling observability across your AWS AI workloads.

Prerequisites

  • Decyra account with API key
  • Node.js 18+ installed
  • @aws-sdk/client-bedrock-runtime installed
  • AWS credentials configured

Installation

npm install @aws-sdk/client-bedrock-runtime @smithy/fetch-http-handler @decyra/sdk

Integration

Wrap fetch to route through Decyra:

import { BedrockRuntimeClient, InvokeModelCommand } from '@aws-sdk/client-bedrock-runtime';
import { FetchHttpHandler } from '@smithy/fetch-http-handler';
import { wrapFetch } from '@decyra/sdk';

const customFetch = wrapFetch(fetch, {
  baseURL: 'https://proxy.decyra.com',
  apiKey: process.env.DECYRA_API_KEY!,
});

// One way to route SDK traffic through the wrapped fetch: FetchHttpHandler
// dispatches requests via the global fetch, so install the wrapped fetch
// globally before constructing the client.
globalThis.fetch = customFetch;

const client = new BedrockRuntimeClient({
  region: 'us-east-1',
  requestHandler: new FetchHttpHandler(),
});
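With this wiring, the wrapped fetch replaces the global fetch and FetchHttpHandler (which dispatches through global fetch) serves as the client's request handler, so every Bedrock call travels through the Decyra proxy before reaching AWS. Adjust the wiring if your runtime already supplies a different request handler.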

Invoke a model:

// The request body follows the Anthropic Messages format that Claude models on Bedrock expect.
const command = new InvokeModelCommand({
  modelId: 'anthropic.claude-3-opus-20240229-v1:0',
  contentType: 'application/json',
  body: JSON.stringify({
    anthropic_version: 'bedrock-2023-05-31',
    max_tokens: 1024,
    messages: [
      { role: 'user', content: 'Explain serverless architecture' }
    ],
  }),
});

const response = await client.send(command);
// response.body is a Uint8Array; decode and parse it to read the model output.
const result = JSON.parse(new TextDecoder().decode(response.body));
console.log(result.content[0].text);
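
Decyra also records streaming invocations (see the Request Type field below). Here is a minimal sketch of streaming the same prompt with InvokeModelWithResponseStreamCommand, assuming the client configured above and Anthropic's Messages streaming event format:

import { InvokeModelWithResponseStreamCommand } from '@aws-sdk/client-bedrock-runtime';

const streamCommand = new InvokeModelWithResponseStreamCommand({
  modelId: 'anthropic.claude-3-opus-20240229-v1:0',
  contentType: 'application/json',
  body: JSON.stringify({
    anthropic_version: 'bedrock-2023-05-31',
    max_tokens: 1024,
    messages: [
      { role: 'user', content: 'Explain serverless architecture' }
    ],
  }),
});

const streamResponse = await client.send(streamCommand);

// The response body is an async iterable of events; each event's chunk.bytes
// holds a JSON-encoded Anthropic streaming event.
for await (const event of streamResponse.body ?? []) {
  if (!event.chunk?.bytes) continue;
  const parsed = JSON.parse(new TextDecoder().decode(event.chunk.bytes));
  if (parsed.type === 'content_block_delta' && parsed.delta?.type === 'text_delta') {
    process.stdout.write(parsed.delta.text);
  }
}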

What Gets Captured

Field           Description
Model ID        Bedrock model identifier
Provider        Model provider (Anthropic, AI21, etc.)
Region          AWS region used
Prompt Hash     Hash of the input
Response Time   API call duration
Token Usage     Input/output tokens
Cost            Estimated AWS cost
Request Type    InvokeModel or InvokeModelWithResponseStream

Verify

Check your Decyra dashboard to see Bedrock API calls in the traces view. Each invocation will include model and provider details.

Next Steps