LangChain.js

Orchestrate chains of LLM calls with TypeScript

TypeScript · Medium

LangChain.js lets you build language-model applications from composable chains. Routing those chains through Decyra's proxy captures every LLM interaction, giving you visibility into your LangChain workflows.

Prerequisites

  • Decyra account with API key
  • Node.js 18+ installed
  • LangChain packages installed
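
The examples below read the key from the DECYRA_API_KEY environment variable. If you want your application to fail fast when the key is missing, a quick check like this works:

// Fail fast if the Decyra API key was never exported.
if (!process.env.DECYRA_API_KEY) {
  throw new Error('DECYRA_API_KEY is not set');
}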

Installation

npm install langchain @langchain/core @langchain/openai @decyra/sdk

Integration

Configure ChatOpenAI to use Decyra's proxy:

import { ChatOpenAI } from '@langchain/openai';
import { wrapOpenAI } from '@decyra/sdk';

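// Route all requests through Decyra's proxy and authenticate with the API key header.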
const model = new ChatOpenAI({
  modelName: 'gpt-4',
  configuration: {
    baseURL: 'https://proxy.decyra.com/v1',
    defaultHeaders: {
      'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
    },
  },
});

// Wrap for enhanced tracking
const decyraModel = wrapOpenAI(model);

Create and invoke a chain:

import { ChatPromptTemplate } from '@langchain/core/prompts';

const prompt = ChatPromptTemplate.fromMessages([
  ['system', 'You are a helpful assistant'],
  ['human', '{input}'],
]);

const chain = prompt.pipe(decyraModel);

const result = await chain.invoke({
  input: 'What is the capital of France?',
});
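
Chains also compose with output parsers and support streaming. Here is a minimal sketch building on the chain above; StringOutputParser and stream are standard LangChain.js APIs, though how Decyra groups streamed chunks into a single trace depends on the proxy:

import { StringOutputParser } from '@langchain/core/output_parsers';

// Reduce the model's message output to a plain string.
const textChain = prompt.pipe(decyraModel).pipe(new StringOutputParser());

// Stream tokens as they are generated instead of waiting for the full reply.
const stream = await textChain.stream({
  input: 'What is the capital of France?',
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}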

What Gets Captured

Field           Description
Model           The AI model identifier
Temperature     Sampling temperature setting
Max Tokens      Maximum response tokens
Prompt Hash     Hash of the full prompt chain
Tools           List of tools/functions available
Response Time   Total chain execution time
Token Usage     Input/output token counts
Cost            Estimated API cost
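
Most of these fields are read from the model configuration, so they only appear when you set them. The sketch below shows a more fully specified model; tool, bindTools, and the zod schema are standard LangChain.js APIs (zod must be installed), and the get_weather tool is hypothetical, included only so the Tools field has something to capture:

import { ChatOpenAI } from '@langchain/openai';
import { tool } from '@langchain/core/tools';
import { z } from 'zod';

// Hypothetical tool, included only to populate the Tools field.
const getWeather = tool(
  async ({ city }) => `It is sunny in ${city}.`,
  {
    name: 'get_weather',
    description: 'Look up the current weather for a city',
    schema: z.object({ city: z.string() }),
  }
);

const configuredModel = new ChatOpenAI({
  modelName: 'gpt-4',
  temperature: 0.7, // captured as Temperature
  maxTokens: 256,   // captured as Max Tokens
  configuration: {
    baseURL: 'https://proxy.decyra.com/v1',
    defaultHeaders: {
      'X-Decyra-API-Key': process.env.DECYRA_API_KEY!,
    },
  },
}).bindTools([getWeather]); // captured as Tools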

Verify

Navigate to your Decyra dashboard and open the Traces page. Each chain invocation appears as a trace with full context about the model call.

Next Steps