
LangChain Python

Build LLM-powered applications with Python


LangChain is a framework for developing applications powered by language models. Route your LangChain LLM calls through Decyra to gain comprehensive observability and control.

Prerequisites

  • Decyra account with API key
  • Python 3.8+ installed
  • LangChain packages installed

Installation

pip install langchain langchain-openai decyra-sdk
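The integration below reads your API key from the `DECYRA_API_KEY` environment variable. As a quick sanity check before wiring up the proxy, you can fail fast when the key is missing; the helper name here is illustrative, not part of the Decyra SDK:

```python
import os

def require_api_key(name: str = "DECYRA_API_KEY") -> str:
    """Return the named API key from the environment, or raise a clear error."""
    key = os.getenv(name)
    if not key:
        raise RuntimeError(f"{name} is not set; export it before running.")
    return key
```

A missing key then produces one clear error at startup instead of an opaque authentication failure mid-request.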

Integration

Configure ChatOpenAI with Decyra's proxy:

import os

from langchain_openai import ChatOpenAI
from decyra import wrap_openai

llm = ChatOpenAI(
    model="gpt-4",
    base_url="https://proxy.decyra.com/v1",
    default_headers={
        "X-Decyra-API-Key": os.getenv("DECYRA_API_KEY")
    }
)

# Wrap for enhanced tracking
decyra_llm = wrap_openai(llm)

Create and invoke a chain:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant"),
    ("human", "{input}")
])

chain = prompt | decyra_llm

result = chain.invoke({"input": "Explain machine learning"})
print(result.content)

What Gets Captured

  • Model: the AI model name
  • Temperature: temperature parameter value
  • Max Tokens: maximum tokens limit
  • Prompt Hash: SHA-256 hash of the prompt
  • Tools: available function-calling tools
  • Response Time: API call duration
  • Token Usage: input/output token counts
  • Cost: estimated cost per request
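To make two of these fields concrete, here is a minimal local sketch of how a prompt hash and a response time could be derived. This is only an illustration of the concepts using the standard library; the actual capture happens inside Decyra's proxy, and the helper names here are hypothetical:

```python
import hashlib
import time

def prompt_hash(prompt: str) -> str:
    """SHA-256 hex digest of the prompt text, as in the Prompt Hash field."""
    return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

def timed_call(fn, *args, **kwargs):
    """Run a callable and measure its duration, as a tracing proxy might."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed

# Hashing the same prompt always yields the same 64-character digest,
# which lets you correlate identical prompts across traces.
digest = prompt_hash("Explain machine learning")
print(len(digest))  # 64
```

Because the hash is deterministic, repeated invocations of the same prompt can be grouped in the dashboard without storing the prompt text itself.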

Verify

Open your Decyra dashboard and navigate to the traces section. You'll see each chain invocation with complete details about the model interaction.

Next Steps