
Haystack

Build production-ready NLP applications with Python

Python · Medium

Haystack is a Python framework for building production-ready NLP pipelines. Decyra captures every LLM interaction inside your Haystack pipelines, giving you visibility into document processing and generation.

Prerequisites

  • Decyra account with API key
  • Python 3.8+ installed
  • Haystack package installed
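The snippets in this guide read your Decyra API key from the DECYRA_API_KEY environment variable. Set it once in your shell (the value below is a placeholder, not a real key format):

```shell
# Placeholder value; substitute your real Decyra API key
export DECYRA_API_KEY="your-decyra-api-key"
```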

Installation

pip install haystack-ai decyra-sdk

Integration

Configure OpenAIGenerator with Decyra's proxy:

from haystack.components.generators import OpenAIGenerator
from haystack.utils import Secret
import os

generator = OpenAIGenerator(
    # Haystack 2.x expects a Secret, not a raw string
    api_key=Secret.from_env_var("DECYRA_API_KEY"),
    model="gpt-4",
    api_base_url="https://proxy.decyra.com/v1",
    generation_kwargs={
        "temperature": 0.7,
    },
    # OpenAIGenerator has no `headers` argument; extra headers are
    # passed to the underlying HTTP client instead
    http_client_kwargs={
        "headers": {"X-Decyra-API-Key": os.environ["DECYRA_API_KEY"]}
    },
)

Create and run a pipeline:

from haystack import Pipeline
from haystack.components.builders import PromptBuilder

prompt_template = """
Answer the following question: {{ question }}
"""

prompt_builder = PromptBuilder(template=prompt_template)

pipeline = Pipeline()
pipeline.add_component("prompt_builder", prompt_builder)
pipeline.add_component("llm", generator)
pipeline.connect("prompt_builder", "llm")

result = pipeline.run({
    "prompt_builder": {"question": "What is AI?"}
})
print(result["llm"]["replies"][0])
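The pipeline result is a nested dict keyed by component name; each OpenAIGenerator run returns replies and meta entries. A minimal sketch of that shape with made-up values (no live API call):

```python
# Illustrative shape of a Haystack 2.x pipeline result; the values below
# are invented for demonstration, not from a real call
result = {
    "llm": {
        "replies": ["AI is the simulation of human intelligence by machines."],
        "meta": [
            {"model": "gpt-4",
             "usage": {"prompt_tokens": 12, "completion_tokens": 11}}
        ],
    }
}

reply = result["llm"]["replies"][0]        # generated text
usage = result["llm"]["meta"][0]["usage"]  # token counts Decyra records
total_tokens = usage["prompt_tokens"] + usage["completion_tokens"]
print(reply)
print(total_tokens)  # → 23
```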

What Gets Captured

Field              | Description
-------------------|----------------------------------
Model              | The AI model used
Temperature        | Temperature parameter
Pipeline Component | Name of the component
Prompt Template    | Template used for generation
Prompt Hash        | Hash of the rendered prompt
Response Time      | Time for pipeline execution
Token Usage        | Input/output tokens
Cost               | Estimated API cost
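The prompt hash groups traces that rendered the same prompt. Decyra's exact algorithm isn't documented here, but a hex SHA-256 digest of the rendered text, as sketched below, is the typical approach (the helper name is ours, not part of either SDK):

```python
import hashlib

def prompt_hash(rendered_prompt: str) -> str:
    """Hypothetical helper: hex SHA-256 digest of a rendered prompt."""
    return hashlib.sha256(rendered_prompt.encode("utf-8")).hexdigest()

# Identical rendered prompts hash identically, so repeated questions
# collapse into a single group in the traces view
a = prompt_hash("Answer the following question: What is AI?")
b = prompt_hash("Answer the following question: What is AI?")
print(a == b)   # → True
print(len(a))   # → 64 (hex characters)
```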

Verify

Check your Decyra dashboard to see pipeline executions in the traces view. Each component interaction will be tracked separately.

Next Steps