
AutoGen

Enable next-generation LLM applications with multi-agent conversations

Python · Medium

AutoGen enables building multi-agent conversation systems. Decyra captures all agent-to-agent and agent-to-LLM interactions, providing complete visibility into your AutoGen workflows.

Prerequisites

  • Decyra account with API key
  • Python 3.8+ installed
  • AutoGen package installed

Installation

pip install pyautogen decyra-sdk

Integration

Configure the model to route requests through Decyra's proxy by setting base_url in the config list:

import autogen
import os

config_list = [
    {
        'model': 'gpt-4',
        'api_key': os.getenv('DECYRA_API_KEY'),
        'base_url': 'https://proxy.decyra.com/v1',
        'api_type': 'open_ai',
    }
]

llm_config = {
    'config_list': config_list,
    'temperature': 0.7,
}
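
Because api_key is read from the DECYRA_API_KEY environment variable, a missing key only surfaces once an agent makes its first call. An optional guard, shown here as a sketch, fails fast instead:

import os

# Optional: verify the Decyra key is present before constructing agents,
# so a missing key fails immediately with a clear message.
if not os.getenv("DECYRA_API_KEY"):
    raise RuntimeError("DECYRA_API_KEY is not set")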

Create an AssistantAgent and a UserProxyAgent (both ConversableAgent subclasses) and start a chat:

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config=llm_config,
    system_message="You are a helpful assistant."
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",        # run fully automated, never prompt a human
    max_consecutive_auto_reply=10,   # cap auto-replies so the conversation terminates
)

user_proxy.initiate_chat(
    assistant,
    message="Write a Python function to calculate fibonacci numbers"
)
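
Agent-to-agent traffic in larger teams is captured the same way, because every agent that shares this llm_config sends its LLM calls through the proxy. Below is a minimal sketch using AutoGen's GroupChat and GroupChatManager; the coder and reviewer roles are illustrative, not part of the integration itself.

# Additional agents reuse the same llm_config, so their calls also go through Decyra.
coder = autogen.AssistantAgent(
    name="coder",
    llm_config=llm_config,
    system_message="You write Python code to solve tasks.",
)
reviewer = autogen.AssistantAgent(
    name="reviewer",
    llm_config=llm_config,
    system_message="You review code and suggest improvements.",
)

group_chat = autogen.GroupChat(
    agents=[user_proxy, coder, reviewer],
    messages=[],
    max_round=8,
)
manager = autogen.GroupChatManager(groupchat=group_chat, llm_config=llm_config)

user_proxy.initiate_chat(
    manager,
    message="Write and review a function that checks whether a string is a palindrome."
)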

What Gets Captured

Field               Description
Model               The AI model identifier
Temperature         Temperature setting
Agent Name          Name of the agent making the call
Conversation Turn   Turn number in the conversation
Prompt Hash         Hash of the conversation context
Response Time       Time for each agent response
Token Usage         Tokens used per interaction
Cost                Cost per agent interaction

Verify

Visit your Decyra dashboard to see all agent interactions. Each conversation turn will appear as a separate trace with full context.
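
To double-check connectivity before running a full agent workflow, you can send a single request through the proxy directly. This sketch assumes the openai Python client (version 1.x) is available; the model name and proxy URL mirror the configuration above:

import os
from openai import OpenAI

# One direct call through the Decyra proxy; it should appear as a trace
# in the dashboard alongside your agent conversations.
client = OpenAI(
    api_key=os.getenv("DECYRA_API_KEY"),
    base_url="https://proxy.decyra.com/v1",
)
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)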

Next Steps