
Semantic Kernel

Integrate LLMs with your Python applications


Semantic Kernel lets you build AI applications with plugins and planners. Decyra captures every kernel operation routed through its proxy, giving you visibility into your AI orchestration workflows.

Prerequisites

  • Decyra account with API key
  • Python 3.8+ installed
  • Semantic Kernel package installed

Installation

pip install semantic-kernel decyra-sdk
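The snippets below read your API key from the environment, so export it first. The value shown is a placeholder (the `dk_live_` prefix is illustrative, not Decyra's actual key format); use the key from your Decyra dashboard.

```shell
# Placeholder key: substitute the API key from your Decyra dashboard.
export DECYRA_API_KEY="dk_live_xxxxxxxx"
```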

Integration

Configure the OpenAI connector to route requests through the Decyra proxy:

import os

import semantic_kernel as sk
from openai import AsyncOpenAI
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

kernel = sk.Kernel()

# Point the OpenAI client at the Decyra proxy. Pass the /v1 base URL
# only; the client appends /chat/completions to it per request.
client = AsyncOpenAI(
    api_key=os.getenv("DECYRA_API_KEY"),
    base_url="https://proxy.decyra.com/v1",
    default_headers={"X-Decyra-API-Key": os.getenv("DECYRA_API_KEY")},
)

service = OpenAIChatCompletion(
    service_id="openai",
    ai_model_id="gpt-4",
    async_client=client,
)

kernel.add_service(service)

Use the kernel for prompts:

import asyncio

chat = kernel.add_function(
    plugin_name="demo",
    function_name="chat",
    prompt="You are a helpful assistant. {{$input}}",
)

async def main():
    result = await kernel.invoke(chat, input="Explain AI")
    print(result)

asyncio.run(main())
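Under the hood, the `{{$input}}` placeholder is filled from the arguments passed to `invoke`. A minimal stand-in for that substitution (not Semantic Kernel's actual template engine, which also supports functions and nested expressions) looks like:

```python
import re

def render(template: str, **arguments: str) -> str:
    # Replace each {{$name}} placeholder with the matching argument.
    return re.sub(r"\{\{\$(\w+)\}\}", lambda m: arguments[m.group(1)], template)

print(render("You are a helpful assistant. {{$input}}", input="Explain AI"))
# → You are a helpful assistant. Explain AI
```

The fully rendered string is what the connector sends to the model, and what Decyra hashes and records.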

What Gets Captured

Field           | Description
----------------|------------------------------
Model           | The AI model identifier
Temperature     | Temperature setting
Function Name   | Name of the kernel function
Prompt Template | The prompt template used
Prompt Hash     | Hash of the rendered prompt
Response Time   | Time for kernel execution
Token Usage     | Tokens consumed
Cost            | Estimated API cost
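The prompt hash lets you group traces by prompt without storing the full rendered text. A sketch of one way such a hash could be computed (the exact scheme Decyra uses is an assumption, not documented here):

```python
import hashlib

def prompt_hash(rendered_prompt: str) -> str:
    # SHA-256 of the rendered prompt text; the hashing scheme here
    # is illustrative only.
    return hashlib.sha256(rendered_prompt.encode("utf-8")).hexdigest()

print(prompt_hash("You are a helpful assistant. Explain AI"))
```

Identical rendered prompts produce identical hashes, so repeated invocations of the same template with the same arguments collapse into one group in the dashboard.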

Verify

Visit your Decyra dashboard to see kernel operations in the traces view. Each function invocation will appear with full context.

Next Steps