Azure OpenAI Custom Headers

Overview

This guide demonstrates how to use Pay-i's custom headers with Azure OpenAI's Python SDK to annotate your API requests with important metadata.

Basic Pattern

The AzureOpenAI client in the OpenAI Python SDK supports Pay-i annotations through the extra_headers parameter, just like the standard OpenAI client:

import os
from openai import AzureOpenAI
from payi.lib.helpers import create_headers

# Initialize Azure OpenAI client
client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2024-02-15-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT")
)

# Create headers with your annotations
request_headers = create_headers(
    use_case_name="text_classification",
    user_id="analyst_789",
    limit_ids=["azure_project_budget"],
    request_tags=["classification", "analytics"]
)

# Apply the headers to your API call
response = client.chat.completions.create(
    deployment_name="gpt-4",  # Your Azure OpenAI deployment name
    messages=[
        {"role": "system", "content": "You are a classification assistant."},
        {"role": "user", "content": "Classify this text into categories..."}
    ],
    extra_headers=request_headers  # Apply the annotations here
)
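
Once the call returns, the response is a regular chat completion object, so you read the model's reply the same way you would without Pay-i. A minimal follow-up, assuming the request above succeeded:

# The annotations travel in the request headers; the response itself is unchanged
print(response.choices[0].message.content)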

Proxy Routing Example

When using Proxy Routing with Azure OpenAI, configure the client so that requests route through Pay-i and authenticate with your Pay-i API key:

import os
from openai import AzureOpenAI
from payi.lib.helpers import payi_azure_openai_url, create_headers

# API keys (use environment variables in production)
azure_key = os.getenv("AZURE_OPENAI_API_KEY")
payi_key = os.getenv("PAYI_API_KEY")
azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")

# Configure Azure OpenAI client to use Pay-i as a proxy
client = AzureOpenAI(
    api_key=azure_key,
    api_version="2024-02-15-preview",
    azure_endpoint=payi_azure_openai_url(azure_endpoint),  # Route through Pay-i
    default_headers={"xProxy-api-key": payi_key}  # Authenticate with Pay-i
)

# Create headers for this specific API call
request_headers = create_headers(
    use_case_name="corporate_chatbot",
    user_id="employee_123",
    limit_ids=["hr_department_budget"],
    request_tags=["chatbot", "hr", "internal"]
)

# Make the API call with custom headers
response = client.chat.completions.create(
    deployment_name="your-deployment-name",
    messages=[
        {"role": "system", "content": "You are a helpful HR assistant."},
        {"role": "user", "content": "What is our company policy on remote work?"}
    ],
    extra_headers=request_headers  # Apply the annotations here
)

# Access Pay-i tracking information from the response
print(f"Request ID: {response.xproxy_result['request_id']}")
print(f"Cost: {response.xproxy_result['cost']}")

Embeddings Example

You can also use custom headers with Azure OpenAI's embeddings API:

import os
from openai import AzureOpenAI
from payi.lib.helpers import create_headers

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2024-02-15-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT")
)

headers = create_headers(
    use_case_name="semantic_search",
    user_id="data_scientist_456",
    limit_ids=["embeddings_budget"],
    request_tags=["embeddings", "search"]
)

# Get embeddings with annotations
response = client.embeddings.create(
    input=["The quick brown fox jumps over the lazy dog"],
    deployment_name="text-embedding-ada-002",  # Your embeddings deployment name
    extra_headers=headers  # Add annotations here
)

print(f"Embedding dimension: {len(response.data[0].embedding)}")

For more information about custom headers and how to use them across different providers, see the Custom Headers documentation.