Anthropic Provider Configuration
Overview
This guide explains how to configure Anthropic to work with Pay-i Instrumentation. Anthropic is a supported Provider with several Resources that Pay-i can track.
Prerequisites
To use Pay-i with Anthropic, you'll need:
- Anthropic Service:
  - Anthropic API key
  - Models you want to use (see Pay-i Managed Categories and Resources for supported models)
- Pay-i:
  - Pay-i API key
SDK Support
The examples in this guide use the Pay-i Python SDK, which provides comprehensive support for Anthropic integration. If you're using a different programming language, you can use the Pay-i OpenAPI specification to generate a client SDK for your language of choice. The core concepts remain the same, though the exact implementation details may vary depending on the language and client library used.
Basic Setup
First, install the required packages:
# Install the required packages
pip install payi anthropic python-dotenv
Then, set up your environment variables in a .env file:
# .env file
ANTHROPIC_API_KEY=your_anthropic_api_key
PAYI_API_KEY=your_payi_api_key
Then in your Python code, load these environment variables and import the necessary libraries:
import os
from dotenv import load_dotenv
import anthropic
from payi.lib.instrument import payi_instrument
from payi.lib.helpers import create_headers
# Load environment variables from .env file
load_dotenv()
Direct Provider Call with Telemetry (Default)
The standard approach makes API calls directly to Anthropic while Pay-i tracks usage:
# Initialize Pay-i instrumentation
payi_instrument()
# Configure direct Anthropic client - automatically uses ANTHROPIC_API_KEY environment variable
client = anthropic.Anthropic()
# Make a request with annotations
message = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1000,
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    extra_headers=create_headers(
        use_case_name="anthropic_example",
        user_id="example_user",
        limit_ids=["anthropic_limit"],
        request_tags=["anthropic-example"]
    )
)
print(message.content[0].text)
With this configuration, Pay-i automatically tracks API calls and calculates costs without adding any latency to your requests. You can also use decorators for tracking multiple related API calls - see Custom Instrumentation for more details.
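If you'd rather not use the decorator approach, you can still group related requests by reusing a single set of annotations. Below is a minimal sketch that reuses the create_headers values from the example above; the summarize_documents function and its prompt are illustrative only, not part of the Pay-i SDK:
# Minimal sketch: share one set of annotations across several related calls
# so Pay-i attributes them to the same use case, user, and limit.
shared_headers = create_headers(
    use_case_name="anthropic_example",
    user_id="example_user",
    limit_ids=["anthropic_limit"],
    request_tags=["anthropic-example"]
)

def summarize_documents(documents):
    # Illustrative helper: each request carries the same annotations.
    summaries = []
    for doc in documents:
        message = client.messages.create(
            model="claude-3-sonnet-20240229",
            max_tokens=1000,
            messages=[{"role": "user", "content": f"Summarize this document:\n\n{doc}"}],
            extra_headers=shared_headers
        )
        summaries.append(message.content[0].text)
    return summaries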
Optional Proxy Routing (For Block Limits)
If you need to implement Block limits that prevent requests from being sent to the provider when a budget is exceeded, use Pay-i's proxy configuration. Here's what changes compared to the standard approach:
from payi.lib.helpers import payi_anthropic_url, PayiHeaderNames
import json
import os
# Get the API key from environment variables
PAYI_API_KEY = os.getenv("PAYI_API_KEY")
# 1. Initialize with proxy mode enabled
payi_instrument(config={"proxy": True})
# 2. Configure Anthropic client to use Pay-i as a proxy
client = anthropic.Anthropic(
    base_url=payi_anthropic_url(),  # Use Pay-i's URL instead of the direct endpoint
    default_headers={PayiHeaderNames.api_key: PAYI_API_KEY}  # Include Pay-i proxy headers
)
# The payi_anthropic_url() helper function returns the Pay-i proxy URL with the path `/api/v1/proxy/anthropic`
# Use the client normally
message = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1000,
    messages=[{"role": "user", "content": "Hello, how are you?"}]
)
print(message.content[0].text)
# With proxy mode, you get access to real-time cost information
xproxy_result = message.xproxy_result
print(json.dumps(xproxy_result, indent=4))
The key differences in the proxy configuration are:
- We call payi_instrument(config={"proxy": True}) to enable proxy mode
- We use payi_anthropic_url() instead of the direct Anthropic endpoint, which routes requests through the Pay-i proxy at /api/v1/proxy/anthropic
- We include the Pay-i API key in the default headers
For detailed information about proxy configuration, including when to use it and how it works, see the Pay-i Proxy Configuration guide.
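When a Block limit is exceeded, the proxy rejects the request before it reaches Anthropic, and the Anthropic SDK surfaces that rejection as an HTTP error. The sketch below shows one way to handle this; the use of anthropic.APIStatusError, the single limit_ids argument to create_headers, and the exact error Pay-i returns are assumptions, so treat the Pay-i Proxy Configuration guide as the authoritative reference:
# Hedged sketch: handle a request that the Pay-i proxy blocks because a
# Block limit has been exceeded. The exception type and error details shown
# here are assumptions about how the rejection surfaces in the Anthropic SDK.
try:
    message = client.messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=1000,
        messages=[{"role": "user", "content": "Hello, how are you?"}],
        extra_headers=create_headers(limit_ids=["anthropic_limit"])
    )
    print(message.content[0].text)
except anthropic.APIStatusError as e:
    # The request was never sent to Anthropic; the proxy returned an error instead.
    print(f"Request blocked (HTTP {e.status_code}): {e}")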
Advanced Configuration
Note: The following advanced configurations can be applied to both standard and proxy configurations. If you're using the proxy configuration, remember to include the base_url=payi_anthropic_url() and default_headers parameters as shown in the proxy configuration example above.
Additional Client Configuration Options
The Anthropic client supports various configuration options for performance tuning, such as connection pooling and timeouts. For details on these options, refer to the Anthropic Python SDK documentation.
When using these options with Pay-i in proxy mode, ensure you include the Pay-i-specific configuration:
# Configure Anthropic client with Pay-i proxy and additional options
client = anthropic.Anthropic(
    # Pay-i proxy configuration
    base_url=payi_anthropic_url(),
    default_headers={PayiHeaderNames.api_key: PAYI_API_KEY},
    # Additional Anthropic-specific configuration options as needed
    # Refer to Anthropic documentation for available options
)
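For example, here is a sketch combining the Pay-i proxy settings with the timeout, max_retries, and http_client options exposed by the Anthropic Python SDK; the specific values are illustrative, and the SDK documentation remains the authoritative reference for these options:
import httpx

# Sketch: Pay-i proxy configuration plus common Anthropic SDK tuning options.
# The option values below are illustrative only.
client = anthropic.Anthropic(
    # Pay-i proxy configuration
    base_url=payi_anthropic_url(),
    default_headers={PayiHeaderNames.api_key: PAYI_API_KEY},
    # Anthropic SDK performance options
    timeout=30.0,              # per-request timeout in seconds
    max_retries=3,             # automatic retries on transient failures
    http_client=httpx.Client(  # custom connection pool
        limits=httpx.Limits(max_connections=20, max_keepalive_connections=10)
    ),
)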
Related Resources
Quickstart Guides
- Anthropic Quickstart Examples - Code examples for Anthropic
Conceptual Guides
- Auto-Instrumentation with GenAI Providers - Overview of all providers
- Custom Instrumentation - Adding business context to your tracking
- Anthropic Custom Headers - Annotating individual Anthropic requests with metadata
- Pay-i Proxy Configuration - For when you need Block limits
Example Repositories
- Anthropic Example Collection - Multiple code examples
- All Pay-i Quickstarts - Examples for all providers