OpenAI Provider Configuration
Overview
This guide explains how to configure OpenAI to work with Pay-i Instrumentation. OpenAI is a supported Provider with several Resources that Pay-i can track.
Prerequisites
To use Pay-i with OpenAI, you'll need:
- OpenAI Service:
  - OpenAI API key
  - Models you want to use (see Pay-i Managed Categories and Resources for supported models)
- Pay-i:
  - Pay-i API key (used to authenticate with Pay-i)
SDK Support
The examples in this guide use the Pay-i Python SDK, which provides comprehensive support for OpenAI integration. If you're using a different programming language, you can utilize the Pay-i OpenAPI specification to generate a client SDK for your language of choice. The core concepts remain the same, though the exact implementation details may vary depending on the language and client library used.
Basic Setup
First, install the required packages:
# Install the required packages
pip install payi openai python-dotenv httpx
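The install above includes python-dotenv, which the snippets below don't use directly. If you prefer keeping keys in a .env file instead of exporting environment variables, here is a minimal sketch; the file name and variable names simply match the ones used in this guide:
# .env (keep this file out of source control)
# OPENAI_API_KEY=sk-...
# PAYI_API_KEY=...

import os
from dotenv import load_dotenv

load_dotenv()  # loads the variables from .env into the process environment

openai_key = os.getenv("OPENAI_API_KEY")
payi_key = os.getenv("PAYI_API_KEY")  # only needed for the proxy configuration shown later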
Direct Provider Call with Telemetry (Default)
The standard approach for using Pay-i with OpenAI is to initialize Pay-i instrumentation and continue making API calls directly to OpenAI:
import os
from openai import OpenAI
from payi.lib.instrument import payi_instrument
# Initialize Pay-i instrumentation
payi_instrument()
# Configure standard OpenAI client
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
# Use the client normally
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, how are you?"}]
)
With this configuration, Pay-i automatically tracks API calls and calculates costs without adding any latency to your requests.
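Continuing the example above, the returned object is the standard OpenAI response, so you read it exactly as you would without Pay-i (shown only for illustration):
# Inspect the standard OpenAI response returned by the instrumented client
print(response.choices[0].message.content)  # the model's reply
print(response.usage.total_tokens)          # token usage reported by OpenAI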
Optional Proxy Routing (For Block Limits)
If you need to implement Block limits that prevent requests from being sent to the provider when a budget is exceeded, you'll need to use Pay-i's proxy configuration:
import os
from openai import OpenAI
from payi.lib.helpers import payi_openai_url, PayiHeaderNames
from payi.lib.instrument import payi_instrument
# Initialize Pay-i instrumentation with proxy mode
payi_instrument(config={"proxy": True})
# Configure OpenAI client to use Pay-i as a proxy
client = OpenAI(
    api_key=os.getenv("OPENAI_API_KEY"),
    base_url=payi_openai_url(),  # Use Pay-i's URL as the base
    default_headers={PayiHeaderNames.api_key: os.getenv("PAYI_API_KEY")}  # Authenticate with Pay-i
)
# The payi_openai_url() helper function returns the Pay-i proxy URL with the path `/api/v1/proxy/openai/v1`
The key differences in the proxy configuration are:
- We call payi_instrument(config={"proxy": True}) to enable proxy mode
- We use payi_openai_url() instead of the direct OpenAI endpoint, which routes requests through the Pay-i proxy at /api/v1/proxy/openai/v1
- We include the Pay-i API key in the default headers
For detailed information about proxy configuration, including when to use it and how it works, see the Pay-i Proxy Configuration guide.
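How a blocked request surfaces in your code depends on the error the Pay-i proxy returns, which this guide doesn't specify. As a rough sketch, assuming the proxy answers with a non-2xx status that the OpenAI SDK raises as openai.APIStatusError, you might handle it like this:
import openai

try:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Hello, how are you?"}]
    )
except openai.APIStatusError as e:
    # Assumption: a request rejected by a Pay-i Block limit comes back as an
    # HTTP error, which the OpenAI SDK surfaces as APIStatusError.
    print(f"Request was not sent to OpenAI (status {e.status_code}): {e}")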
Advanced Configuration
Note: The following advanced configurations can be applied to both the standard and proxy configurations. If you're using the proxy configuration, remember to include the base_url and default_headers parameters as shown in the proxy configuration example above.
Additional Client Configuration Options
The OpenAI client supports various configuration options for performance tuning, such as connection pooling and timeouts. For details on these options, refer to the OpenAI Python SDK documentation.
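For example, in the default (non-proxy) setup a tuned httpx client can be passed straight to the OpenAI constructor; a minimal sketch with illustrative pool sizes and timeouts only:
import os
import httpx
from openai import OpenAI

# Standard (non-proxy) client with connection pooling and timeout tuning
client = OpenAI(
    api_key=os.getenv("OPENAI_API_KEY"),
    http_client=httpx.Client(
        limits=httpx.Limits(max_connections=100, max_keepalive_connections=20),
        timeout=httpx.Timeout(60.0, connect=5.0),
    ),
)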
When using these options with Pay-i in proxy mode, ensure you include the Pay-i-specific configuration:
import os
from openai import OpenAI
import httpx
from payi.lib.helpers import payi_openai_url, PayiHeaderNames
# Configure OpenAI client with Pay-i proxy and additional options
client = OpenAI(
    # Standard configuration
    api_key=os.getenv("OPENAI_API_KEY"),

    # Pay-i proxy configuration
    base_url=payi_openai_url(),
    default_headers={PayiHeaderNames.api_key: os.getenv("PAYI_API_KEY")},

    # Additional configuration options as needed
    # Refer to OpenAI SDK documentation for available options
    http_client=httpx.Client(
        # Example: timeout settings
        timeout=60.0
    )
)
Related Resources
Quickstart Guides
- OpenAI Quickstart - Single comprehensive example with instructions
- Azure OpenAI Quickstart - Similar example for Azure OpenAI
Conceptual Guides
- Auto-Instrumentation with GenAI Providers - Overview of all providers
- Custom Instrumentation - Adding business context to your tracking
- OpenAI Custom Headers - Annotating individual OpenAI requests with metadata
- Pay-i Proxy Configuration - For when you need Block limits
Example Repositories
- OpenAI Example Collection - Multiple code examples
- All Pay-i Quickstarts - Examples for all providers