# LangChain Callback Handler
## Overview
This guide demonstrates how to annotate LangChain operations with Pay-i. Unlike provider SDKs, which accept Pay-i's custom headers directly, LangChain requires a different approach: Pay-i's dedicated callback handler.
## Different Approach: Callbacks Instead of Headers
LangChain is an abstraction layer that sits above individual provider SDKs, so it doesn't directly manipulate HTTP headers. Instead, LangChain offers a callback system for extensions, which Pay-i leverages through a dedicated handler.
## Basic Pattern
The basic pattern involves creating a Pay-i callback handler with your annotations and attaching it to your LangChain components:
```python
from langchain_openai import ChatOpenAI
from payi.langchain import PayiCallbackHandler
from payi.lib.instrument import payi_instrument

# Initialize Pay-i instrumentation
payi_instrument()

# Create the Pay-i callback handler with your annotations
handler = PayiCallbackHandler(
    client=None,  # Pay-i client is optional; the global client is used if None
    params={
        "limit_ids": ["langchain_budget"],       # List of limit IDs to track against
        "request_tags": ["langchain", "chain"],  # Tags for filtering
        "user_id": "researcher_456",             # User attribution
        "use_case_name": "research_assistant"    # Business use case
    }
)

# Initialize the LLM with the Pay-i handler
llm = ChatOpenAI(
    model="gpt-4",
    temperature=0.7,
    callbacks=[handler]  # Attach the Pay-i handler here
)

# Use the LLM - Pay-i will automatically track and annotate the call
response = llm.invoke("Summarize the benefits of AI in healthcare")
```
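If you prefer not to bind the handler when the component is constructed, LangChain also lets you pass callbacks for a single call through the standard `config` argument of `invoke()`. The following is a minimal sketch of that alternative, assuming the same `handler` created above; it scopes Pay-i's annotations to one invocation rather than to the LLM object.

```python
from langchain_openai import ChatOpenAI

# Create the LLM without binding the Pay-i handler to it
llm = ChatOpenAI(model="gpt-4", temperature=0.7)

# Pass the handler for this call only via LangChain's per-invocation config
# (assumes `handler` is the PayiCallbackHandler created in the example above)
response = llm.invoke(
    "Summarize the benefits of AI in healthcare",
    config={"callbacks": [handler]},
)
```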
## Working with Chains
You can use the same handler with LangChain chains:
```python
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from langchain_openai import ChatOpenAI
from payi.langchain import PayiCallbackHandler

# Create the handler with annotations
handler = PayiCallbackHandler(
    params={
        "use_case_name": "qa_chain",
        "user_id": "support_agent_789",
        "limit_ids": ["qa_budget"],
        "request_tags": ["qa", "chain"]
    }
)

# Set up the LLM with the handler
llm = ChatOpenAI(
    model="gpt-4",
    temperature=0.3,
    callbacks=[handler]
)

# Create and use a chain with the same handler
prompt = PromptTemplate.from_template("Answer this question: {question}")
chain = LLMChain(llm=llm, prompt=prompt)

# The handler will track and annotate this chain execution
result = chain.invoke({"question": "What is the capital of France?"})
```
## Working with Agents
For more complex applications like agents, the handler works the same way:
```python
from langchain.agents import initialize_agent, AgentType
from langchain.tools import tool
from langchain_openai import ChatOpenAI
from payi.langchain import PayiCallbackHandler

# Define a simple tool
@tool
def search(query: str) -> str:
    """Search for information online."""
    # Mock search function
    return f"Results for: {query}"

# Create the handler with annotations
handler = PayiCallbackHandler(
    params={
        "use_case_name": "search_agent",
        "user_id": "analyst_123",
        "limit_ids": ["agent_budget"],
        "request_tags": ["agent", "search"]
    }
)

# Set up the LLM with the handler
llm = ChatOpenAI(
    model="gpt-4",
    temperature=0.2,
    callbacks=[handler]
)

# Initialize the agent with tools
agent = initialize_agent(
    [search],
    llm,
    agent=AgentType.OPENAI_FUNCTIONS,
    verbose=True
)

# Run the agent - Pay-i will track all LLM calls made during execution
agent.invoke({"input": "What is the population of Tokyo?"})
```
## Annotation Parameters
The `params` dictionary in the `PayiCallbackHandler` constructor accepts the same annotation parameters that you would normally provide to `create_headers()`:
| Parameter | Type | Description |
| --- | --- | --- |
| `limit_ids` | `List[str]` | List of limit IDs to track against (always provide as a list) |
| `request_tags` | `List[str]` | List of tags to associate with the request |
| `use_case_name` | `str` | Name of the use case for this request |
| `use_case_id` | `str` | ID of an existing use case |
| `user_id` | `str` | User ID to associate with the request |
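For illustration, here is a minimal sketch of a handler that exercises the parameters from the table above. Every ID, tag, and name in it is a placeholder rather than a real limit, user, or use case.

```python
from payi.langchain import PayiCallbackHandler

# Hedged sketch: all values below are placeholders for illustration only
handler = PayiCallbackHandler(
    params={
        "limit_ids": ["team_budget", "monthly_budget"],  # always a list, even for a single limit
        "request_tags": ["langchain", "docs_example"],   # tags to filter requests by
        "use_case_name": "document_summarization",       # or reference an existing use case via "use_case_id"
        "user_id": "user_42",                            # user to attribute the request to
    }
)
```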
## Key Benefits
This approach allows you to:
- Track and annotate all LLM calls made through LangChain
- Apply consistent business context across your LangChain applications
- Integrate Pay-i's tracking and limits with LangChain's abstraction layer
For more information about annotations and how to use them across different providers, see the Custom Headers documentation.