Configuring n8n Workflows for Pay-i
Overview
Each Pay-i node routes requests through the Pay-i proxy to a specific upstream provider. Requests and responses are passed through transparently and unmodified. This allows Pay-i to be your system of record for AI usage while ensuring that proper attribution is captured without modifying the payload.
Common Properties
Every Pay-i node exposes the following properties in its node UI. You can set these to associate requests with Pay-i Use Cases, users, and Limits. These fields are the same across all of the providers included in the n8n-nodes-payi package.
| Property | Description |
|---|---|
| Use Case Name | Associates each workflow execution with a Pay-i Use Case. If not specified, the Pay-i node uses the workflow name |
| User ID | Per-user attribution within the Pay-i dashboard |
| Use Case Step | The step name for the request. If not specified, Pay-i uses the node's name as it appears on your n8n canvas |
| Account Name | Per-account attribution within the Pay-i dashboard |
| Limit IDs | IDs of the Pay-i Limits to apply. Currently, only reading limits is supported |
| Properties | Custom key-value metadata for filtering and reporting within Pay-i |
To learn more about how Use Cases and Limits work, see Use Cases and Limits.
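The properties above travel to Pay-i as request metadata. The sketch below shows one plausible mapping from the node's UI fields to xProxy-prefixed proxy headers; the specific header names here are assumptions modeled on the `xProxy-Provider-BaseURI` convention used by the Azure node, so confirm them against the Pay-i API reference before relying on them.

```typescript
// Hypothetical mapping of Pay-i node UI properties to proxy request headers.
// Header names are illustrative assumptions, not confirmed Pay-i identifiers.
interface PayiProperties {
  useCaseName?: string;
  useCaseStep?: string;
  userId?: string;
  accountName?: string;
  limitIds?: string[];
}

function toProxyHeaders(p: PayiProperties): Record<string, string> {
  const headers: Record<string, string> = {};
  // Only include headers for properties the user actually set in the UI.
  if (p.useCaseName) headers["xProxy-Use-Case-Name"] = p.useCaseName;
  if (p.useCaseStep) headers["xProxy-Use-Case-Step"] = p.useCaseStep;
  if (p.userId) headers["xProxy-User-Id"] = p.userId;
  if (p.accountName) headers["xProxy-Account-Name"] = p.accountName;
  if (p.limitIds?.length) headers["xProxy-Limit-Ids"] = p.limitIds.join(",");
  return headers;
}
```

Unset properties are simply omitted, matching the table above: every field is optional and only affects attribution when provided.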
AI Providers Supported within n8n
Every Pay-i node forwards AI provider prompt properties (model name, temperature, etc.) as-is.
OpenAI
| Node | Pay-i OpenAI Proxy / lmChatPayi |
| Credentials | PayiApi + openAIApi |
| Proxy Path | /api/v1/proxy/openai/v1 |
| LangChain Class | ChatOpenAI |
| Pay-i Supported Models | See: Category: system.openai (Link) |
Anthropic
| Node | Pay-i Anthropic (Proxy) / lmChatPayi |
| Credentials | PayiApi + openAIApiAnthropic |
| Proxy Path | /api/v1/proxy/anthropic |
| LangChain Class | ChatAnthropic |
| Pay-i Supported Models | See: Category: system.anthropic (Link) |
Azure AI Foundry
| Node | Pay-i Azure AI Foundry (Proxy) / lmChatPayiAzure |
| Credentials | PayiApi + azureOpenAiApi |
| Proxy Path | /api/v1/proxy/azure.openai |
| LangChain Class | ChatOpenAI (not AzureChatOpenAI; this avoids authentication conflicts with the proxy) |
| Pay-i Supported Models | See: Category: system.azure (Link) |
- Scope: This node targets Azure OpenAI endpoints, the OpenAI-compatible API surface within Azure AI Foundry. You can use other models, but this node is specifically designed to handle Azure OpenAI deployments using the OpenAI API wire format.
- Required: You must set the `Deployment Name` field to match your Azure deployment exactly.
- Endpoint Resolution: If your credentials include an explicit `endpoint` value, that value is used. Otherwise the endpoint is assembled from `resourceName` as `https://[resourceName].openai.azure.com`. The resolved endpoint is sent as `xProxy-Provider-BaseURI`.
- API Version: Falls back in order: node parameter --> credential value --> default (`2024-08-01-preview`).
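The endpoint and API-version rules above can be sketched as two small resolution functions. The names (`AzureCredentials`, `resolveEndpoint`, `resolveApiVersion`) are illustrative, not the node's actual internals:

```typescript
// Sketch of the endpoint and API-version resolution described above.
// Types and function names are illustrative assumptions.
interface AzureCredentials {
  endpoint?: string;     // explicit endpoint, if configured
  resourceName?: string; // used only when no explicit endpoint is set
  apiVersion?: string;
}

function resolveEndpoint(creds: AzureCredentials): string {
  // An explicit endpoint in the credentials always wins.
  if (creds.endpoint) return creds.endpoint;
  // Otherwise assemble the endpoint from the resource name.
  return `https://${creds.resourceName}.openai.azure.com`;
}

function resolveApiVersion(nodeParam: string | undefined, creds: AzureCredentials): string {
  // Fallback order: node parameter --> credential value --> default.
  return nodeParam ?? creds.apiVersion ?? "2024-08-01-preview";
}
```

Whatever `resolveEndpoint` produces is what the node sends upstream in the `xProxy-Provider-BaseURI` header.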
For additional information and Azure OpenAI-specific guidance, see Instrumentation for Microsoft Foundry and Azure OpenAI. Contact Pay-i Support if you need assistance configuring Provisioned Throughput Units.
AWS Bedrock
| Node | Pay-i AWS Bedrock (Proxy) / lmChatPayiBedrock |
| Credentials | PayiApi + aws |
| Proxy Path | /api/v1/proxy/aws.bedrock |
| LangChain Class | ChatBedrockConverse |
| Pay-i Supported Models | See: Category: system.aws.bedrock (Link) |
- Model Validation: When using AWS Bedrock, you must set your Model to a validated Bedrock model ID (for example, `us.anthropic.claude-sonnet-4-6`).
- Setting your Region: The region defaults from the n8n credentials you set. If no region is specified, or Pay-i does not recognize it, Pay-i defaults to `us-east-1`.
- Signing Limitation: There are some considerations for code that uses the AWS SigV4 signing function. Please contact [email protected] if you experience issues or need guidance before deploying Bedrock services into production.
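The region rule above is a simple two-step fallback, sketched below. The `KNOWN_REGIONS` list is illustrative only; Pay-i's actual set of recognized regions is not documented here:

```typescript
// Sketch of the Bedrock region-defaulting rule: keep the region from the
// n8n AWS credentials when it is recognized, otherwise fall back to us-east-1.
// KNOWN_REGIONS is an illustrative placeholder, not Pay-i's real list.
const KNOWN_REGIONS = new Set(["us-east-1", "us-west-2", "eu-west-1", "ap-northeast-1"]);

function resolveRegion(credentialRegion?: string): string {
  if (credentialRegion && KNOWN_REGIONS.has(credentialRegion)) {
    return credentialRegion;
  }
  return "us-east-1"; // default when unset or unrecognized
}
```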
Databricks
| Node | Pay-i Databricks (Proxy) / lmChatPayiDatabricks |
| Credentials | PayiApi + databricks (Uses n8n-nodes-databricks) |
| Proxy Path | /api/v1/proxy/openai/v1 (reuses the OpenAI path) |
| LangChain Class | ChatOpenAI |
| Pay-i Supported Models | See: Category: system.databricks.[Hyperscaler](Link) |
- Cloud Provider Option Required: You must select the Cloud Provider within the Pay-i node when using Databricks. AWS and Google Cloud workspaces use `[hostname].cloud.databricks.com` as their endpoint, but Azure's managed Databricks service uses `[hostname].azuredatabricks.net`. If this value is not set, your data will not be attributed correctly and you may see intermittent errors.
- Hyperscaler Support: Pay-i currently supports AWS, Azure, and Google Cloud as providers for Databricks. Contact Pay-i support if you need other options.
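The hostname split described above can be sketched as a one-line mapping from the selected Cloud Provider to the workspace domain. The function name is illustrative; the domain shapes are the ones stated in the note:

```typescript
// Sketch of the Cloud Provider -> Databricks workspace hostname mapping.
// databricksHost is an illustrative name, not the node's actual internals.
type Hyperscaler = "AWS" | "Azure" | "GCP";

function databricksHost(hostname: string, provider: Hyperscaler): string {
  // Azure's managed Databricks service lives on a different domain
  // than AWS and Google Cloud workspaces.
  return provider === "Azure"
    ? `${hostname}.azuredatabricks.net`
    : `${hostname}.cloud.databricks.com`;
}
```

This is why the Cloud Provider option is mandatory: without it, the proxy cannot know which domain your workspace lives on.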