Configuring n8n Workflows for Pay-i

Overview

Each Pay-i node routes requests through the Pay-i proxy to a specific upstream provider. Requests and responses are passed through transparently and unmodified. This allows Pay-i to act as your system of record for AI usage while capturing proper attribution without modifying the payload.

Common Properties

Every Pay-i node exposes the following properties in its individual UI. You can set these to associate requests with Pay-i Use Cases, users, and Limits. These fields are the same across all of the providers included in the n8n-nodes-payi package.


Use Case Name: Associates each workflow execution with a Pay-i Use Case. If not specified, the node uses the workflow name.
User ID: Per-user attribution within the Pay-i dashboard.
Use Case Step: The step name within the Use Case. If not defined, Pay-i uses the step's name as it appears on your n8n canvas.
Account Name: Per-account attribution within the Pay-i dashboard.
Limit IDs: IDs of the Pay-i Limits to apply. Currently, only reading Limits is supported.
Properties: Custom key-value metadata for filtering and reporting within Pay-i.
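As a hypothetical illustration, the common properties above might appear as follows in an exported n8n workflow JSON. The parameter keys and values below are assumptions that mirror the UI labels; the exact keys used by the n8n-nodes-payi package may differ.

```typescript
// Hypothetical Pay-i node parameters as they might appear in an exported
// n8n workflow. Key names are illustrative, based on the UI labels above.
const payiNodeParameters = {
  useCaseName: 'customer-support-chat',         // falls back to the workflow name
  userId: 'user-1234',                          // per-user attribution
  useCaseStep: 'draft-reply',                   // falls back to the canvas step name
  accountName: 'acme-corp',                     // per-account attribution
  limitIds: ['lim_abc123'],                     // read-only Limit checks
  properties: { team: 'support', env: 'prod' }, // custom metadata for filtering
};
```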

To learn more about how Use Cases and Limits work, see Use Cases and Limits.

AI Providers Supported within n8n

Every Pay-i node forwards AI provider prompt properties (model name, temperature, etc.) as-is.

OpenAI

Node: Pay-i OpenAI Proxy / lmChatPayi
Credentials: PayiApi + openAIApi
Proxy Path: /api/v1/proxy/openai/v1
LangChain Class: ChatOpenAI
Pay-i Supported Models: See Category: system.openai (Link)

Anthropic

Node: Pay-i Anthropic (Proxy) / lmChatPayi
Credentials: PayiApi + anthropicApi
Proxy Path: /api/v1/proxy/anthropic
LangChain Class: ChatAnthropic
Pay-i Supported Models: See Category: system.anthropic (Link)

Azure AI Foundry

Node: Pay-i Azure AI Foundry (Proxy) / lmChatPayiAzure
Credentials: PayiApi + azureOpenAiApi
Proxy Path: /api/v1/proxy/azure.openai
LangChain Class: ChatOpenAI (not AzureChatOpenAI; this avoids authentication conflicts with the proxy)
Pay-i Supported Models: See Category: system.azure (Link)
  • Scope: This node targets Azure OpenAI endpoints, the OpenAI-compatible API surface within Azure AI Foundry. You can use other models, but this node is specifically designed to handle Azure OpenAI deployments that use the OpenAI API wire format.
  • Required: You must set the Deployment Name field to match your Azure deployment exactly.
  • Endpoint Resolution: If your credentials include an explicit endpoint value, that value is used. Otherwise, the endpoint is assembled from resourceName as https://[resourceName].openai.azure.com. The resolved endpoint is sent as xProxy-Provider-BaseURI.
  • API Version: Falls back in order: node parameter -> credential value -> default (2024-08-01-preview).
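The endpoint and API version resolution described above can be sketched as follows. This is an illustrative TypeScript sketch; the field names (endpoint, resourceName, apiVersion) follow this page's description, not the package's actual internals.

```typescript
// Illustrative sketch of the endpoint and API-version fallback logic
// described above. Field names are assumptions based on this page.
interface AzureCredentials {
  endpoint?: string;      // explicit endpoint, if configured
  resourceName?: string;  // used to assemble a default endpoint
  apiVersion?: string;    // API version stored on the credentials
}

function resolveAzureEndpoint(creds: AzureCredentials): string {
  // An explicit endpoint on the credentials always wins.
  if (creds.endpoint) return creds.endpoint;
  // Otherwise assemble the default Azure OpenAI hostname from resourceName.
  return `https://${creds.resourceName}.openai.azure.com`;
}

function resolveApiVersion(nodeParam: string | undefined, creds: AzureCredentials): string {
  // Fallback order: node parameter -> credential value -> default.
  return nodeParam ?? creds.apiVersion ?? '2024-08-01-preview';
}
```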

For additional information and Azure OpenAI-specific guidance, see Instrumentation for Microsoft Foundry and Azure OpenAI. Contact Pay-i Support if you need assistance configuring Provisioned Throughput Units.

AWS Bedrock

Node: Pay-i AWS Bedrock (Proxy) / lmChatPayiBedrock
Credentials: PayiApi + aws
Proxy Path: /api/v1/proxy/aws.bedrock
LangChain Class: ChatBedrockConverse
Pay-i Supported Models: See Category: system.aws.bedrock (Link)
  • Model Validation: When using AWS Bedrock, you must set your Model to a validated Bedrock model ID (for example, us.anthropic.claude-sonnet-4-6).
  • Setting your Region: The region defaults to the one from your n8n credentials. If no region is specified, or Pay-i does not know how to handle it, Pay-i defaults to us-east-1.
  • Signing Limitation: There are some considerations for code that uses the AWS SigV4 signing function. Please contact [email protected] if you experience issues or need guidance before deploying Bedrock services into production.
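The region fallback described above can be sketched as follows. This is an illustrative sketch; the function name and the sample region list are assumptions, not Pay-i's actual implementation or its full list of recognized regions.

```typescript
// Illustrative sketch of the Bedrock region fallback described above.
// KNOWN_REGIONS is a small sample for illustration, not Pay-i's real list.
const KNOWN_REGIONS = new Set(['us-east-1', 'us-west-2', 'eu-central-1']);

function resolveBedrockRegion(credentialRegion?: string): string {
  // Use the region from the n8n AWS credentials when it is recognized;
  // otherwise fall back to us-east-1, as described above.
  if (credentialRegion && KNOWN_REGIONS.has(credentialRegion)) {
    return credentialRegion;
  }
  return 'us-east-1';
}
```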

Databricks

Node: Pay-i Databricks (Proxy) / lmChatPayiDatabricks
Credentials: PayiApi + databricks (uses n8n-nodes-databricks)
Proxy Path: /api/v1/proxy/openai/v1 (reuses the OpenAI path)
LangChain Class: ChatOpenAI
Pay-i Supported Models: See Category: system.databricks.[Hyperscaler] (Link)
  • Cloud Provider Option Required: You must select the Cloud Provider within the Pay-i node when using Databricks. AWS and Google use [hostname].cloud.databricks.com as their endpoint, but Azure's managed Databricks service uses [hostname].azuredatabricks.net. If this value is not set, your data will not show up correctly, or you may encounter intermittent errors.
  • Hyperscaler Support: Pay-i currently supports AWS, Azure, and Google Cloud as hyperscalers for Databricks. Contact Pay-i Support if you need other options.
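The Cloud Provider selection above effectively chooses the Databricks endpoint domain. A minimal sketch, assuming a hypothetical helper (the function name and signature are for illustration only):

```typescript
// Illustrative sketch of how the Cloud Provider option maps to the
// Databricks endpoint hostname, per the description above.
type DatabricksCloud = 'aws' | 'azure' | 'gcp';

function databricksHost(hostname: string, cloud: DatabricksCloud): string {
  // Azure's managed Databricks service uses a different domain suffix
  // than the AWS and Google Cloud offerings.
  return cloud === 'azure'
    ? `${hostname}.azuredatabricks.net`
    : `${hostname}.cloud.databricks.com`;
}
```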