AI Agent with MCP/Tools Fails with Custom Base URL (Cloudflare OpenAI-Compliant API) #15862
Comments
Hey @ankitdalalx, we have created an internal ticket to look into this, which we will be tracking as "GHC-2261".
Hi @ankitdalalx, trying to understand what's going on here. Does it properly call tools when you're running local tools (like a normal calculator), or does this only happen with MCP connections? Also, could you try out a different model (a bigger model like gpt-4.1)? I'm curious to understand what causes this issue.
Hi @Joffcom @jeanpaul @tcurdt @ceefour Thanks for looking into this. Let me clarify a few points: Model Usage: I am not using OpenAI models directly (like GPT-4.1). The issue occurs when using models provided by Cloudflare Workers AI, accessed via their OpenAI-compliant REST API and a custom base URL in n8n. The specific models I've tested, which are documented as being capable of function/tool calling, include:
Tool Behavior (Local Tools vs. MCP): The problem is not specific to MCP connections. Neither local tools (like the standard Calculator tool) nor MCP clients work correctly when the AI Agent is configured with the custom Cloudflare base URL. As shown in my original report (with the "Use Calculator tool 4+4" example), when using a local tool like the Calculator, the AI Agent outputs the tool call definition ({"type": "function", "name": "calculator", "parameters": {"input": "4+4"}}) instead of executing the tool and returning its result. The same behavior occurs with MCP clients.
The core issue seems to be that while the AI Agent, using these Cloudflare models via a custom base URL, correctly identifies that a tool needs to be used and even formulates the correct parameters for the tool call, it fails to proceed with the actual execution of the tool and the subsequent processing of the tool's output. Instead, it just returns the intended tool call as its final response. This is different from how models from OpenAI (when used directly) or other similar setups typically handle tool integration.
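For context, an OpenAI-compatible model is expected to return a tool call in the message's `tool_calls` field (with empty `content`), whereas the failure described above looks like the call serialized into `content` as plain text. A minimal sketch of how a client could tell the two shapes apart — the `classify_agent_output` helper and both example messages are hypothetical illustrations, not n8n code:

```python
import json

def classify_agent_output(message: dict) -> str:
    """Classify an OpenAI-style chat completion message.

    A compliant tool-calling response carries the call in `tool_calls`;
    the symptom reported here is the call appearing as JSON text in
    `content` instead.
    """
    if message.get("tool_calls"):
        return "structured tool call"
    content = message.get("content") or ""
    try:
        parsed = json.loads(content)
    except json.JSONDecodeError:
        return "plain text"
    if isinstance(parsed, dict) and parsed.get("type") == "function":
        return "tool call leaked into content"
    return "plain text"

# The shape reported with the Cloudflare models: call text in `content`.
leaked = {
    "role": "assistant",
    "content": '{"type": "function", "name": "calculator", '
               '"parameters": {"input": "4+4"}}',
}
print(classify_agent_output(leaked))  # tool call leaked into content
```

An agent framework that only inspects `tool_calls` would treat the leaked variant as a final answer and pass it straight through, which matches the behavior described.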
@Joffcom hey, any thoughts?
Bug Description
The AI Agent in n8n, when configured with a custom base URL pointing to an OpenAI-compliant API (specifically the Cloudflare Workers AI REST API for models), fails to correctly utilize attached MCP (Model Context Protocol) clients or Tools.
While the AI Agent seems aware of the available tools and even appears to formulate a request for the tool, this request payload itself becomes the final output of the AI Agent node, instead of the tool's actual response or a subsequent generation based on that response. This issue prevents the successful execution of workflows that rely on AI Agents using tools with a custom OpenAI-compatible endpoint.
Notably, the standard AI Agent (without a custom base URL, presumably using a direct OpenAI connection) with tools and memory functions correctly. The problem specifically arises when a custom base URL is introduced in conjunction with tools/MCP.
To Reproduce
Steps to reproduce the behavior:
Configure an n8n instance.
Import the workflow JSON provided below.
Ensure you have credentials configured for:
An OpenAI-compatible API (e.g., Cloudflare using your own token and base URL) for the "OpenAI Chat Model" node. The model used in the example is @cf/meta/llama-3.3-70b-instruct-fp8-fast.
Postgres for the "Postgres Chat Memory" node (e.g., Supabase).
(If applicable) Any credentials required for the "MCP Client" tool (example uses https://mcp.kite.trade/sse).
Trigger the "When chat message received" node with an input like: Use Calculator tool 4+4 (or any prompt that should invoke an attached tool).
Observe the output of the "AI Agent" node.
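For reference, this is roughly the request body an OpenAI-compatible chat node would POST when the Calculator tool is attached. The base-URL shape follows Cloudflare's documented OpenAI-compatible Workers AI endpoint; the account ID and the tool schema below are illustrative placeholders, not values from the workflow JSON:

```python
# Hypothetical account ID; Cloudflare's OpenAI-compatible endpoint lives
# under /accounts/{account_id}/ai/v1 of the client API.
ACCOUNT_ID = "your-account-id"
BASE_URL = f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/ai/v1"

# Body POSTed to f"{BASE_URL}/chat/completions" (tool schema illustrative).
request_body = {
    "model": "@cf/meta/llama-3.3-70b-instruct-fp8-fast",
    "messages": [{"role": "user", "content": "Use Calculator tool 4+4"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "calculator",
            "description": "Evaluate an arithmetic expression",
            "parameters": {
                "type": "object",
                "properties": {"input": {"type": "string"}},
                "required": ["input"],
            },
        },
    }],
}
```

Inspecting the raw response to such a request (e.g. with curl) would show whether the model returns a `tool_calls` field or plain text.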
Workflow (JSON):
Expected behavior
The AI Agent should successfully call the tool via the custom base URL (e.g., Calculator tool with input "4+4"). The tool should execute (e.g., calculate 4+4 = 8), and its response should be processed by the AI Agent, leading to a final output that incorporates the tool's result (e.g., "The result is 8.").
Actual Behavior:
(This section was added as it's crucial for understanding the bug)
The "AI Agent" node outputs the JSON representation of the function call it intended to make to the tool, rather than the tool's response or a subsequent generation based on the tool's response.
For an input like Use Calculator tool 4+4, the output from the AI Agent node is:
{"type": "function", "name": "calculator", "parameters": {"input": "4+4"}}
This indicates the agent identified the correct tool and parameters but failed to execute it and process its output when using the custom base URL.
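To illustrate the flow that should have happened, here is a minimal sketch of the tool-execution loop an OpenAI-compatible agent is expected to run: call the model, execute any requested tools, feed the results back, and repeat until the model produces a final answer. The stubbed model and `calculator` callable are hypothetical; this is not n8n's actual implementation:

```python
import json

def run_agent_turn(call_model, tools: dict, messages: list) -> str:
    """Run one agent turn: execute requested tools until a final answer."""
    while True:
        message = call_model(messages)
        messages.append(message)
        if not message.get("tool_calls"):
            # No tool requested: this is the final answer.
            return message.get("content", "")
        for call in message["tool_calls"]:
            name = call["function"]["name"]
            args = json.loads(call["function"]["arguments"])
            result = tools[name](**args)
            # Feed the tool result back so the model can use it.
            messages.append({"role": "tool",
                             "tool_call_id": call["id"],
                             "content": str(result)})

# Stubbed model: first requests the calculator, then answers.
responses = iter([
    {"role": "assistant", "content": None,
     "tool_calls": [{"id": "1", "type": "function",
                     "function": {"name": "calculator",
                                  "arguments": '{"input": "4+4"}'}}]},
    {"role": "assistant", "content": "The result is 8."},
])
answer = run_agent_turn(
    lambda msgs: next(responses),
    # eval() is acceptable only for this toy arithmetic stub.
    {"calculator": lambda input: eval(input)},
    [{"role": "user", "content": "Use Calculator tool 4+4"}],
)
print(answer)  # The result is 8.
```

In the failing case, the loop apparently never enters the tool-execution branch: the call comes back as plain `content`, so the first response is returned verbatim as the final answer.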
Additional Context/Debug Information:
The custom base URL is for Cloudflare Workers AI REST API for models, which is OpenAI compliant.
Normal AI Agent functionality (without a custom base URL) using tools and memory works as expected.
The issue persists across multiple n8n versions (the reporter notes having "tried multiple version[s]"), different nodes, different models, and system prompts.
The core problem seems to be specifically tied to the combination of a custom base URL and tool/MCP usage in the AI Agent.
Full Debug Info:
Generated at: 2025-05-30T09:37:05.854Z
Operating System
docker (self-hosted)
n8n Version
1.94.1
Node.js Version
20.19.1
Database
SQLite (default)
Execution mode
main (default)