MCP node
The MCP (Model Context Protocol) node lets you send data to an external model server and bring back structured results. Think of it as a bridge between your flows and any AI-powered tool or custom model you've built.
Whether you're integrating LLMs, specialized inference tools, or parameterized remote services, this node gives you a flexible, powerful interface.
What can it do?
- Connect to an external server running an MCP-compatible tool
- Pass structured data and parameters to that tool
- Get back processed or generated results (text, predictions, labels, scores, etc.)
- Use it as a step in your data flow or analysis pipeline
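Under the hood, MCP tool calls are JSON-RPC 2.0 messages sent to the server's `tools/call` method. As a rough sketch of what the node sends (the tool name `summarize` and its arguments are hypothetical, not part of any real server):

```python
import json

def build_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 request for MCP's tools/call method."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool and parameters, purely for illustration.
request = build_tool_call("summarize", {"text": "Long article body...", "max_words": 50})
print(json.dumps(request, indent=2))
```

The server replies with a matching JSON-RPC response whose `result` holds the tool's output.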
How to use it
- Add the MCP node to your flow
- Enter the server URL (must support MCP)
- Select the tool you want to use on that server
- Provide a set of parameters (key-value pairs) the tool expects
- Connect the node to an upstream source for its input, and connect its output to downstream nodes to visualize or reuse the results
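The steps above can be sketched as a single helper that posts the node's configuration to the server. This is a minimal illustration, not the node's actual implementation: the server URL and tool name are placeholders, and the transport is injectable so the sketch can be exercised without a live server.

```python
import json
import urllib.request

def call_mcp_tool(server_url, selected_tool, parameters, transport=None):
    """POST a JSON-RPC tools/call request to an MCP server and parse the reply.

    `transport` lets callers substitute a stub for testing; by default the
    request goes over plain HTTP with urllib.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {"name": selected_tool, "arguments": parameters},
    }
    if transport is None:
        def transport(url, body):
            req = urllib.request.Request(
                url, data=body, headers={"Content-Type": "application/json"}
            )
            with urllib.request.urlopen(req) as resp:
                return resp.read()
    raw = transport(server_url, json.dumps(payload).encode("utf-8"))
    return json.loads(raw)

# Exercise the helper with a stub transport instead of a real server.
def fake_transport(url, body):
    assert json.loads(body)["method"] == "tools/call"
    return json.dumps({"jsonrpc": "2.0", "id": 1,
                       "result": {"content": [{"type": "text", "text": "ok"}]}}).encode()

reply = call_mcp_tool("https://example.com/mcp", "summarize", {"text": "hi"},
                      transport=fake_transport)
print(reply["result"]["content"][0]["text"])
```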
Configuration
| Field | Description |
| --- | --- |
| serverUrl | URL of the MCP server you're calling |
| selectedTool | Name of the tool or endpoint to invoke on that server |
| parameters | JSON object of tool-specific key-value arguments |
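Put together, a filled-in configuration might look like the following. The URL, tool name, and parameter names here are illustrative placeholders, not values any particular server expects:

```json
{
  "serverUrl": "https://models.example.com/mcp",
  "selectedTool": "summarize",
  "parameters": {
    "text": "Paste or map your input text here",
    "max_words": 50
  }
}
```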
Use cases
- Send input text to an LLM for summarization or Q&A
- Run predictions using a hosted ML model
- Label or classify data via external services
- Augment rows with AI-generated output
Output
The node returns the response from the selected tool, typically as a table or enriched dataset, depending on how the server formats its reply.
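In MCP, a `tools/call` result carries a `content` array of typed items, where text items have the shape `{"type": "text", "text": ...}`. A sketch of flattening such a reply into simple rows (the sample reply below is fabricated for illustration):

```python
def response_to_rows(result):
    """Extract text items from an MCP tools/call result into simple rows."""
    return [
        {"text": item["text"]}
        for item in result.get("content", [])
        if item.get("type") == "text"
    ]

# Fabricated example reply for illustration.
sample = {"content": [
    {"type": "text", "text": "Summary: the article covers MCP basics."},
]}
rows = response_to_rows(sample)
print(rows)
```

How the node maps these items onto table columns depends on the server's reply; richer tools may return structured JSON inside the text items.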
Security
- Server URLs are stored locally in your flow
- No data leaves your environment unless the flow actually runs and connects to the external server
- Parameters are kept within your user scope
Connect your flow to models that think, generate, and adapt — directly from your canvas.