GAIA n8n Integration Guide

Source code: `src/gaia/mcp/` (n8n uses the MCP Server)

This guide explains how to integrate GAIA's AI capabilities into n8n workflows using the GAIA MCP Server.

Overview

n8n can leverage GAIA's AI agents (LLM, Chat, Jira, Blender) through HTTP requests to the MCP server. This enables powerful AI-driven workflow automation without writing code.

Architecture
How It Works
- Natural Language Jira: natural language Jira operations
- AI Text Processing: AI-powered text processing and summarization
- Conversational Chat: conversational chat with context
- 3D Content Generation: 3D content generation via Blender
Prerequisites
1. Set up GAIA MCP Server

Follow the MCP Server Documentation to:

- Install GAIA with MCP support
- Start the Lemonade LLM server
- Start the MCP bridge server
- Verify it's running with `gaia mcp status`
2. Install and Run n8n Locally

- Using npx (no installation): run `npx n8n`
- Global installation: run `npm install n8n -g`, then start it with `n8n`

Once n8n is running, open http://localhost:5678 in your browser.
Getting Started: Step-by-Step
Step 1: Verify MCP Server is Running
Before creating workflows, verify the MCP server is accessible (for example with `gaia mcp status`, or by requesting the `/health` endpoint described below).

Step 2: Create Your First n8n Workflow
Available Endpoints
The MCP bridge provides the following HTTP endpoints:

Health Check

- `GET /health` - Basic health check
- Returns: `{"status": "healthy", "service": "GAIA MCP Bridge (HTTP)", "agents": 4, "tools": 5}`
Status (Detailed)

- `GET /status` - Comprehensive status with all agents, tools, and endpoints
- Returns: detailed JSON with agents, tools, capabilities, and available endpoints
Tool Listing

- `GET /tools` - List all available GAIA tools
- Returns: array of tool definitions with names, descriptions, and input schemas
Direct Agent Endpoints

- Jira Operations
- LLM Queries
- Chat Interface

`POST /jira` - Natural language Jira operations. The Jira agent automatically determines the appropriate operation based on the query.

JSON-RPC Endpoint

`POST /` - Standard JSON-RPC 2.0 interface for the MCP protocol.

Supported methods:

- `initialize` - Initialize MCP connection
- `tools/list` - List available tools
- `tools/call` - Execute a specific tool
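As a sketch, the JSON-RPC endpoint can be driven from Python's standard library. The method names come from the list above; the bridge URL is the default used throughout this guide, and anything beyond the standard JSON-RPC 2.0 envelope is not specified here.

```python
import json
import urllib.request

MCP_URL = "http://localhost:8765/"  # JSON-RPC endpoint (POST /)

def jsonrpc_request(method, params=None, req_id=1):
    """Build a standard JSON-RPC 2.0 request envelope."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params or {}}

def call(method, params=None):
    """POST a JSON-RPC request to the MCP bridge and return the parsed response."""
    body = json.dumps(jsonrpc_request(method, params)).encode("utf-8")
    req = urllib.request.Request(
        MCP_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Live call (requires the bridge to be running):
#   print(call("tools/list"))
```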
Integration Examples: From Simple to Complex
Example 1: Simple Health Check (GET Request)
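In n8n this is just an HTTP Request node with method GET and URL `http://localhost:8765/health`. Outside n8n, the same check can be sketched with the Python standard library:

```python
import json
import urllib.request

def get_health(base_url="http://localhost:8765"):
    """GET /health from the MCP bridge and return the parsed JSON."""
    with urllib.request.urlopen(f"{base_url}/health", timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))

def is_healthy(health):
    """True when the bridge reports a healthy status."""
    return health.get("status") == "healthy"

# Live call (requires the bridge to be running):
#   print(get_health())
```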
The expected response is the health JSON shown under Health Check above.
Example 2: Get Detailed Status (GET Request)
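The same pattern works for `/status`. The exact keys of the detailed JSON are not reproduced in this guide, so the summary helper below only assumes the `agents` and `tools` counts that also appear in the `/health` response:

```python
import json
import urllib.request

def get_status(base_url="http://localhost:8765"):
    """GET /status: comprehensive status with agents, tools, and endpoints."""
    with urllib.request.urlopen(f"{base_url}/status", timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))

def summarize(status):
    """One-line summary of a status payload (field names assumed)."""
    return f"agents={status.get('agents', '?')} tools={status.get('tools', '?')}"

# Live call (requires the bridge to be running):
#   print(summarize(get_status()))
```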
Example 3: Simple Chat (POST Request)
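A minimal POST sketch for the chat agent. The `/chat` path and the `message` field name are assumptions, not confirmed by this guide; check `GET /tools` for the actual input schemas.

```python
import json
import urllib.request

def chat_payload(message):
    """Body for the chat endpoint (field name `message` is an assumption)."""
    return {"message": message}

def chat(message, base_url="http://localhost:8765"):
    """POST a chat message (the /chat path is assumed; check GET /tools)."""
    req = urllib.request.Request(
        base_url + "/chat",
        data=json.dumps(chat_payload(message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Live call (requires the bridge to be running):
#   print(chat("Hello, GAIA!"))
```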
Example 4: LLM Analysis (POST Request)
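An analysis call follows the same shape; here the `/llm` path and the `query` field are again assumptions to be checked against `GET /tools`:

```python
import json
import urllib.request

def llm_payload(query):
    """Body for the LLM endpoint (field name `query` is an assumption)."""
    return {"query": query}

def analyze(text, base_url="http://localhost:8765"):
    """POST a summarization prompt to the LLM agent (the /llm path is assumed)."""
    body = json.dumps(llm_payload("Summarize the following text:\n" + text)).encode("utf-8")
    req = urllib.request.Request(
        base_url + "/llm", data=body, headers={"Content-Type": "application/json"}
    )
    # 30 s timeout, per the Best Practices section below
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Live call (requires the bridge and a loaded model):
#   print(analyze("Long release notes go here..."))
```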
Example 5: Jira Agent - Natural Language Queries
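The `/jira` endpoint is documented above; the agent picks the operation from the natural-language query itself. The `query` field name is an assumption (check `GET /tools`), and the "show 1 issue" smoke test comes from the Troubleshooting section:

```python
import json
import urllib.request

def jira_payload(query):
    """Body for POST /jira (field name `query` is an assumption)."""
    return {"query": query}

def jira(query, base_url="http://localhost:8765"):
    """Send a natural-language query to /jira; the agent picks the operation."""
    req = urllib.request.Request(
        base_url + "/jira",
        data=json.dumps(jira_payload(query)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Live calls (require the bridge plus Jira credentials configured in GAIA):
#   jira("show 1 issue")   # good first smoke test
#   jira("list my open issues")
```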
Building Complete Workflows
Workflow 1: Daily Standup Assistant

Components: Manual Trigger → Jira Query → LLM Summarize → Output

Workflow 2: Smart Issue Triage

Components: Webhook Trigger → LLM Analysis → Jira Update

Import Pre-built Workflow
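No pre-built workflow file is reproduced in this guide, but as an illustrative sketch (not an official GAIA export), a minimal importable n8n workflow that sends one chat request might look like the following. The `/chat` path and `message` field are assumptions, and the HTTP Request node parameter names follow recent n8n node versions, so they may differ in yours:

```json
{
  "name": "GAIA Chat Demo",
  "nodes": [
    {
      "parameters": {},
      "name": "Manual Trigger",
      "type": "n8n-nodes-base.manualTrigger",
      "typeVersion": 1,
      "position": [240, 300]
    },
    {
      "parameters": {
        "method": "POST",
        "url": "http://localhost:8765/chat",
        "sendBody": true,
        "specifyBody": "json",
        "jsonBody": "{ \"message\": \"Hello from n8n\" }"
      },
      "name": "GAIA Chat",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [460, 300]
    }
  ],
  "connections": {
    "Manual Trigger": {
      "main": [[{ "node": "GAIA Chat", "type": "main", "index": 0 }]]
    }
  }
}
```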
Tips for Building Workflows
Using Variables in Requests

n8n allows you to reference data from previous nodes with expressions, e.g. `{{ $json.summary }}` for a field on the incoming item, or `{{ $node["Jira Query"].json["result"] }}` to read a named node's output (the node and field names here are only examples).

Error Handling
Rate Limiting
Testing Workflows
Understanding Responses

Successful Responses

Each endpoint returns a different data structure:

- GET endpoints (`/health`, `/status`, `/tools`): return status information directly; no `success` field is needed.
- POST endpoints: return a `success` field you can branch on (see Handling Errors below).
Handling Errors
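In n8n the usual pattern is an IF node on the `success` field (see Best Practices below). The same check sketched in Python; note that `error` as the failure-detail field name is an assumption:

```python
def classify(response):
    """Route a POST response: 'ok' when success is truthy, else an error string."""
    if response.get("success"):
        return "ok"
    # `error` as the failure-detail field is an assumption
    return f"error: {response.get('error', 'unknown')}"
```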
Best Practices

- Start Simple: begin with GET requests, then move to POST.
- Use localhost: use `localhost` for local server connections.
- Test Incrementally: test each node before building complex workflows.
- Handle Errors: add IF nodes to check the `success` field.
- Set Timeouts: use a 30 s timeout for Jira/LLM operations.
- Log Responses: use Set nodes to capture responses for debugging.
Troubleshooting
MCP Bridge Not Accessible / "Service refused the connection"

If you get "The service refused the connection - perhaps it is offline" or an `ECONNREFUSED ::1:8765` error:

QUICK FIX: Ensure the MCP server is running with `gaia mcp start` and use `http://localhost:8765` for connections.

For detailed MCP server troubleshooting (ports, processes, Docker/WSL), see the MCP Server Documentation.

n8n-Specific Connection Issues

When MCP works locally but n8n doesn't, test the correct URL first, then work through these options:

Connection refused (most common):

- The error `connect ECONNREFUSED ::1:8765` means the MCP server may not be running.
- FIX: Ensure the MCP server is running with `gaia mcp start` and use `http://localhost:8765`.
- If issues persist, check firewall settings or try `http://127.0.0.1:8765` as an alternative.

Use an ngrok tunnel (easiest for testing with n8n.cloud), or use localtunnel as an alternative to ngrok.

Deploy MCP to the cloud (for production):

- Deploy GAIA MCP to AWS/Azure/Heroku.
- Use the public URL in n8n.cloud.
Jira Operations Failing

Solutions:

- Verify Jira credentials are configured in GAIA.
- Test with a simple query first: "show 1 issue".
- Check the Jira agent logs for detailed errors.
LLM Queries Slow

Solutions:

- Ensure the Lemonade server is running.
- Check whether the model is loaded in memory.
- Consider using smaller models for faster responses.
n8n Connection Issues

Solutions:

- First try: ensure the MCP server is running and use `http://localhost:8765`.
- Verify n8n can reach the MCP bridge.
- Check CORS settings if using browser-based n8n.
- Test with curl first to isolate issues.
Advanced Configuration

For custom ports and other advanced MCP server configuration options, see the MCP Server Documentation.

Security Considerations

Example Production Setup

Use an nginx reverse proxy in front of the MCP bridge.

Support
- Documentation: see other GAIA docs in `/docs`
- Issues: report issues on GitHub
- Examples: check the `/examples` folder for more workflows
Next Steps

Set up the MCP Server: follow the MCP Documentation.