Source Code: cpp/ in the GAIA repository
Component: gaia::Agent base class and supporting libraries
Language: C++17
Build system: CMake 3.14+
Dependencies: nlohmann/json, cpp-httplib, Google Test (all fetched automatically)
Overview
The GAIA C++17 framework is a native implementation of the core agent system. It compiles to a standalone binary with no interpreter overhead or runtime dependencies.
What it provides:
- Agent execution loop with state machine (planning, tool execution, error recovery, completion)
- Tool registry with registerTool() for defining agent capabilities
- MCP client with cross-platform stdio transport
- JSON response parsing with multi-strategy fallback (code-block extraction, bracket matching, syntax fixing)
- Console output with ANSI colors (TerminalConsole, CleanConsole) and silent mode (SilentConsole)
The C++ framework focuses on the core agent loop and tool execution. Specialized agents (Code, Docker, Jira, Blender), the REST API server, RAG, and audio are available in the Python SDK.
Agent Execution Flow
- User query enters agent.processQuery()
- Agent composes a system prompt (tool descriptions + response format) and sends it to the LLM via HTTP
- LLM returns a JSON plan with tool calls
- Agent executes each tool — either a locally registered C++ callback or a remote MCP tool via JSON-RPC 2.0 over stdio
- Tool results feed back to the LLM for further reasoning
- Loop repeats until the LLM produces a final answer or the step limit is reached
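The flow above can be sketched as a minimal skeleton. This is illustrative only: in gaia_core the plan is JSON and the LLM call goes over HTTP, whereas here `Step`, `callLLM`, and `executeTool` are stand-ins invented for the sketch.

```cpp
#include <functional>
#include <string>
#include <vector>

// Stand-in for one entry of the LLM's plan (the real plan is JSON).
struct Step {
    std::string tool;
    std::string args;
    bool final;
    std::string answer;
};

std::string runLoop(const std::function<Step(const std::vector<std::string>&)>& callLLM,
                    const std::function<std::string(const Step&)>& executeTool,
                    const std::string& query, int maxSteps) {
    std::vector<std::string> history{query};   // full conversation so far
    for (int i = 0; i < maxSteps; ++i) {
        Step next = callLLM(history);          // LLM reasons over everything
        if (next.final) return next.answer;    // final answer ends the loop
        history.push_back(executeTool(next));  // tool result feeds back in
    }
    return "step limit reached";               // safety valve (maxSteps)
}
```

Because the whole history goes back to the model each iteration, the loop is reactive rather than a fixed script, which is exactly the behavior described in the next section.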
Reactive Agent Loop
The agent is not a script. After every tool execution, the LLM is called again with the full conversation so far — including the tool’s output. This lets the model reason about results and change course. Each loop iteration: LLM reasons → agent executes → result fed back → LLM reasons again. The LLM can skip steps, add new ones, or pivot strategy at any point.
How It Works
The health_agent demo is a Windows System Health Agent. It subclasses gaia::Agent, connects to the Windows MCP server on startup, then enters the planning loop.
Wi-Fi Troubleshooter Demo
The wifi_agent demonstrates adaptive reasoning without MCP — all tools are registered directly in C++ as PowerShell commands. This showcases how an agent differs from a script: it reasons about each result, skips irrelevant steps, applies fixes, and verifies outcomes.
Key features:
- Structured reasoning — LLM outputs FINDING: and DECISION: prefixes, displayed with color-coded labels in the TUI
- Adaptive behavior — skips downstream checks if adapter is disconnected, adds fix/verify steps when issues are found
- Real tools — all diagnostics (netsh, ipconfig, Test-NetConnection) and fixes (flush DNS, toggle Wi-Fi radio, restart adapter) execute real PowerShell commands
- GPU/NPU selection — choose between GGUF (GPU) and FLM (NPU) model backends at startup
- Admin detection — warns on startup if fix tools won’t work without elevation
How Tools Are Implemented
Tools are C++ lambdas registered with ToolRegistry. The Wi-Fi agent’s tools wrap PowerShell commands via a runShell() helper that uses _popen() to spawn a PowerShell subprocess:
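A minimal sketch of such a helper is shown below. It falls back to popen() off Windows so the sketch stays portable; the real helper presumably wraps the command string as `powershell -Command "..."` rather than running it directly.

```cpp
#include <array>
#include <cstdio>
#include <stdexcept>
#include <string>

#ifdef _WIN32
  #define GAIA_POPEN  _popen
  #define GAIA_PCLOSE _pclose
#else
  #define GAIA_POPEN  popen
  #define GAIA_PCLOSE pclose
#endif

// Captures the stdout of a shell command (sketch; error handling in
// the real helper may differ).
std::string runShell(const std::string& command) {
    FILE* pipe = GAIA_POPEN(command.c_str(), "r");
    if (!pipe) throw std::runtime_error("failed to spawn subprocess");
    std::array<char, 256> buffer{};
    std::string output;
    while (std::fgets(buffer.data(), static_cast<int>(buffer.size()), pipe)) {
        output += buffer.data();
    }
    GAIA_PCLOSE(pipe);
    return output;
}
```

A tool lambda then simply returns `runShell("ipconfig /all")` or similar.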
Most tools run one-line commands (netsh, ipconfig, Test-NetConnection). For complex operations like the WinRT Radio API, the tool writes a temporary .ps1 script and runs it via powershell -File.
Structured Reasoning Display
The system prompt instructs the LLM to prefix its reasoning with FINDING: and DECISION:. The custom CleanConsole output handler parses these and displays them with color-coded labels:
- Finding (green) — what the diagnostic data shows
- Decision (yellow) — what the agent will do next and why
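As a rough illustration, the prefix handling could look like the function below. parseReasoningLine is a hypothetical name; the actual CleanConsole also applies the ANSI colors, which are omitted here.

```cpp
#include <string>
#include <utility>

// Splits "FINDING: adapter is up" into {"FINDING", "adapter is up"}.
// Lines without a known prefix come back with an empty label.
std::pair<std::string, std::string> parseReasoningLine(const std::string& line) {
    for (const std::string prefix : {"FINDING:", "DECISION:"}) {
        if (line.rfind(prefix, 0) == 0) {                // starts-with check
            std::string body = line.substr(prefix.size());
            body.erase(0, body.find_first_not_of(' '));  // trim leading spaces
            return {prefix.substr(0, prefix.size() - 1), body};
        }
    }
    return {"", line};  // plain text: no label, no color
}
```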
Writing Your Own Agent
Subclass gaia::Agent, override getSystemPrompt() and optionally registerTools(), then call init() at the end of your constructor:
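Here is a sketch of the pattern, with a minimal stand-in base class so the example is self-contained. The real gaia::Agent (HTTP client, planning loop, MCP support) lives in gaia_core, and EchoAgent is a hypothetical example, not part of the SDK.

```cpp
#include <cstddef>
#include <functional>
#include <map>
#include <string>
#include <utility>

// Minimal stand-in for the real base class, for illustration only.
namespace gaia {
class Agent {
public:
    virtual ~Agent() = default;
protected:
    // Called at the END of the subclass constructor: virtual dispatch
    // from a base-class constructor would never reach the overrides.
    void init() {
        systemPrompt_ = getSystemPrompt();
        registerTools();
    }
    virtual std::string getSystemPrompt() = 0;
    virtual void registerTools() {}
    void registerTool(const std::string& name,
                      std::function<std::string(const std::string&)> fn) {
        tools_[name] = std::move(fn);
    }
    std::string systemPrompt_;
    std::map<std::string, std::function<std::string(const std::string&)>> tools_;
};
}  // namespace gaia

class EchoAgent : public gaia::Agent {
public:
    EchoAgent() {
        init();  // last statement, so the overrides below are in effect
    }
    std::string prompt() const { return systemPrompt_; }
    std::size_t toolCount() const { return tools_.size(); }
protected:
    std::string getSystemPrompt() override { return "You are an echo agent."; }
    void registerTools() override {
        registerTool("echo", [](const std::string& input) { return input; });
    }
};
```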
Why init() in the constructor? C++ virtual dispatch does not work from base-class constructors. Calling init() at the end of your subclass constructor ensures registerTools() and getSystemPrompt() resolve to your overrides.
Connecting MCP Servers
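The idea behind one-call registration can be sketched with a stub. registerMCPServer is a hypothetical name chosen for this sketch; the real client launches the server process and discovers its tools over JSON-RPC 2.0 on stdio rather than taking a list.

```cpp
#include <cstddef>
#include <map>
#include <string>
#include <vector>

// Stub registry: one call pulls in every tool a server exposes.
class ToolRegistry {
public:
    void registerMCPServer(const std::string& server,
                           const std::vector<std::string>& exposedTools) {
        for (const auto& tool : exposedTools) tools_[tool] = server;
    }
    std::size_t size() const { return tools_.size(); }
    std::string serverOf(const std::string& tool) const { return tools_.at(tool); }
private:
    std::map<std::string, std::string> tools_;  // tool name -> owning server
};
```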
Register all tools exposed by an MCP server with a single call.
AgentConfig Reference
All fields have sensible defaults. Override only what you need:
| Field | Type | Default | Description |
|---|---|---|---|
| baseUrl | std::string | "http://localhost:8000/api/v1" | LLM server endpoint (Lemonade Server recommended; other OpenAI-compatible servers untested) |
| modelId | std::string | "Qwen3-4B-GGUF" | Model identifier sent to the server |
| maxSteps | int | 20 | Maximum agent loop iterations per query |
| maxPlanIterations | int | 3 | Maximum plan/replan cycles before forcing completion |
| maxConsecutiveRepeats | int | 4 | Consecutive identical responses before loop-detection triggers |
| maxHistoryMessages | int | 40 | Max messages kept between queries (0 = unlimited) |
| contextSize | int | 16384 | LLM context window size in tokens (n_ctx) |
| debug | bool | false | Enable verbose debug logging to stdout |
| showPrompts | bool | false | Print full system prompts and LLM responses |
| silentMode | bool | false | Suppress all console output (use SilentConsole) |
| streaming | bool | false | Enable streaming responses from the LLM (planned — not yet implemented) |
| temperature | double | 0.7 | LLM sampling temperature (0.0 = deterministic, 1.0 = creative) |
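As a hedged sketch, the table maps onto a plain struct with the listed defaults; the real AgentConfig in gaia_core may order or extend these fields differently.

```cpp
#include <string>

// Field names and defaults taken from the table above (sketch).
struct AgentConfig {
    std::string baseUrl = "http://localhost:8000/api/v1";  // LLM endpoint
    std::string modelId = "Qwen3-4B-GGUF";
    int maxSteps = 20;               // loop iterations per query
    int maxPlanIterations = 3;
    int maxConsecutiveRepeats = 4;   // loop-detection threshold
    int maxHistoryMessages = 40;     // 0 = unlimited
    int contextSize = 16384;         // n_ctx, in tokens
    bool debug = false;
    bool showPrompts = false;
    bool silentMode = false;
    bool streaming = false;          // planned, not yet implemented
    double temperature = 0.7;
};

// Override only what you need (values here are illustrative).
AgentConfig makeQuietConfig() {
    AgentConfig cfg;
    cfg.temperature = 0.2;  // more deterministic planning
    cfg.silentMode = true;  // pair with SilentConsole
    return cfg;
}
```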
Project Structure
Comparison with GAIA Python SDK
| Feature | Python | C++ |
|---|---|---|
| Agent loop (plan, tool, answer) | Yes | Yes |
| Tool registration | Yes | Yes |
| MCP client (stdio) | Yes | Yes |
| JSON parsing with fallbacks | Yes | Yes |
| OpenAI-compatible LLM backend | Yes | Yes |
| Multiple LLM providers (Claude, OpenAI) | Yes | Planned |
| Specialized agents (Code, Docker, Jira) | Yes | — |
| REST API server | Yes | — |
| Audio / RAG / Stable Diffusion | Yes | — |
Next Steps
Setup
Install CMake, a C++17 compiler, Git, and Lemonade Server
Quickstart
Build steps and running your first demo agent
API Reference
Error handling, thread safety, security, deployment, and complete class API
Integration Guide
Consume gaia_core in your own CMake project via FetchContent, find_package, or shared library
Custom Agent
Custom prompts, typed tools, MCP servers, output capture, and AgentConfig tuning