Source Code:
cpp/CMakeLists.txt — build configuration with install rules, export targets, and FetchContent support.
Prerequisites: Familiarity with CMake and C++17. See the C++ Framework Overview for build instructions and the AgentConfig reference.
Overview
gaia_core is designed to drop into any C++ project with minimal friction. The library is self-contained — all dependencies (nlohmann/json, cpp-httplib) are resolved automatically, so you never install or manage them by hand. Your project only needs CMake 3.14+ and a C++17 compiler.
The shortest path — add three lines to your CMakeLists.txt:
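A sketch of those three lines (plus the FetchContent module include), assuming a placeholder repository URL and tag — adjust both to the actual GAIA repository and the release you want to pin:

```cmake
include(FetchContent)
FetchContent_Declare(gaia
    GIT_REPOSITORY https://github.com/amd/gaia.git  # placeholder — use the real GAIA repo URL
    GIT_TAG        main                             # pin a release tag for reproducible builds
    SOURCE_SUBDIR  cpp)                             # build only the C++ library
FetchContent_MakeAvailable(gaia)
target_link_libraries(my_app PRIVATE gaia::gaia_core)
```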
With those lines in place, a single #include <gaia/agent.h> gives you the full agent loop, tool registry, MCP client, and JSON utilities — no manual installs, no system packages, no dependency conflicts.
Integration Methods
| Method | When to use |
|---|---|
| FetchContent | Default choice — no install step, works everywhere |
| Git submodule | You want the source in your repo for offline builds or pinned versions |
| find_package | You want a system-wide install or use a package manager |
| Shared library | You need a .so / .dll for plugin architectures |
- FetchContent (Recommended)
- Git Submodule
- find_package (Install)
The simplest approach — CMake downloads and builds gaia_core as part of your project. No install step, no system packages, no manual dependency management.
CMakeLists.txt
SOURCE_SUBDIR cpp tells CMake to use only the cpp/ subdirectory of the GAIA repository. Your build tree contains only the C++ library and its dependencies. When consumed as a sub-project, GAIA_BUILD_TESTS and GAIA_BUILD_EXAMPLES default to OFF, so you only get the library — no test binaries or demo executables in your build tree.
That is it. All transitive dependencies (nlohmann/json, cpp-httplib) are fetched automatically. Your main.cpp can #include <gaia/agent.h> and subclass gaia::Agent immediately.
Subclassing Example
Here is a complete minimal agent that registers one tool and processes a query. This works with any of the three integration methods above.
time_agent.cpp
CMakeLists.txt:
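A matching build file, assuming the FetchContent route and a placeholder repository URL (adjust to the actual GAIA repository):

```cmake
cmake_minimum_required(VERSION 3.14)
project(time_agent CXX)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

include(FetchContent)
FetchContent_Declare(gaia
    GIT_REPOSITORY https://github.com/amd/gaia.git  # placeholder — use the real GAIA repo URL
    GIT_TAG        main
    SOURCE_SUBDIR  cpp)
FetchContent_MakeAvailable(gaia)

add_executable(time_agent time_agent.cpp)
target_link_libraries(time_agent PRIVATE gaia::gaia_core)
```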
Using Alternative LLM Backends
The GAIA C++ agent framework is not tied to Lemonade or any specific LLM provider. It talks to a standard HTTP endpoint — any server that implements the OpenAI chat completions API works out of the box. Switching backends means changing two configuration fields: the base URL and the model name.
What “OpenAI-Compatible” Means
The agent uses a single HTTP endpoint: POST {baseUrl}/chat/completions. It sends a standard request body and expects a standard response:
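A representative exchange in the standard chat-completions shape (the model name and message content here are illustrative):

```
POST {baseUrl}/chat/completions
Content-Type: application/json

{
  "model": "llama-3.1-8b-instruct",
  "messages": [
    {"role": "system", "content": "You are a helpful agent."},
    {"role": "user",   "content": "What time is it?"}
  ]
}

HTTP/1.1 200 OK

{
  "choices": [
    {
      "message": {"role": "assistant", "content": "Let me check."},
      "finish_reason": "stop"
    }
  ]
}
```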
Nothing exotic is required: streaming is optional (enabled with cfg.streaming = true), and no fine-tuning API is needed. Any server that handles this request/response format works — local or remote, open-source or commercial.
Local Inference Servers
- llama.cpp
- Ollama
- vLLM
- Lemonade (default)
llama.cpp includes a built-in server with OpenAI-compatible endpoints. This is the most direct way to run a GGUF model locally without any Python dependencies.
llama.cpp runs entirely in C++ — no Python, no pip. If you want a fully native stack (C++ agent + C++ inference), this is the combination to use.
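For example, with a llama.cpp build on PATH, the built-in server can be started like this (the model path is a placeholder; check llama-server --help for the flags in your version):

```shell
# Serve a local GGUF model with OpenAI-compatible endpoints
llama-server -m ./models/your-model.gguf --port 8080
# Point the agent's base URL at http://localhost:8080/v1
```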
Cloud and Remote Providers
You can also point the agent at cloud-hosted LLM services. Build with SSL support first.
- OpenAI
- Azure OpenAI
- Any OpenAI-compatible API
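The exact SSL switch depends on how gaia_core wires cpp-httplib; the option name below is hypothetical, but the conventional cpp-httplib approach is the CPPHTTPLIB_OPENSSL_SUPPORT define plus OpenSSL, sketched here on the consumer side:

```cmake
# Hypothetical wiring — gaia_core may expose its own GAIA_*_SSL cache option instead
find_package(OpenSSL REQUIRED)
target_compile_definitions(my_app PRIVATE CPPHTTPLIB_OPENSSL_SUPPORT)
target_link_libraries(my_app PRIVATE gaia::gaia_core OpenSSL::SSL OpenSSL::Crypto)
```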
Supply your API key for the Authorization header via the OPENAI_API_KEY environment variable, or modify the HTTP request in a custom Agent subclass.
Model requirements: The agent needs a model that can produce structured JSON output with tool names and arguments. Most instruction-tuned models of 4B parameters or more work well. Smaller models (under 3B parameters) may struggle with the structured response format required for tool calling.
DLL Export Macros
All public classes and functions in gaia_core are annotated with the GAIA_API macro, which is generated automatically by CMake’s GenerateExportHeader module. When building as a shared library:
- Windows (MSVC): GAIA_API expands to __declspec(dllexport) when building the library, and __declspec(dllimport) when consuming it.
- Linux: GAIA_API expands to __attribute__((visibility("default"))).
- Static library: GAIA_API expands to nothing.
The gaia::gaia_core CMake target sets all required compile definitions automatically.
Troubleshooting
FetchContent is slow on first build
The first configure downloads nlohmann/json, cpp-httplib, and (if tests are on) Google Test from GitHub. Subsequent builds use the CMake cache. To speed up repeated clean builds, consider using a local clone or a CMake dependency cache.
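One way to share downloads across clean build trees is CMake's FETCHCONTENT_BASE_DIR cache variable, which relocates where FetchContent stores fetched sources (the path below is illustrative):

```shell
# Reuse one FetchContent download cache across build directories
cmake -B build -DFETCHCONTENT_BASE_DIR="$HOME/.cache/cmake-fetchcontent"
```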
find_package cannot find gaia_core
Ensure you ran cmake --install and that the install prefix is in your CMAKE_PREFIX_PATH:
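For example, with an illustrative prefix:

```shell
# Install gaia_core, then tell the consumer project where to find it
cmake --install build --prefix "$HOME/gaia-install"
cmake -S . -B build -DCMAKE_PREFIX_PATH="$HOME/gaia-install"
```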
Linker errors with nlohmann/json on find_package
nlohmann/json is a public dependency. When using find_package(gaia_core), the installed config file calls find_dependency(nlohmann_json). Make sure nlohmann/json is installed system-wide (see the install steps above).
DLL crashes on Windows
Verify that your consumer project uses the same MSVC version, platform toolset, and CRT runtime (/MD vs /MT) as the gaia_core DLL. Mismatched runtimes corrupt the heap when STL objects cross the DLL boundary.
Next Steps
C++ Framework Overview
Prerequisites, AgentConfig reference, and the full project structure
C++ Source Code
Browse the implementation on GitHub
Customizing Your Agent
Custom prompts, typed tools, MCP servers, output capture, and AgentConfig tuning
Wi-Fi Troubleshooter Agent
Full network diagnostic and auto-fix using registered C++ tools
MCP Client
How MCP client-server integration works in GAIA