- C++ Compiler: GCC 14+, Clang 18+, or MSVC 2022+ with C++23 support
- Build System: xmake 3.0.0+
- Dependencies:
  - mbedtls is resolved automatically by xmake
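Before building, it can help to confirm the toolchain actually meets these versions. A quick check from the shell (assuming the tools are on your `PATH`; run whichever compiler you intend to use):

```shell
# Print toolchain versions to compare against the requirements above
g++ --version       # needs GCC 14+
clang++ --version   # needs Clang 18+
xmake --version     # needs xmake 3.0.0+
```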
Add the repository and package to your `xmake.lua`:

```lua
add_repositories("mcpplibs-index git@github.com:mcpplibs/mcpplibs-index.git")
add_requires("llmapi 0.1.0")

target("myapp")
    set_kind("binary")
    set_languages("c++23")
    set_policy("build.c++.modules", true)
    add_files("src/*.cpp")
    add_packages("llmapi")
```

Alternatively, clone the repository and run the bundled examples:

```sh
git clone https://github.com/mcpplibs/llmapi.git
cd llmapi
xmake
xmake run hello_mcpp
xmake run basic
xmake run chat
```

Create `main.cpp`:
```cpp
import mcpplibs.llmapi;
import std;

int main() {
    using namespace mcpplibs::llmapi;

    // Read the API key from the environment rather than hard-coding it
    auto apiKey = std::getenv("OPENAI_API_KEY");
    if (!apiKey) {
        std::cerr << "OPENAI_API_KEY not set\n";
        return 1;
    }

    auto client = Client(Config{
        .apiKey = apiKey,
        .model = "gpt-4o-mini",
    });

    // Set a system prompt, then send a single chat message
    client.system("You are a helpful assistant.");
    auto resp = client.chat("Hello, introduce yourself in one sentence.");
    std::cout << resp.text() << '\n';
    return 0;
}
```

Build and run:
```sh
xmake
xmake run myapp
```

Set the provider-specific API key you plan to use:
```sh
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export DEEPSEEK_API_KEY="..."
```

OpenAI:
```cpp
auto client = Client(Config{
    .apiKey = std::getenv("OPENAI_API_KEY"),
    .model = "gpt-4o-mini",
});
```

Anthropic:
```cpp
auto client = Client(anthropic::Anthropic({
    .apiKey = std::getenv("ANTHROPIC_API_KEY"),
    .model = "claude-sonnet-4-20250514",
}));
```

An OpenAI-compatible endpoint, via the OpenAI provider (DeepSeek shown here):
```cpp
auto client = Client(Config{
    .apiKey = std::getenv("DEEPSEEK_API_KEY"),
    .baseUrl = std::string(URL::DeepSeek),
    .model = "deepseek-chat",
});
```

- C++ API Guide - Learn the full C++ API
- Examples - See more examples
- Providers - Configure different providers