[bot] Add instrumentation for Ollama Go client (ollama/ollama/api) #68

@braintrust-bot

Description

Summary

The Braintrust Go SDK instruments OpenAI, Anthropic, Google GenAI, sashabaranov/go-openai, LangChainGo, Firebase Genkit, Google ADK, and CloudWeGo Eino — but does not instrument the Ollama Go client (github.com/ollama/ollama/api). Ollama is the dominant local/self-hosted LLM runtime (169k+ GitHub stars), and its official Go client has meaningful execution APIs for chat, text generation, and embeddings.

No other Braintrust SDK (Python, TypeScript) has explicit Ollama client instrumentation either, and Ollama is not mentioned in Braintrust's integrations documentation, making this a completely uninstrumented library across the platform.

Note: While Ollama exposes an OpenAI-compatible HTTP endpoint (which would be covered by existing OpenAI instrumentation when accessed through an OpenAI client), many Go users use the native Ollama Go client directly since it is more idiomatic and provides access to Ollama-specific features (thinking/reasoning modes, model management). The native client is not covered by any existing instrumentation.

What is missing

A trace/contrib/ollama/ integration module that wraps execution calls on the Ollama Go client. The key execution surfaces are:

  • Chat — conversational model inference with streaming (via callback ChatResponseFunc), tool calling, and thinking/reasoning support
  • Generate — text generation with streaming (via callback GenerateResponseFunc)
  • Embed / Embeddings — vector embedding generation
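
For the streaming surfaces, the natural instrumentation shape is to wrap the user's callback so the full response can be recorded on a single span once the final chunk arrives. A minimal, self-contained sketch of that pattern, using stand-in types that mirror the *shape* of `api.ChatResponse` / `api.ChatResponseFunc` (the names and fields here are simplified for illustration, not the real SDK's):

```go
package main

import "fmt"

// Stand-in types mirroring the shape of ollama/ollama/api's streaming
// callback (ChatResponseFunc); fields are simplified for illustration.
type chatResponse struct {
	Content string // one streamed chunk of the assistant message
	Done    bool   // true on the final chunk
}

type chatResponseFunc func(chatResponse) error

// wrapCallback forwards every chunk to the caller's callback unchanged,
// while accumulating the streamed content so the tracer can record the
// full completion once the final chunk arrives.
func wrapCallback(user chatResponseFunc, onDone func(full string)) chatResponseFunc {
	var buf string
	return func(r chatResponse) error {
		buf += r.Content
		if r.Done {
			onDone(buf) // a real integration would end a span here
		}
		return user(r)
	}
}

func main() {
	var recorded string
	cb := wrapCallback(
		func(r chatResponse) error { return nil }, // user's original callback
		func(full string) { recorded = full },     // tracer's completion hook
	)
	for _, chunk := range []chatResponse{{Content: "Hel"}, {Content: "lo"}, {Done: true}} {
		if err := cb(chunk); err != nil {
			panic(err)
		}
	}
	fmt.Println(recorded) // prints Hello
}
```

The same wrapping applies to `GenerateResponseFunc`; `Embed` is non-streaming and can be traced with a plain before/after span.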

The Ollama client is created via api.ClientFromEnvironment() or api.NewClient(base *url.URL, http *http.Client). Since the client accepts a custom *http.Client, instrumentation could wrap the HTTP transport (as the GenAI integration does) or wrap the client methods directly.

Braintrust docs status

Ollama is not mentioned in any Braintrust documentation — not as a proxy provider, not as an SDK integration, and not in the integrations directory. Status: not_found.

Upstream sources

Braintrust docs sources

Local repo files inspected

  • go.mod — no ollama/ollama dependency
  • trace/contrib/ — no ollama/ directory exists
  • trace/contrib/all/all.go — no Ollama import
  • examples/ — no Ollama example
  • trace/contrib/genai/tracegenai.go — reference pattern for HTTP client wrapping approach (Ollama client also accepts custom *http.Client)
