Support top-level request parameters for OllamaLanguageModel (e.g. think) #152

@owenffff

Description

Ollama's /api/chat endpoint supports several top-level request body parameters that are siblings of model, messages, and stream — notably the think parameter for enabling/disabling extended thinking in reasoning models (QwQ, DeepSeek-R1, Gemma 3, etc.).

Currently, OllamaLanguageModel.CustomGenerationOptions ([String: JSONValue]) only flows into the nested "options" dictionary in the request body via convertOptions() / createChatParams(). There is no way to inject arbitrary top-level keys.

Current behavior

Custom options end up nested under "options", which is the incorrect location for parameters like think:

{ 
  "model": "...", 
  "messages": [...], 
  "options": { "think": false } 
}

Code usage:

var opts = GenerationOptions()
opts[custom: OllamaLanguageModel.self] = ["think": .bool(false)]

Expected behavior

A way to set top-level parameters like think:

{ 
  "model": "...", 
  "messages": [...], 
  "think": false 
}
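One possible shape for the fix is a second custom-options hook that is merged into the request body as siblings of "model" and "messages" rather than into the nested "options" dictionary. A minimal self-contained sketch of that merging logic, assuming a hypothetical top-level parameter dictionary (the `JSONValue` enum here is a stand-in for the library's type, and `makeChatParams` / the `topLevel` parameter are illustrative names, not existing API):

```swift
import Foundation

// Stand-in for the library's JSONValue, just enough to model a request body.
enum JSONValue: Equatable {
    case string(String)
    case bool(Bool)
    case array([JSONValue])
    case object([String: JSONValue])
}

// Sketch: build the chat request body, then merge user-supplied
// top-level keys in as siblings of "model"/"messages"/"options".
func makeChatParams(
    model: String,
    messages: [JSONValue],
    options: [String: JSONValue],
    topLevel: [String: JSONValue]  // hypothetical new hook
) -> [String: JSONValue] {
    var body: [String: JSONValue] = [
        "model": .string(model),
        "messages": .array(messages),
        "options": .object(options),
    ]
    // Parameters like "think" land at the top level; reserved keys
    // already set above are never overwritten by custom values.
    for (key, value) in topLevel where body[key] == nil {
        body[key] = value
    }
    return body
}

let body = makeChatParams(
    model: "deepseek-r1",
    messages: [],
    options: [:],
    topLevel: ["think": .bool(false)]
)
```

With this shape, `body["think"]` is `.bool(false)` at the top level of the request, matching the JSON above, while existing nested `"options"` behavior is unchanged.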
