
Bug: Local LLMs that don't support tool calling cause backend crash (e.g., deepseek-r1) #45

@androemeda


Problem Description

The SETUP.md documentation recommends deepseek-r1:8b and deepseek-r1:1.5b as local chat models for Ollama. However, these models do not currently support native tool/function calling via the Ollama API.

Because the backend relies on tool calling to interact with the VCell API and search the knowledge base, using these models causes a hard crash the moment a user sends a chat message: Ollama rejects the request with a 400 Bad Request, which the backend surfaces as an unhandled exception.


Steps to Reproduce

  1. Set up the local backend with Ollama as per SETUP.md.

  2. Pull and configure one of the originally recommended models in .env:

     PROVIDER=local
     AZURE_DEPLOYMENT_NAME=deepseek-r1:1.5b

  3. Start the backend and frontend.

  4. Send any message in the chat interface (e.g., "List all Calcium models").


Expected Behavior

The backend should either:

  • Process the query successfully using tools, or
  • If the model does not support tools, fall back gracefully to a mechanism that lets the user continue chatting, even with limited capabilities, instead of crashing.

Actual Behavior

The backend fails with an unhandled exception and returns a 500 Internal Server Error to the frontend.

Backend logs show an error similar to:

Error code: 400 - {
  "error": {
    "message": "registry.ollama.ai/library/deepseek-r1:1.5b does not support tools",
    "type": "api_error"
  }
}

Proposed Fix

Refactor llms_service.py to add a graceful fallback mechanism.

Implementation Outline

  • Wrap the native tool calling API in a try...except block.
  • If the model returns a 400 error indicating that tools are not supported:
    • Log a warning
    • Fall back to a prompt-based tool selection approach.
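A minimal sketch of the guard, assuming hypothetical client methods `chat_with_tools` and `chat_plain` (the real names in `llms_service.py` may differ); it keys off the "does not support tools" message shown in the error log above:

```python
import logging

def call_with_tool_fallback(client, messages, tools):
    """Try native tool calling; on a 'tools not supported' error, fall back.

    `client` is a hypothetical wrapper exposing chat_with_tools() and
    chat_plain(); substitute the project's actual LLM client.
    """
    try:
        return client.chat_with_tools(messages, tools)
    except Exception as exc:
        # Ollama rejects tool requests for unsupported models with a 400
        # whose message contains "does not support tools".
        if "does not support tools" in str(exc):
            logging.warning(
                "Model lacks native tool calling; using prompt-based fallback."
            )
            # Placeholder: the prompt-based tool-selection workflow goes here.
            return client.chat_plain(messages)
        raise  # unrelated errors still propagate
```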

Fallback Workflow

  1. Ask the LLM to output a structured JSON indicating which tool to call.
  2. Execute the selected tool.
  3. Ask the LLM to generate a final response using the tool output.
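The three steps above could look like the following sketch; the `llm` callable, the tool registry, and the prompt wording are all illustrative assumptions, not the project's actual interfaces:

```python
import json

# Hypothetical tool registry; the real backend would expose its VCell API
# and knowledge-base search tools here instead.
TOOLS = {
    "list_models": lambda arg: f"found models matching {arg!r}",
}

def prompt_based_tool_call(llm, user_message):
    """Run the fallback workflow with a plain-text `llm(prompt) -> str` callable."""
    # 1. Ask the LLM for a structured JSON tool selection.
    selection = llm(
        "Pick one tool for the request below and reply with JSON only, "
        'like {"tool": "list_models", "argument": "Calcium"}.\n'
        f"Request: {user_message}"
    )
    choice = json.loads(selection)
    # 2. Execute the selected tool.
    tool_output = TOOLS[choice["tool"]](choice["argument"])
    # 3. Ask the LLM to compose the final answer from the tool output.
    return llm(f"Answer the user using this tool output: {tool_output}")
```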

Code Organization

Move the newly created TOOL_SELECTION_PROMPT into a dedicated file:

app/utils/tool_selection_prompt.py

to follow existing project patterns.
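As a sketch, the new module might contain little more than the prompt constant (wording is illustrative, not the final prompt):

```python
# app/utils/tool_selection_prompt.py
# Illustrative wording only; tune the prompt for the actual tool set.
TOOL_SELECTION_PROMPT = (
    "This model cannot call tools natively. Given the user request and the "
    "list of available tools, reply with JSON only, in the form "
    '{"tool": "<tool_name>", "argument": "<argument>"}.'
)
```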

Documentation Update

Update SETUP.md recommendations:

  • Change the primary recommendation to models that natively support tool calling (e.g., llama3.1:8b).
  • List deepseek-r1 models as alternative options, clearly noting:
    • Their limitation (no native tool calling)
    • That the backend will automatically use the fallback mechanism.
