Bug: Cannot Parse Response From deepseek-r1:latest Correctly in JetBrains #10292

@zzyu17


Before submitting your bug report

Relevant environment info

- OS: Windows 10 22H2
- Server OS: macOS 15.3 (24D2059)
- Continue version: 1.0.59
- IDE version: PyCharm 2025.3.1.1
- Ollama version: 0.15.5
- Model: deepseek-r1:latest
- config:
  
name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: deepseek-r1:latest
    provider: ollama
    model: deepseek-r1:latest
    apiBase: http://[server-ip]:11434
    roles:
      - chat
      - edit
      - apply
      - rerank
      - autocomplete
context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase
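With this apiBase, the raw server output can be inspected independently of the IDE. The following is a minimal sketch, not part of the original report: it assumes the `[server-ip]` placeholder above, the `requests` library, and Ollama's standard `/api/chat` endpoint, and simply prints what the model returns so you can see whether the reasoning arrives inline (e.g. inside `<think>` tags) or in a separate field.

# Minimal sketch: query the remote Ollama server directly to inspect the raw
# deepseek-r1 response. Assumes the [server-ip] placeholder from the config
# above and Ollama's standard /api/chat endpoint.
import requests

OLLAMA_BASE = "http://[server-ip]:11434"  # same apiBase as in config.yaml

resp = requests.post(
    f"{OLLAMA_BASE}/api/chat",
    json={
        "model": "deepseek-r1:latest",
        "messages": [{"role": "user", "content": "What is 2 + 2?"}],
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
message = resp.json()["message"]

# Printing the raw content shows exactly what the client has to parse.
print(message["content"])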

Description

When using chat mode in PyCharm with deepseek-r1:latest, served by Ollama on a remote macOS machine, Continue fails to parse the response correctly and returns only the thinking content.
How should this be fixed?
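For context, deepseek-r1 typically emits its chain of thought inside `<think>...</think>` tags ahead of the final answer, so a client has to split the two; the symptom here suggests the answer portion after `</think>` is being dropped rather than the reasoning block. The sketch below is purely illustrative (it is not Continue's actual parsing code) and shows one way such a completion could be split.

import re

# Rough sketch (not Continue's actual code): split a deepseek-r1 completion
# into the reasoning block and the final answer.
def split_thinking(raw: str) -> tuple[str, str]:
    match = re.search(r"<think>(.*?)</think>", raw, flags=re.DOTALL)
    if not match:
        return "", raw.strip()          # no reasoning block present
    thinking = match.group(1).strip()   # contents of <think>...</think>
    answer = raw[match.end():].strip()  # everything after the closing tag
    return thinking, answer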

To reproduce

No response

Log output

Metadata

Assignees

No one assigned

    Labels

    area:chat (Relates to chat interface), ide:jetbrains (Relates specifically to JetBrains extension), kind:bug (Indicates an unexpected problem or unintended behavior), os:windows (Happening specifically on Windows)
