
Bug - BrowserOS AI Ollama custom model setting #369

@berkode

Description


Issue Type

Agent Issue

Operating System

macOS

Description of the bug

The custom model setting for Ollama in BrowserOS AI does not work correctly.
I set kimi2.5 as the Ollama model.
Yesterday the assistant claimed its model was Gemini; today, tested again, it claims it is Claude.

Steps to Reproduce

1. Navigate to Settings.
2. Set the BrowserOS AI LLM provider.
3. Select Ollama.
4. Choose "Custom model" and enter kimi2.5 (or any other model name).
5. Open the BrowserOS AI panel.
6. Ask which model it is. The assistant claims to be a different model than the one that was set, so the custom model setting is not reflected correctly.
7. Switch the BrowserOS AI LLM provider to another one, e.g. Gemini this time.
8. Open the BrowserOS AI panel again.
9. Ask which model it is. It still claims to be a different model than the one that was set, so switching the provider does not work either.
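A side note for triage: asking an LLM which model it is tends to be unreliable (models often hallucinate their own identity), so a more direct check is to query Ollama's local API for the model actually loaded. A minimal sketch, assuming a default Ollama install on localhost:11434; the `running_models` helper is hypothetical and not part of BrowserOS:

```python
import json

def running_models(ps_json: str) -> list[str]:
    """Extract model names from an Ollama /api/ps response body."""
    data = json.loads(ps_json)
    return [m["name"] for m in data.get("models", [])]

# Example response shape (abridged) as returned by:
#   curl http://localhost:11434/api/ps
sample = '{"models": [{"name": "kimi2.5:latest", "size": 123}]}'
print(running_models(sample))  # -> ['kimi2.5:latest']
```

If `/api/ps` shows the expected model loaded while the panel still answers as Gemini/Claude, that would point to BrowserOS routing requests to the wrong provider rather than to an Ollama misconfiguration.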

Screenshots / Videos

(screenshot attached)

BrowserOS Version

0.40.1.0

Additional Context

No response

Metadata



Labels

bug (Something isn't working)
