
Conversation


@pbzona pbzona commented Jan 19, 2026

The README and documentation consistently reference llama3.2:3b, but the default config was using llama3.2 without the tag. This caused model detection issues because Ollama requires exact model names including tags.

As a result, I kept falling back to heuristic-based enrichment, even though I was using the default config and had followed the exact setup commands in the README.

Changes:

  • Update Config.Default.enrichment.model: llama3.2 -> llama3.2:3b
  • Update Config.Default.judge.model: llama3.2 -> llama3.2:3b

This aligns the code default with the documented behavior and AutoTagger.DEFAULT_MODELS configuration.

Summary by CodeRabbit

  • Chores
    • Default configuration values have been updated to utilize more specific and precise model version identifiers. The default AI models are now configured to use version "llama3.2:3b", ensuring improved consistency in results, enhanced compatibility with system resources, and more predictable behavior throughout AI-assisted operations.



coderabbitai bot commented Jan 19, 2026

📝 Walkthrough

Default model configurations updated in types.ts. The enrichment and judge model settings changed from "llama3.2" to "llama3.2:3b" to specify a more precise model version identifier.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| Default model version updates<br>`src/types.ts` | Updated default values for `enrichment.model` and `judge.model` from `"llama3.2"` to `"llama3.2:3b"` to pin a specific model variant |

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~3 minutes

Poem

🐰 A tweak so small, a version refined,
From generic to specific, the models now aligned,
llama3.2:3b now takes the stage,
Precision in config, turn the page! ✨

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped - CodeRabbit’s high-level summary is enabled. |
| Title check | ✅ Passed | The title accurately summarizes the main change: adding the `:3b` tag to default llama3.2 model configuration values in `src/types.ts`. |
| Docstring Coverage | ✅ Passed | No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check. |



