
@creaumond commented Jan 9, 2026

What this does

When a user passes the assume_model_exists flag through a custom Rails model, the flag gets dropped, so they cannot use a custom model.

This change persists the flag, defaulting it to false when it is not provided.
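
For context, a minimal sketch of the intended usage, assuming a Chat model backed by acts_as_chat; the class name, attribute names (model/provider), and the OpenRouter model ID are illustrative and not taken from this PR's diff:

# Illustrative only: class name, attribute names, and model ID are assumptions.
class Chat < ApplicationRecord
  acts_as_chat
end

# Plain RubyLLM: assume_model_exists skips the model registry check,
# letting you target a model RubyLLM does not know about.
RubyLLM.chat(
  model: "my-org/custom-model",
  provider: :openrouter,
  assume_model_exists: true
).ask("Hello!")

# Rails-backed chat: before this fix the flag was dropped before reaching
# RubyLLM, so unknown/custom model IDs raised; with the fix it is kept
# (and treated as false when not provided).
Chat.create!(
  model: "my-org/custom-model",
  provider: "openrouter",
  assume_model_exists: true
).ask("Hello!")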

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
    • For provider changes: Re-recorded VCR cassettes with bundle exec rake vcr:record[provider_name]
    • All tests pass: bundle exec rspec
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

Related issues

#555

schoblaska added a commit to schoblaska/ruby_llm_test that referenced this pull request Jan 12, 2026
@schoblaska

Confirmed that this fix works on my test repo. Thanks, @creaumond!

ruby_llm_test $ OPENROUTER_API_KEY=sk-or-v1-asdf1234 bin/rake test_ruby_llm
Testing RubyLLM.chat with unknown model
Hi there! How can I help you today?

Testing Llm::Chat with known model
Hi there! How can I help you today?

Testing Llm::Chat with unknown model
Hi there! How can I help you today?
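
For reference, a hypothetical reconstruction of the rake task behind that output; the task name comes from the command line above, but the Llm::Chat class and the model IDs are assumptions based only on the printed labels:

# lib/tasks/test_ruby_llm.rake -- hypothetical sketch, not the real test repo code.
task test_ruby_llm: :environment do
  puts "Testing RubyLLM.chat with unknown model"
  puts RubyLLM.chat(
    model: "my-org/unlisted-model",
    provider: :openrouter,
    assume_model_exists: true
  ).ask("Say hi").content

  puts "Testing Llm::Chat with known model"
  puts Llm::Chat.create!(model: "openai/gpt-4o-mini", provider: "openrouter")
                .ask("Say hi").content

  puts "Testing Llm::Chat with unknown model"
  puts Llm::Chat.create!(
    model: "my-org/unlisted-model",
    provider: "openrouter",
    assume_model_exists: true
  ).ask("Say hi").content
end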
