
0.5.0


@mattt mattt released this 11 Dec 12:23
· 51 commits to main since this release
877ee49

What's Changed

  • Add custom generation options for language models by @mattt in #50
  • Expose HubApi and model directory configuration for MLX by @tpae in #44
  • Fix non-streaming respond() not adding a .response() entry to the transcript by @noorbhatia in #46
  • Fix invalid construction of greedy sampler in LlamaLanguageModel by @mattt in #55
  • Add configuration option for llama.cpp logging by @mattt in #54
  • Update LlamaLanguageModel to replace implementation-specific properties with custom generation options by @mattt in #56
  • Implement chat templates for LlamaLanguageModel by @mattt in #57
  • Add logic to handle encoder-only llama models by @mattt in #53
  • Improve migration for deprecated llama model APIs by @mattt in #58
  • Add tool calling support to OpenAILanguageModel and SystemLanguageModel by @airy10 in #49
  • Pass instructions as system prompt for MLXLanguageModel by @noorbhatia in #48

Full Changelog: 0.4.5...0.5.0