Conversation


@chmjkb chmjkb commented Feb 3, 2026

Description

Currently, there is no way to access the prompt token count, which can be useful in some cases. This PR adds two methods to the LLM API that expose these stats.

Introduces a breaking change?

  • Yes
  • No

Type of change

  • Bug fix (change which fixes an issue)
  • New feature (change which adds functionality)
  • Documentation update (improves or adds clarity to existing documentation)
  • Other (chores, tests, code style improvements etc.)

Tested on

  • iOS
  • Android

Testing instructions

Screenshots

Related issues

Checklist

  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have updated the documentation accordingly
  • My changes generate no new warnings

Additional notes

@chmjkb chmjkb marked this pull request as ready for review February 3, 2026 13:35
@msluszniak msluszniak added the feature PRs that implement a new feature label Feb 3, 2026

msluszniak commented Feb 3, 2026

Personally, I would merge the documentation PR first, because half of this PR will need to be changed after the rebase.

