Conversation

@naincy128
Contributor

This PR addresses a bug where excessively large system messages, typically caused by unit content with extensive video transcripts, result in Xpert message failures. In these scenarios, the system prompt consumes the entire token limit, leaving no room for learner or learning-assistant (LA) messages and breaking the interaction flow.

BRANCH: COSMOS2-14/naincy128

Resolution:

  1. Unit content is now handled using proportional trimming before sending to Xpert.
  2. This ensures the prompt includes a balanced amount of all content types, leaving enough room for both learner and LA messages.
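The PR text does not show the trimming implementation, but the idea of proportional trimming can be sketched as follows. This is a minimal illustration, assuming a character-count approximation of tokens and hypothetical names (`trim_proportionally`, `count_tokens`); the actual code in the branch may differ:

```python
def trim_proportionally(segments, max_tokens, count_tokens=len):
    """Trim each content segment so the combined size fits within
    max_tokens, preserving each segment's relative share.

    segments: dict mapping a content-type label (e.g. "transcript",
    "body") to its text. count_tokens defaults to len() as a rough
    character-based stand-in for a real tokenizer.
    """
    sizes = {key: count_tokens(text) for key, text in segments.items()}
    total = sum(sizes.values())
    if total <= max_tokens:
        # Everything already fits; nothing to trim.
        return dict(segments)
    ratio = max_tokens / total
    # Each segment keeps the same fraction of the budget it held
    # of the original total, so no single content type (such as a
    # long video transcript) can crowd out the others.
    return {key: text[: int(sizes[key] * ratio)]
            for key, text in segments.items()}
```

For example, a 900-character transcript and a 100-character body trimmed to a 100-character budget keep a 9:1 split (90 and 10 characters) rather than the transcript consuming the whole budget.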

Test Coverage:

  1. Confirmed that learner messages now reach Xpert successfully.
  2. Added unit tests validating trimming behaviour for extremely large unit content.

All tests and lint checks pass.

Jira ticket: https://2u-internal.atlassian.net/browse/COSMO2-14

@naincy128 naincy128 closed this Sep 6, 2025
@naincy128 naincy128 deleted the COSMOS2-14/naincy128 branch September 6, 2025 10:25
