CBE assessment criteria versioning ADR #476
Open

mgwozdz-unicon wants to merge 6 commits into openedx:main from mgwozdz-unicon:cbe_assessment_criteria_versioning_adr
Commits (all by mgwozdz-unicon):

- 743034b docs: add ADR 0020 assessment criteria location
- cc0ec47 docs: Increment ADR number
- b844049 docs: add ADR 0021 assessment criteria versioning
- 236ef76 docs: Increment ADR number
- 1100d8a docs: remove location adr from versioning adr branch
- 91697af docs: Use django-simple-history for versioning
23. How should versioning be handled for CBE competency achievement criteria?
=============================================================================
Context
-------
Course Authors and/or Platform Administrators will enter, in Studio, the competency achievement criteria rules that learners are required to meet in order to demonstrate competencies. Depending on the institution, these Course Authors or Platform Administrators may hold a variety of job titles, including Instructional Designer, Curriculum Designer, Instructor, LMS Administrator, Faculty, or other Staff.

Typically, only one person is responsible for entering competency achievement criteria rules in Studio for each course, though this person may change over time. Entire programs, however, could have many different Course Authors or Platform Administrators with this responsibility.

Typically, institutions and instructional designers do not change the mastery requirements (competency achievement criteria) for their competencies frequently. However, historical audit logging of changes within Studio can be a valuable feature for those who have mistakenly made changes and want to revert, or for those who want to experiment with new approaches.

Currently, Open edX always displays the latest edited version of content in the Studio UI and always shows the latest published version of content in the LMS UI, despite having more robust version tracking on the backend (Publishable Entities). Publishable Entities for Libraries is currently inefficient for large nested structures because all children are copied any time an update is made to a parent.

Authoring data (criteria definitions) and runtime learner data (status) have different governance needs. The former is long-lived and typically non-PII, while the latter is user-specific, can be large (learners x criteria/competencies x time), and may require stricter retention and access controls. These differing lifecycles can make deep coupling of authoring and runtime data harder to manage at scale. Performance is also a consideration, as computing or resolving versioned criteria for large courses could add overhead in Studio authoring screens or LMS views.
Decision
--------
The django-simple-history package will be used to track changes to competency achievement criteria definitions in Studio, providing an audit log of changes and the ability to revert to previous versions if needed. This approach captures the history of changes without the overhead of implementing a full publishable versioning framework for this initial implementation. The LMS will continue to display the latest published version of the criteria, consistent with current behavior, while Studio will show the latest edited version. This decision balances the need for version tracking and auditability against the desire to keep the initial implementation lightweight and performant, while still allowing future enhancements to versioning based on user feedback and evolving requirements.
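In the Django models, adopting this decision would likely amount to attaching ``simple_history.models.HistoricalRecords()`` to the criteria model. The plain-Python sketch below is only an illustration of the mechanics django-simple-history provides (an append-only snapshot per save, with audit metadata and revert); the model and field names are hypothetical, not the real schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AssessmentCriteria:
    """Hypothetical criteria model; stands in for a Django model with
    HistoricalRecords() attached."""
    slug: str
    rule: str
    history: list = field(default_factory=list)

    def save(self, changed_by: str) -> None:
        # Snapshot the current state into the per-model history,
        # recording who changed it and when (the audit log).
        self.history.append({
            "slug": self.slug,
            "rule": self.rule,
            "history_date": datetime.now(timezone.utc),
            "history_user": changed_by,
        })

    def revert_to(self, index: int) -> None:
        # Restore the fields captured by an earlier history entry.
        snapshot = self.history[index]
        self.slug, self.rule = snapshot["slug"], snapshot["rule"]


criteria = AssessmentCriteria(slug="unit-1-mastery", rule="min_score >= 0.8")
criteria.save(changed_by="author_a")
criteria.rule = "min_score >= 0.9"   # an accidental edit
criteria.save(changed_by="author_b")
criteria.revert_to(0)                # roll the mistake back
```

The history list is never rewritten, only appended to, so the audit trail survives the revert itself.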
Rejected Alternatives
---------------------

1. Defer competency achievement criteria versioning for the initial implementation. Store only the latest authored criteria and expose the latest published state in the LMS, consistent with current Studio/LMS behavior.

   - Pros:

     - Keeps the initial implementation lightweight
     - Avoids the publishable framework's known inefficiencies for large nested structures

   - Cons:

     - There is no built-in rollback or audit history
     - Adding versioning later would require data migration and careful choices about draft vs. published defaults

2. Each model carries version, status, and audit fields.

   - Pros:

     - Simple and familiar pattern (version + status + created/updated metadata)
     - Straightforward queries for the current published state
     - Can support rollback by marking an earlier version as published
     - Stable identifiers (original_ids) can anchor versions and ease potential future migrations

   - Cons:

     - Requires custom conventions for versioning across related tables and nested groups
     - Lacks the shared draft/publish APIs and immutable version objects that other authoring apps can reuse
     - Not necessarily consistent with existing patterns in the codebase (though these are already not overly consistent)

3. Publishable framework in openedx-learning.

   - Pros:

     - First-class draft/published semantics with immutable historical versions
     - Consistent APIs and patterns shared across other authoring apps

   - Cons:

     - Inefficient for large nested structures because all children are copied for each new parent version
     - Requires modeling criteria/groups as publishable entities and wiring Studio/LMS workflows to versioning APIs
     - Adds schema and migration complexity for a feature that does not yet require full versioning

4. Append-only audit log table (event history).

   - Pros:

     - Lightweight way to capture who changed what and when
     - Enables basic rollback by replaying or reversing events

   - Cons:

     - Requires custom tooling to reconstruct past versions
     - Does not align with existing publishable versioning patterns
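For contrast, rejected alternative 2 (explicit version/status/audit columns on every model) can be sketched as follows. The row shape and names such as ``original_id`` are illustrative, not a proposed schema.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass(frozen=True)
class CriteriaRow:
    """One immutable row per version; every table carries these columns."""
    original_id: int   # stable identifier shared by all versions of one criterion
    version: int
    status: str        # "draft" or "published"
    rule: str


def latest_published(rows: List[CriteriaRow], original_id: int) -> Optional[CriteriaRow]:
    # The "straightforward query for the current published state":
    # highest published version for a given stable identifier.
    published = [r for r in rows
                 if r.original_id == original_id and r.status == "published"]
    return max(published, key=lambda r: r.version, default=None)


rows = [
    CriteriaRow(original_id=1, version=1, status="published", rule="min_score >= 0.8"),
    CriteriaRow(original_id=1, version=2, status="draft", rule="min_score >= 0.9"),
]
current = latest_published(rows, original_id=1)
```

Rollback under this pattern is just inserting (or re-marking) a row with ``status="published"``, but the convention has to be re-implemented and kept consistent across every related table, which is the listed con.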
Review comment:

FWIW, one of the issues that comes up around grading from time to time is the principle that someone should keep the grade that they saw themselves get, regardless of what changes the course team makes, unless that change was necessary to correct a bug (and often not even then). So the equivalent in competencies might be to make sure that, by default, we don't remove someone's competency because an author made some changes to how those competencies are earned.

Is it safe to assume that just preserving the timestamp they earned the competency would be sufficient to later reconstruct the state of how the assessment criteria/grouping was defined at that point?
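On the reviewer's point-in-time question: django-simple-history supports this kind of query directly (tracked models expose ``Model.history.as_of(timestamp)``). A minimal stand-alone sketch of the idea, assuming each history row records a timezone-aware ``history_date``:

```python
from datetime import datetime, timezone
from typing import List, Optional


def as_of(history: List[dict], when: datetime) -> Optional[dict]:
    # Latest snapshot recorded at or before `when`: i.e. the criteria
    # definition that was in force when the learner earned the competency.
    eligible = [h for h in history if h["history_date"] <= when]
    return max(eligible, key=lambda h: h["history_date"], default=None)


# Hypothetical history rows, oldest first.
history = [
    {"history_date": datetime(2024, 1, 1, tzinfo=timezone.utc), "rule": "min_score >= 0.8"},
    {"history_date": datetime(2024, 6, 1, tzinfo=timezone.utc), "rule": "min_score >= 0.9"},
]
earned_at = datetime(2024, 3, 15, tzinfo=timezone.utc)
definition_then = as_of(history, earned_at)
```

So preserving the earned-at timestamp should be sufficient to reconstruct the criteria definition, provided the history rows themselves are retained for as long as the competency records are.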