Add log_key_prefix parameter to LearningRateMonitor #21612
Merged
tchaton merged 4 commits into Lightning-AI:master on Apr 1, 2026
Conversation
Allow users to prepend a configurable prefix to all metric names logged by `LearningRateMonitor`. This is useful for grouping learning rate metrics in loggers like TensorBoard (e.g., `optim/lr-Adam` instead of `lr-Adam`). Fixes Lightning-AI#21590
Codecov Report ✅ All modified and coverable lines are covered by tests.
Additional details and impacted files

@@            Coverage Diff            @@
##           master   #21612     +/-   ##
=========================================
- Coverage      87%      79%      -8%
=========================================
  Files         270      267       -3
  Lines       23898    23843      -55
=========================================
- Hits        20678    18769    -1909
- Misses       3220     5074    +1854
deependujha reviewed Mar 30, 2026
deependujha approved these changes Mar 30, 2026
tchaton approved these changes Apr 1, 2026
bhimrazy pushed a commit to bhimrazy/pytorch-lightning that referenced this pull request on Apr 13, 2026:

…#21612)

* Add `log_key_prefix` parameter to `LearningRateMonitor` callback

  Allow users to prepend a configurable prefix to all metric names logged by `LearningRateMonitor`. This is useful for grouping learning rate metrics in loggers like TensorBoard (e.g., `optim/lr-Adam` instead of `lr-Adam`). Fixes Lightning-AI#21590

* Update src/lightning/pytorch/CHANGELOG.md

Co-authored-by: Deependu <deependujha21@gmail.com>
What does this PR do?
Adds a `log_key_prefix` parameter to the `LearningRateMonitor` callback, allowing users to prepend a configurable prefix to all logged metric names (learning rate, momentum, weight decay). This is useful for grouping learning rate metrics in loggers like TensorBoard.
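A minimal sketch of the intended naming behavior, written as plain Python rather than Lightning code so it stands alone. The `log_key_prefix` name and the `lr-<OptimizerName>` key format come from this PR's description; the helper below is illustrative, not the actual Lightning implementation.

```python
def prefixed_lr_key(optimizer_name: str, log_key_prefix: str = "") -> str:
    """Build the metric key that LearningRateMonitor would log.

    Illustrative only: mirrors the "lr-<OptimizerName>" format from the PR
    description, with the optional prefix prepended verbatim.
    """
    base_key = f"lr-{optimizer_name}"
    return f"{log_key_prefix}{base_key}" if log_key_prefix else base_key


# Without a prefix the key is flat; with one, loggers like TensorBoard
# group the metric under an "optim/" section.
print(prefixed_lr_key("Adam"))                           # lr-Adam
print(prefixed_lr_key("Adam", log_key_prefix="optim/"))  # optim/lr-Adam
```

In actual use, the prefix would be passed at callback construction time, e.g. `LearningRateMonitor(log_key_prefix="optim/")` per the PR summary.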
Fixes #21590
Changes
- `src/lightning/pytorch/callbacks/lr_monitor.py`: Added `log_key_prefix` parameter to `__init__` and applied the prefix in `_check_duplicates_and_update_names` so it flows consistently through all metric keys (`self.lrs`, `self.last_momentum_values`, `self.last_weight_decay_values`, and logged metrics)
- `tests/tests_pytorch/callbacks/test_lr_monitor.py`: Added 4 test cases covering prefix with a single optimizer, with momentum/weight_decay, with multiple optimizers, and with `None` (default behavior)
- `src/lightning/pytorch/CHANGELOG.md`: Added entry under `[unreleased]`

Before submitting
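To make the "flows consistently through all metric keys" point concrete, here is a hedged, self-contained sketch of how a prefix could interact with the name-deduplication step. The `-<index>` suffixing for same-class optimizers and the `build_metric_names` helper are illustrative assumptions, not the real `_check_duplicates_and_update_names` code.

```python
from collections import Counter


def build_metric_names(optimizer_classes, log_key_prefix=None):
    """Sketch: derive unique "lr-*" keys, then apply the optional prefix.

    Illustrative stand-in for the deduplication step described in the PR;
    the exact suffix scheme in Lightning may differ.
    """
    totals = Counter(optimizer_classes)  # how many of each optimizer class
    seen = Counter()                     # running index per duplicated class
    names = []
    for cls in optimizer_classes:
        name = f"lr-{cls}"
        if totals[cls] > 1:
            # Disambiguate duplicate optimizer classes with an index suffix.
            name = f"{name}-{seen[cls]}"
            seen[cls] += 1
        if log_key_prefix:
            # Prefix is applied last, so every emitted key carries it.
            name = f"{log_key_prefix}{name}"
        names.append(name)
    return names


print(build_metric_names(["Adam", "Adam", "SGD"], log_key_prefix="optim/"))
# ['optim/lr-Adam-0', 'optim/lr-Adam-1', 'optim/lr-SGD']
```

Applying the prefix at the point where names are finalized (rather than at each logging call) is what keeps `self.lrs`, the momentum/weight-decay dictionaries, and the logged metrics in agreement.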
(`LearningRateMonitor` #21590, maintainer @justusschock invited a PR)

PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--21612.org.readthedocs.build/en/21612/