
passing parameters to metrics#1783

Draft
s6sebusc wants to merge 23 commits into ecmwf:develop from s6sebusc:metric_parameters

Conversation

Contributor

s6sebusc commented Feb 3, 2026

Description

Parameters for individual metrics can now be set via a new optional entry, metric_parameters, in the eval config, for example:

evaluation:
  metrics: ["fbi", "rmse"]
  metric_parameters:
    fbi:
      thresh: 0.001
  regions: ["nhem"]
  summary_plots: true
...

When a metric is loaded or computed, the corresponding parameters are taken from metric_parameters.
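For illustration, a minimal sketch of how such a lookup could work; frequency_bias, METRICS and build_metric below are hypothetical placeholders, not the actual metric classes or loader.

```python
# Minimal sketch under assumed names; not the actual loader code.
def frequency_bias(forecast, obs, thresh=0.0):
    """Placeholder standing in for the real "fbi" metric."""
    ...


METRICS = {"fbi": frequency_bias}


def build_metric(name, eval_config):
    # Metrics without an entry in metric_parameters fall back to their defaults.
    kwargs = eval_config.get("metric_parameters", {}).get(name, {})
    return lambda forecast, obs: METRICS[name](forecast, obs, **kwargs)
```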

New JSON files for scores now have a top-level key "scores", below which is a list of results for the same score with different parameter settings; each list entry follows the same format as before. Old files without a "scores" key are handled for backward compatibility.
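Sketched below is what such a file could look like; the "parameters" key name and the example values are assumptions for illustration, only the top-level "scores" list is taken from the description above.

```python
# Assumed layout of a new-style scores file (key names are illustrative).
example_scores_file = {
    "scores": [
        {
            "parameters": {"thresh": 0.001},
            # ... the same result fields as the old single-result format ...
        },
        {
            "parameters": {"thresh": 0.005},
            # ...
        },
    ]
}
```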

Upon loading a score, the reader goes through all elements of the JSON file in search of the right parameter settings. When saving scores, metric_list_to_json now checks whether the output file already exists. If it does, it goes through the file in search of the right parameter settings. If it finds them (possible when, for example, fsteps was changed), it replaces that entry with the new values. If the right parameters are not found, the new results are appended at the end of the JSON file.
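A hedged sketch of that save logic, not the real metric_list_to_json; the entry layout and the "parameters" key follow the assumption above.

```python
# Sketch only: replace an existing entry with matching parameters, else append.
import json
import os


def save_score(path, new_entry):
    data = {"scores": []}
    if os.path.exists(path):
        with open(path) as f:
            old = json.load(f)
        # Old-style files have no "scores" key and hold a single result.
        data["scores"] = old["scores"] if "scores" in old else [old]
    for i, entry in enumerate(data["scores"]):
        if entry.get("parameters") == new_entry.get("parameters"):
            data["scores"][i] = new_entry  # same parameters: overwrite (e.g. fsteps changed)
            break
    else:
        data["scores"].append(new_entry)  # parameters not found: append
    with open(path, "w") as f:
        json.dump(data, f, indent=2)
```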

Issue Number

Closes #1475

Contributor Author

s6sebusc commented Feb 3, 2026

I have not yet looked at the map plots or mlflow.

