training: fixed `EarlyStopping._calc_new_threshold` to account for negative scores (#1065)
MaxBalmus wants to merge 1 commit into skorch-dev:master
Conversation
Thanks for creating this PR. Just to ensure that I got things right: let's assume we have a negative score, say, -10, and a threshold of 0.1. Then, with the current code, we would have `abs_threshold_change = 0.1 * -10 = -1`, i.e. a negative change that moves the threshold in the wrong, more lenient direction.

Ideally, we could have a unit test for this. It would need to be a bit tricky. Maybe we can create a custom scoring function that just returns from a list of predefined negative scores, and set the
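A minimal sketch of the arithmetic being discussed (the function below mirrors only the `'rel'`-mode branch of the threshold logic; the names and the `fixed` flag are illustrative, not part of the actual code):

```python
def new_threshold(score, threshold=0.1, lower_is_better=True, fixed=False):
    """Compute the next dynamic threshold in 'rel' mode.

    Without abs(), a negative score makes the threshold change
    negative, so the threshold moves the wrong way.
    """
    change = threshold * (abs(score) if fixed else score)
    return score - change if lower_is_better else score + change

new_threshold(-10, fixed=False)  # -9.0: more lenient than the score itself
new_threshold(-10, fixed=True)   # -11.0: a real improvement is required
```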
Yes, you described the issue exactly as I see it. Regarding the test: I am not as familiar with the code, but I think I follow what you mean, and I think it should work.
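The test idea above could be sketched in plain Python as a simulation of the callback's bookkeeping (this is not the actual skorch test suite; `stop_epoch` and its parameters are hypothetical, and only the `'rel'`, `lower_is_better=True` case is modeled):

```python
def stop_epoch(scores, threshold=0.1, patience=2, use_abs=True):
    """Simulate rel-threshold early stopping with lower_is_better=True.

    Returns the 1-based epoch at which training would stop, or None
    if the patience counter never fills up.
    """
    dynamic = float('inf')
    misses = 0
    for epoch, score in enumerate(scores, start=1):
        if score < dynamic:  # counts as an improvement
            misses = 0
            # the fix under discussion: abs() keeps the change positive
            change = threshold * (abs(score) if use_abs else score)
            dynamic = score - change
        else:
            misses += 1
            if misses == patience:
                return epoch
    return None

# predefined, always-negative "scores", as the custom scorer would return
scores = [-10.0, -10.5, -10.8, -10.9, -11.0]
stop_epoch(scores, use_abs=True)   # returns 3: small gains no longer reset patience
stop_epoch(scores, use_abs=False)  # returns None: the buggy threshold drifts, never stops
```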
Great, thanks for confirming.

Would you be interested in giving it a try? The tests reside here: If not, it's fine too, just let me know.
Sure, I will give it a go.
@MaxBalmus gentle ping :)
Hi @BenjaminBossan - apologies for disappearing. Unfortunately, I couldn't find time to write the test, and with my current workload I find it a bit difficult to do it.
This fix is meant to account for training models with potentially negative scores, as we see in the case of cost functions based on the log likelihood. The fix ensures that `abs_threshold_change` is always positive, guaranteeing the intended monotonicity of the threshold.