Add logic to support max_batch_size_dpo #305
Conversation
src/together/types/finetune.py
Outdated
    class FinetuneFullTrainingLimits(BaseModel):
        max_batch_size: int
        max_batch_size_dpo: int
What do you think about making this optional, so we can publish it independently?
You mean making it optional only on the Python client side?
I don't understand how this would work.
The API has to return max_batch_size_dpo in the limits, because it is a required parameter for each model.
A client without this parameter will throw errors, I think. I can make it optional for the client and estimate it if the API does not provide it, but I don't think this case matters much, because I'm going to update the API at the same time as the client.
Do you have any other ideas?
The best solution would be versioning, but we are far from that.
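For illustration only, a minimal sketch of what the optional client-side variant described above might look like, assuming the pydantic model from the diff; the fallback helper and the max_batch_size // 2 estimate are hypothetical, not the API's actual behavior or code from this PR:

```python
from pydantic import BaseModel


class FinetuneFullTrainingLimits(BaseModel):
    max_batch_size: int
    # Optional on the client side: None means the API did not return it.
    max_batch_size_dpo: int | None = None


def effective_dpo_batch_size(limits: FinetuneFullTrainingLimits) -> int:
    """Return the DPO batch-size limit, falling back to an estimate."""
    if limits.max_batch_size_dpo is not None:
        return limits.max_batch_size_dpo
    # Hypothetical fallback when the API omits the field; the divisor
    # is an illustrative assumption only.
    return max(1, limits.max_batch_size // 2)
```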
timofeev1995 left a comment
thanks! lgtm
If we decide to deploy the API first and changes are made (making the parameter optional), I'd take another look.
    class FinetuneFullTrainingLimits(BaseModel):
        max_batch_size: int
        max_batch_size_dpo: int = -1
(optional)
I would prefer int | None rather than a -1 sentinel, but it still works.
That approach causes a lot of type-check problems in our code.
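To illustrate the trade-off being discussed (a sketch; clamp_dpo_batch_size is a hypothetical call site, not code from this PR): with int | None, type checkers such as mypy force a None check before the value can be used as an int at every call site, whereas the -1 sentinel keeps the annotation a plain int at the cost of a magic value.

```python
from pydantic import BaseModel


class FinetuneFullTrainingLimits(BaseModel):
    max_batch_size: int
    max_batch_size_dpo: int | None = None  # the int | None variant


def clamp_dpo_batch_size(requested: int, limits: FinetuneFullTrainingLimits) -> int:
    # Every caller has to narrow the type first; without this check,
    # mypy rejects passing an Optional[int] to min() below.
    if limits.max_batch_size_dpo is None:
        raise ValueError("max_batch_size_dpo was not provided by the API")
    return min(requested, limits.max_batch_size_dpo)
```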
Have you read the Contributing Guidelines?
Issue #
Describe your changes
Clearly and concisely describe what's in this pull request. Include screenshots, if necessary.