v5.4.0 breaks PretrainedConfig type checking #45071

@fynnsu

System Info

transformers 5.4.0
mypy 1.19.1
python 3.10

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

# transformers_typing.py
from transformers import LlamaConfig
llama_config = LlamaConfig(vocab_size=32000)

Run the mypy type checker:

mypy transformers_typing.py

Output:

transformers_typing.py:3: error: Unexpected keyword argument "vocab_size" for "LlamaConfig"  [call-arg]
Found 1 error in 1 file (checked 1 source file)

Note: it's not just vocab_size that's missing. As far as I can tell, this fails for all config attributes.
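Until this is fixed upstream, one call-site workaround is to erase the class's static type with typing.cast(Any, ...), so mypy skips keyword-argument checking while runtime behavior is unchanged. A minimal sketch, with Config standing in for LlamaConfig (against the real library the same pattern would be cast(Any, LlamaConfig)):

```python
from typing import Any, cast


class Config:
    """Stand-in for LlamaConfig: at runtime, transformers configs
    absorb arbitrary keyword arguments via **kwargs."""

    def __init__(self, **kwargs: Any) -> None:
        self.__dict__.update(kwargs)


# Casting the class object to Any disables mypy's call-arg checking
# at the construction site, without changing runtime behavior:
AnyConfig = cast(Any, Config)
cfg = AnyConfig(vocab_size=32000)
print(cfg.vocab_size)  # 32000
```

A narrower alternative is appending # type: ignore[call-arg] to the offending line, which suppresses only that error code on that line.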

Expected behavior

Type checking passes, as it does for transformers<5.4.0.
