Gated models yield ValueError that states "model cannot be found" #5

@MinaAlmasi

Description

Trying to run "meta-llama/Llama-3.2-1B", but getting an error message saying:

ValueError: Model meta-llama/Llama-3.2-1B cannot be found in HuggingFace repositories, nor could an OpenAI model be initialized.

When running the same model with the regular transformers pipeline setup (`from transformers import pipeline`), it gives the correct response, namely that this is a gated model:

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B is restricted. You must have access to it and be authenticated to access it. Please log in.
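The error above suggests the real cause is missing authentication rather than a missing model, so the ValueError message is misleading. A minimal sketch of a pre-flight check before loading a gated model (the helper `resolve_hf_token` is a hypothetical name; the `HF_TOKEN` and `HUGGING_FACE_HUB_TOKEN` environment variables are the conventional places Hugging Face libraries look for a token):

```python
import os

def resolve_hf_token():
    """Return a Hugging Face access token from the environment, or None.

    Checks the two environment variables conventionally read by
    huggingface_hub; returns the first one that is set.
    """
    return os.environ.get("HF_TOKEN") or os.environ.get("HUGGING_FACE_HUB_TOKEN")

token = resolve_hf_token()
if token is None:
    # Without a token, gated repos such as meta-llama/Llama-3.2-1B
    # will fail with an access error rather than "model not found".
    print("No Hugging Face token found; run `huggingface-cli login` "
          "or set HF_TOKEN before loading a gated model.")
```

Checking for a token up front like this would let the library raise an authentication-specific error instead of falling through to "model cannot be found".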

Versions:
python 3.10.12
transformers 4.40.2
stormtrooper 1.0.0

Labels

bug (Something isn't working)
