Trying to run "meta-llama/Llama-3.2-1B", but getting an error message:
ValueError: Model meta-llama/Llama-3.2-1B cannot be found in HuggingFace repositories, nor could an OpenAI model be initialized.
When running the same model with the regular transformers pipeline (from transformers import pipeline), it correctly reports that the model is gated:
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B is restricted. You must have access to it and be authenticated to access it. Please log in.
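Since the repo is gated, the root cause is likely a missing Hugging Face access token rather than the model name. A minimal sketch of a pre-flight check (hedged: `has_hf_token` is a hypothetical helper written for illustration; `HF_TOKEN` and the older `HUGGING_FACE_HUB_TOKEN` are the environment variables the hub client reads when no explicit token is passed):

```python
import os

def has_hf_token() -> bool:
    """Return True if a Hugging Face token appears to be configured.

    Gated repos such as meta-llama/Llama-3.2-1B require authentication;
    the hub libraries pick the token up from these env vars, or from the
    credentials saved by `huggingface-cli login`.
    """
    return any(
        os.environ.get(var)
        for var in ("HF_TOKEN", "HUGGING_FACE_HUB_TOKEN")
    )

if not has_hf_token():
    print("No HF token found; run `huggingface-cli login` or set HF_TOKEN.")
```

If the token is configured and the pipeline works but stormtrooper still fails, the library may be swallowing the gated-repo error and misreporting it as "model not found".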
Versions:
python 3.10.12
transformers 4.40.2
stormtrooper 1.0.0