Hi,
Is there something wrong with using the library this way? I can't run the GGUF models I want to try.
```python
from ctransformers import AutoModelForCausalLM

model_name = "SanctumAI/Llama-3.2-3B-Instruct-GGUF"
gguf_file = "llama-3.2-3b-instruct.Q2_K.gguf"

llm = AutoModelForCausalLM.from_pretrained(
    model_name,
    model_file=gguf_file,
    model_type="gguf",
)
```
Error output:

```
, line 8, in <module>
    llm = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/dist-packages/ctransformers/hub.py", line 175, in from_pretrained
    llm = LLM(
  File "/usr/local/lib/python3.10/dist-packages/ctransformers/llm.py", line 253, in __init__
    raise RuntimeError(
RuntimeError: Failed to create LLM 'gguf' from '/root/.cache/huggingface/hub/models--SanctumAI--Llama-3.2-3B-Instruct-GGUF/blobs/c77eb142ab869944f388ff093fc7276ea15c4e1f810ceb76554fc5ae77694c19'.
```
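A possible cause, offered as an assumption rather than a confirmed diagnosis: in ctransformers, `model_type` names the model *architecture* (e.g. `"llama"`, `"gpt2"`, `"falcon"`), which the library uses to select a backend, not the file format. Passing `"gguf"` would therefore produce exactly this `Failed to create LLM 'gguf'` error. A sketch of the corrected call (the real `from_pretrained` call is left commented out since it downloads a multi-GB model):

```python
# Hypothetical fix: model_type should be the architecture name,
# not the file format. "llama" is an assumption based on the model family.
kwargs = dict(
    model_file="llama-3.2-3b-instruct.Q2_K.gguf",
    model_type="llama",  # architecture, not "gguf"
)

# from ctransformers import AutoModelForCausalLM
# llm = AutoModelForCausalLM.from_pretrained(
#     "SanctumAI/Llama-3.2-3B-Instruct-GGUF", **kwargs)
```

Note also that even with the correct `model_type`, this may still fail: ctransformers has not been updated since well before Llama 3.x was released, so its bundled backend may not recognize newer GGUF files at all. If that turns out to be the case, a maintained binding such as llama-cpp-python may be needed instead.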