Error loading dolly model locally #231

@ethanenguyen

Description

Hello,

I'm trying to load a fine-tuned Dolly model locally behind a firewall, but DeepSpeed-MII keeps looking the model up on the Hugging Face Hub. I tried passing the local path via `model_path`, but that still didn't work. Is there an example of `mii.deploy` for a local model? I looked through both the examples folder and the existing issues but couldn't find a working example.

Thanks,

mii_configs = {"tensor_parallel": 1,      # single-GPU deployment
               "dtype": "fp16",
               "port_number": 50950,
               "meta_tensor": True,
               "skip_model_check": True,
               # "trust_remote_code": True,
               }
name = "fine-tuned"

# `model` is the Hub ID; `model_path` points at the local fine-tuned weights.
mii.deploy(task='text-generation',
           model='databricks/dolly-v2-7b',
           model_path="/local/dolly/trained_model/fine-tuned-dolly-v2-7b/",
           deployment_name=name + "_deployment",
           mii_config=mii_configs)
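One workaround I would try (a sketch on my side, not verified against this MII release): force the Hugging Face libraries into offline mode before deploying, so nothing attempts to reach the Hub at all. `HF_HUB_OFFLINE` and `TRANSFORMERS_OFFLINE` are standard `huggingface_hub`/`transformers` environment variables; whether MII then resolves the local directory cleanly is an assumption, so the deploy call below is left commented as an illustration only.

```python
import os

# Standard Hugging Face offline switches: with these set, transformers and
# huggingface_hub resolve models from local files instead of contacting the Hub.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

# Local directory containing config.json, the tokenizer files, and the
# fine-tuned weights (mirrors the path from the report above).
local_model_dir = "/local/dolly/trained_model/fine-tuned-dolly-v2-7b/"

# Untested sketch: pass the local directory as `model` itself, rather than
# the Hub ID, so no Hub lookup is needed (commented out so this snippet
# runs without deepspeed-mii installed):
#
# import mii
# mii.deploy(task="text-generation",
#            model=local_model_dir,
#            deployment_name="fine-tuned_deployment",
#            mii_config={"tensor_parallel": 1,
#                        "dtype": "fp16",
#                        "skip_model_check": True})
```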
