deep1401 (Member) requested changes on Apr 17, 2026:
This fails for me. I got an A100 on RunPod, ran it, and I see this error:
```
  File "/usr/local/bin/distillkit", line 7, in <module>
    sys.exit(main())
    ^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/click/core.py", line 1485, in __call__
    return self.main(*args, **kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/click/core.py", line 1406, in main
    rv = self.invoke(ctx)
    ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/click/core.py", line 1269, in invoke
    return ctx.invoke(self.callback, **ctx.params)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/click/core.py", line 824, in invoke
    return callback(*args, **kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.ssh/distillkit/distillkit/main.py", line 393, in main
    do_distill(config)
  File "/root/.ssh/distillkit/distillkit/main.py", line 306, in do_distill
    tokenizer = load_tokenizer(config)
    ^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.ssh/distillkit/distillkit/main.py", line 288, in load_tokenizer
    return transformers.AutoTokenizer.from_pretrained(
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/transformers/models/auto/tokenization_auto.py", line 693, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/transformers/models/auto/tokenization_auto.py", line 530, in get_tokenizer_config
    resolved_config_file = cached_file(
    ^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/transformers/utils/hub.py", line 278, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/transformers/utils/hub.py", line 512, in cached_files
    raise e
  File "/usr/local/lib/python3.12/dist-packages/transformers/utils/hub.py", line 422, in cached_files
    hf_hub_download(
  File "/usr/local/lib/python3.12/dist-packages/huggingface_hub/utils/_validators.py", line 88, in _inner_fn
    return fn(*args, **kwargs)
    ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/huggingface_hub/file_download.py", line 997, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/huggingface_hub/file_download.py", line 1216, in _hf_hub_download_to_cache_dir
    _download_to_tmp_and_move(
  File "/usr/local/lib/python3.12/dist-packages/huggingface_hub/file_download.py", line 1828, in _download_to_tmp_and_move
    with incomplete_path.open("ab") as f:
    ^^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: [Errno 5] Input/output error
```
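The traceback ends while `hf_hub_download` is appending to the `.incomplete` cache file, so the `OSError: [Errno 5]` is raised by the disk write, not by the network or by DistillKit itself. A failing or full volume at the Hugging Face cache location is one plausible cause. A minimal sketch to rule that out (an assumption, not a confirmed fix; `HF_HOME` is the standard environment variable `huggingface_hub` reads to locate its cache):

```python
import os
import tempfile

# Sketch of a workaround for OSError [Errno 5] during hf_hub_download:
# relocate the Hugging Face cache to a volume known to be writable.
# The directory choice here is an example, not part of DistillKit.
cache_dir = os.path.join(tempfile.gettempdir(), "hf-cache")
os.makedirs(cache_dir, exist_ok=True)

# HF_HOME must be set before transformers/huggingface_hub are imported
# in the real run for the new location to take effect.
os.environ["HF_HOME"] = cache_dir

# Probe the directory with the same append-binary mode the failing
# code path (`incomplete_path.open("ab")`) uses.
probe = os.path.join(cache_dir, ".write-test")
with open(probe, "ab") as f:
    f.write(b"ok")
os.remove(probe)
print("cache dir is writable:", cache_dir)
```

If the probe itself fails with `Errno 5`, the problem is the storage volume on the pod rather than anything in this PR.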
Added a new task for DistillKit (https://github.com/arcee-ai/DistillKit), a flexible toolkit for knowledge distillation of large language models.
Changes
Features
Parameters
How to Test
Note: For local testing, ensure DistillKit is installed per its docs and that you have access to the specified HF datasets.
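The prerequisites above can be sanity-checked before launching a run. A small sketch (package names are the ones that appear in the traceback; the dataset names themselves stay in the task config, and the token variables checked are the standard Hugging Face ones):

```python
import importlib.util
import os

# Check that the packages the distillkit entry point depends on are
# importable in the current environment.
for pkg in ("transformers", "huggingface_hub", "datasets"):
    found = importlib.util.find_spec(pkg) is not None
    print(f"{pkg}: {'installed' if found else 'MISSING'}")

# Gated or private HF datasets additionally need a Hub token; either of
# these environment variables is honored by huggingface_hub.
token = os.environ.get("HF_TOKEN") or os.environ.get("HUGGING_FACE_HUB_TOKEN")
print("HF token configured:", bool(token))
```

This only checks the local environment; it does not verify that the configured datasets are actually reachable.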